  • The necessity of between-sample normalization?

    I'm currently performing some analyses involving cross-project expression data. Because the method involves linear equations, we have decided to take log(TPM) as its input.

    While older workflows built on Rsubread or htseq-count would always require us to perform between-sample normalization, newer transcript quantification tools such as RSEM, Kallisto, and Salmon give estimated read counts and (at a minimum) TPM as their raw output.

    But even in that case, should I extract the read counts, normalize them with DESeq2/edgeR, and calculate TPMs from those instead? I'm not particularly comfortable with skipping between-sample normalization, but I have a feeling that's the norm these days.
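
    Just to make the setup concrete, this is roughly the computation I mean by log(TPM) — a minimal sketch with made-up numbers; in practice the estimated counts and effective lengths come from the quantifier's per-transcript output, and log1p is just one common way of handling zeros:

    Code:
        import numpy as np

        # Made-up values for illustration; real ones come from the quantifier's
        # per-transcript output (estimated counts and effective lengths).
        est_counts = np.array([120.0, 3500.0, 75.0])    # expected read counts
        eff_length = np.array([1500.0, 2200.0, 900.0])  # effective lengths (bp)

        rate = est_counts / eff_length        # reads per base of transcript
        tpm = rate / rate.sum() * 1e6         # rescale so the sample sums to one million
        log_tpm = np.log1p(tpm)               # log(1 + TPM) sidesteps log(0)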

  • #2
    Somebody more knowledgeable may correct me, but the statistical methods used by DESeq2 rely on having the raw read counts to calculate power and significance, and therefore can't use normalized values like TPM. (There's a huge statistical difference between seeing one count in a million reads and a thousand counts in a billion, and that difference is lost in normalization.)
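
    To put rough numbers on that (a toy Poisson-style illustration, not DESeq2's actual model): both cases below work out to "1 per million", but the uncertainty on the raw count is completely different.

    Code:
        import math

        # Same relative abundance, very different amounts of evidence.
        for count, depth in [(1, 1_000_000), (1_000, 1_000_000_000)]:
            per_million = count / depth * 1e6
            rel_sd = 1 / math.sqrt(count)   # approx. Poisson coefficient of variation
            print(f"{count} reads of {depth}: {per_million:.1f} per million, "
                  f"~{rel_sd:.0%} relative SD on the count")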



    • #3
      Originally posted by cmbetts:
      Somebody more knowledgeable may correct me, but the statistical methods used by DESeq2 rely on having the raw read counts to calculate power and significance, and therefore can't use normalized values like TPM. (There's a huge statistical difference between seeing one count in a million reads and a thousand counts in a billion, and that difference is lost in normalization.)
      Of course I know that tools like DESeq2 can't take TPM. The options I mentioned above are:
      1. Take the TPM from the quantifier directly into the downstream analysis.
      2. Take the expected read counts from the quantifier, normalize them with DESeq2 (or similar), and then calculate TPM from the normalized counts (a rough sketch of that step is below).
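
      For option 2, the between-sample step I have in mind is something like DESeq2's median-of-ratios size factors. A minimal numpy sketch of that idea (not the package's actual implementation, and the count matrix is made up):

      Code:
          import numpy as np

          def median_of_ratios_size_factors(counts):
              """counts: genes x samples array of expected counts."""
              with np.errstate(divide="ignore"):
                  log_counts = np.log(counts)
              log_geo_mean = log_counts.mean(axis=1)        # per-gene log geometric mean
              usable = np.isfinite(log_geo_mean)            # keep genes with no zero count
              log_ratios = log_counts[usable] - log_geo_mean[usable, None]
              return np.exp(np.median(log_ratios, axis=0))  # one size factor per sample

          # Made-up genes x samples matrix of expected counts from the quantifier.
          counts = np.array([[110.0,  95.0, 240.0],
                             [ 12.0,  10.0,  30.0],
                             [500.0, 430.0, 990.0]])
          size_factors = median_of_ratios_size_factors(counts)
          normalized_counts = counts / size_factors         # library-size-corrected counts

      (In practice I'd pull the counts into the workflow with something like tximport rather than typing them in.)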
