  • #16
    Originally posted by kerplunk412 View Post
    I have noticed something similar when quantifying gDNA by Nanodrop. What I saw was values that would vary quite a bit when reading different "drops" from the same tube of gDNA. This was solved by vortexing the DNA for 10 seconds. The idea is that genomic DNA molecules are so large that one microliter might contain a varying amount of them. Following vortexing, the gDNA is in much smaller fragments, which allows it to distribute more evenly in solution, so that every microliter contains a much more similar amount of DNA. Think of it as grabbing handfuls of sand versus handfuls of medium-sized rocks and weighing them: the handful of sand will be very close to the same weight each time, but the rocks will vary much more. I haven't tested the theory about large DNA vs. sheared DNA, but we have tested vortexing DNA for 10 seconds prior to reading on the Nanodrop, and it definitely results in much more consistent readings.
    The mass of the genome in one human cell is about 6.6 pg, so 1 µl of a 10 ng/µl DNA solution contains the DNA of roughly 1,515 cells, which would be about 45,450,000 fragments of 100 kb. Most standard extraction methods yield fragments shorter than 100 kb. So I do not see how one can justify that ~45.5 million fragments in 1 µl would aggregate in solution and give 10x variation between consecutive reads. I think just a gentle flick would be enough to give a homogeneous solution (if the sample was frozen), and vortexing would definitely damage large DNA fragments.
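
    To make the arithmetic above easy to check, here is a back-of-the-envelope sketch (the 6.6 pg per cell and 100 kb fragment size are the figures quoted above; the ~6.4 Gb diploid genome length is an added assumption that matches 6.6 pg):

    Code:
    # Rough check of the fragment count quoted above (illustrative only).
    # Assumptions: 6.6 pg of DNA per diploid human cell, a 10 ng/ul sample,
    # and ~100 kb fragments after a typical extraction.
    MASS_PER_CELL_PG = 6.6        # diploid genome mass, pg
    CONC_NG_PER_UL = 10.0         # sample concentration, ng/ul
    GENOME_BP = 6.4e9             # diploid genome length, bp (assumed)
    FRAGMENT_BP = 100_000         # assumed fragment size, bp

    dna_pg_per_ul = CONC_NG_PER_UL * 1000              # 10 ng = 10,000 pg
    genome_copies = dna_pg_per_ul / MASS_PER_CELL_PG   # ~1,515 cell equivalents
    fragments_per_ul = genome_copies * GENOME_BP / FRAGMENT_BP

    print(f"genome copies per ul: {genome_copies:,.0f}")
    print(f"fragments per ul:     {fragments_per_ul:,.0f}")  # ~9.7e7

    With the ~6.4 Gb diploid length this comes out closer to 10^8 fragments per µl; the ~45 million above corresponds to a ~3 Gb haploid length. Either way the count is in the tens of millions, which is the point of the argument.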

    Comment


    • #17
      Originally posted by nucacidhunter View Post
      The linear detection range of PicoGreen spans three orders of magnitude, from 1 ng/ml to 1000 ng/ml DNA concentration. As long as one calibrates the fluorometer at 0 and at the 1000 ng/ml standard, there is no need for any other concentration in between or for a standard curve; it seems to be a waste of money and time. With correct calibration one only needs to multiply the concentration reading by the dilution factor to calculate the original concentration of the DNA sample.
      The 500-reaction kit is cheap (about 60 cents per sample), and it takes only a few minutes to add a few extra standards. You are correct about the linear detection range; however, I think you should review linear regression.
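
      To spell out the linear-regression point: a two-point (blank plus top standard) calibration forces the line through those two readings, whereas a least-squares fit over several standards also estimates the intercept and exposes any non-linearity or pipetting error in an individual standard. A minimal sketch, with invented fluorescence values purely for illustration:

      Code:
      # Two-point calibration vs. regression over several standards (illustrative).
      import numpy as np

      std_conc = np.array([0.0, 10.0, 100.0, 500.0, 1000.0])         # ng/ml
      std_fluor = np.array([55.0, 310.0, 2600.0, 12400.0, 24600.0])  # hypothetical

      # Two-point calibration: slope from the blank and the top standard only.
      slope_2pt = (std_fluor[-1] - std_fluor[0]) / (std_conc[-1] - std_conc[0])

      # Linear regression over all standards (slope and intercept).
      slope, intercept = np.polyfit(std_conc, std_fluor, 1)

      sample_fluor = 1800.0   # hypothetical unknown within the linear range
      dilution = 100          # e.g. 2 ul of sample into 200 ul assay volume

      conc_2pt = (sample_fluor - std_fluor[0]) / slope_2pt * dilution
      conc_fit = (sample_fluor - intercept) / slope * dilution
      print(f"two-point estimate:  {conc_2pt:,.0f} ng/ml")
      print(f"regression estimate: {conc_fit:,.0f} ng/ml")

      With well-behaved standards the two estimates agree closely; the extra standards are what tell you that they agree.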

      Comment


      • #18
        Originally posted by nucacidhunter View Post
        The mass of the genome in one human cell is about 6.6 pg, so 1 µl of a 10 ng/µl DNA solution contains the DNA of roughly 1,515 cells, which would be about 45,450,000 fragments of 100 kb. Most standard extraction methods yield fragments shorter than 100 kb. So I do not see how one can justify that ~45.5 million fragments in 1 µl would aggregate in solution and give 10x variation between consecutive reads. I think just a gentle flick would be enough to give a homogeneous solution (if the sample was frozen), and vortexing would definitely damage large DNA fragments.
        Your logic makes sense to me, so maybe the difference in size before and after vortexing does not explain my observations. However, I tested this fairly rigorously, and a few of my colleagues have tried it as well, so I can say with confidence that with the gDNA samples I was working with, a gentle flick was not enough to get a consistent reading; vortexing was required. As for damaging the DNA, I am pretty sure 10 seconds of vortexing will not cause enough fragmentation to matter for most NGS applications. If it were that easy to fragment DNA into small pieces, no one would need to buy a Covaris!

        Edit: I should also mention that the variation seen before vortexing was at most about 2x. Variation after vortexing was ~1%.
        Last edited by kerplunk412; 03-05-2015, 04:52 PM.
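
        One way to reconcile the two observations: if a 1 µl read really sampled tens of millions of fragments independently, counting statistics alone would predict far less than 1% read-to-read variation, so a ~2x spread has to come from uneven distribution of viscous, high-molecular-weight DNA rather than from how many fragments happen to land in the drop. A rough simulation under that assumption (Poisson sampling per read, purely illustrative):

        Code:
        # Expected read-to-read spread from counting statistics alone (illustrative).
        # Assumes the number of fragments in each 1 ul read is Poisson-distributed.
        import numpy as np

        rng = np.random.default_rng(0)
        for mean_fragments in (100, 10_000, 45_000_000):    # "rocks" ... "sand"
            reads = rng.poisson(mean_fragments, size=1000)
            cv = reads.std() / reads.mean()                  # coefficient of variation
            print(f"mean {mean_fragments:>11,}: CV ~ {cv:.2%}")

        # CV scales as 1/sqrt(N): roughly 10% at 100 fragments and ~0.015% at
        # 45 million, so the ~2x spread seen before vortexing is not simple
        # counting noise; it is consistent with the uneven-mixing idea above.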

        Comment
