Cannot Normalize DNA for GBS/RADseq, Nonsense Qubit Results

Our lab is carrying out an initial pilot test of two different RADseq/GBS protocols to see which will work best for us. Both protocols emphasize the importance of normalizing DNA extractions to around 10–20 ng/µl, measured on a Qubit or similar, before beginning the library prep. So far this deceptively simple first step has been nearly impossible:

I measure my 48 samples on the Qubit, individually dilute them to bring them to the same concentration, and then remeasure; the values are most often nowhere near what I expect. Some are in the ballpark (possibly by coincidence), but many deviate by 5–100% from where they should be. For example, I will measure four samples that each read around 20 ng/µl, dilute them all to half their original concentration, and on remeasuring get varied results like 3, 9, 19, and 22 ng/µl instead of the expected 10 ng/µl. Since this problem arose, I have spent the past month on just about every iteration of troubleshooting I can think of and can't make these nonsensical values go away. I have:

- used both the broad-range and high-sensitivity kits.
- ordered a new kit to see if our old one was compromised.
- used a different Qubit at a different facility.
- troubleshot with "cleaner" DNA (e.g. lambda DNA and oligos from biotech companies).

...and the above-mentioned problem still occurs in all of these situations.

It would be great to get input from someone who has faced similar problems and hear what they did about it. At this point my instinct is to measure each sample once, trust the number, and move on, but I am very nervous about having over- or under-represented samples in a multiplexed library. I am also very curious to know what is actually happening behind the scenes when a library prep protocol says something like "normalize all DNA to 20 ng/µl", because I am starting to feel like this is basically impossible...
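For concreteness, the arithmetic behind "normalize to X ng/µl" is just C1V1 = C2V2 applied per sample; here is a minimal sketch of the calculation I'm attempting (the sample names, concentrations, and volumes below are all made up):

Code:
# C1V1 = C2V2 normalization sketch; all values are made-up examples.
TARGET = 10.0      # ng/ul, desired final concentration
STOCK_VOL = 20.0   # ul of stock DNA committed per sample

samples = {"S1": 20.0, "S2": 22.0, "S3": 19.5, "S4": 21.0}  # measured ng/ul

for name, c1 in samples.items():
    final_vol = c1 * STOCK_VOL / TARGET   # V2 = C1*V1 / C2
    diluent = final_vol - STOCK_VOL       # water to add (negative => sample too dilute)
    print(f"{name}: add {diluent:.1f} ul to {STOCK_VOL:.0f} ul stock "
          f"-> {final_vol:.1f} ul at {TARGET:.0f} ng/ul")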


Thanks in advance.

#2
Have you run the samples out on a gel to look at their relative levels? Not that I understand your problem, but it is good to get an orthogonal measurement when things look weird (and to check for quality issues as well).
Providing nextRAD genotyping and PacBio sequencing services. http://snpsaurus.com



#3
Thanks for the suggestion. We haven't run gels yet; I don't have any experience running gels for quantification, but I have been reading about it. It seems like a good side check that the input samples are at least at roughly the same relative concentrations...



#4
There are very careful ways to do it, with ladders to help estimate the amounts, but you can also just run an agarose gel and see what you see; in this case even that would help.
Providing nextRAD genotyping and PacBio sequencing services. http://snpsaurus.com



#5
I can think of three causes:

1- Pipetting inaccuracy, caused by calibration/maintenance issues or the operator.
2- Viscous material in the DNA preps (mostly in non-column-based extractions). This shows up as streaks in the wells of a gel. It can be avoided by spinning the DNA tube or plate at high speed for 5 minutes to pellet the insoluble material and then pipetting from the top.
3- Improper mixing of the DNA with the reagents, or after dilution.



#6
Originally posted by nucacidhunter View Post
I can think of three causes:

1- Pipetting inaccuracy, caused by calibration/maintenance issues or the operator.
2- Viscous material in the DNA preps (mostly in non-column-based extractions). This shows up as streaks in the wells of a gel. It can be avoided by spinning the DNA tube or plate at high speed for 5 minutes to pellet the insoluble material and then pipetting from the top.
3- Improper mixing of the DNA with the reagents, or after dilution.

Okay, thank you, this makes a lot of sense. We do have viscous material in our CTAB extractions, which is one of the explanations we came up with for why this is happening. When I reduce a sample's volume in a SpeedVac from 100 µl to 50 µl, the material gets a lot more viscous. I have been trying to mix everything very well before measuring on the Qubit, but now I realize that maybe I should be spinning the samples down first and taking only the aqueous layer from the top.

The other option, I guess, is to re-extract with a column extraction; I was just worried about getting enough DNA, as we are already riding a thin line as far as that is concerned...



#7
One way to get rid of viscous material is a column clean-up. In this case the sample needs to be diluted before adding the binding buffer (as much as the kit's binding-buffer volume allows) to prevent clogging the column. In addition, the DNA concentration can be adjusted through the elution volume.



#8
Okay, thank you very much; I will go from there... Do you have any feeling for whether or not I should be concerned about DNA loss with column extractions or column clean-ups? I know that is a common fear with columns, but I don't know how well founded it is. We have been told by the manufacturer that there should be 95% recovery unless fragments are over 50 kb. I have no idea whether we have DNA fragments that big. I assume I am probably being overly fearful, since I imagine others have done GBS/RADseq protocols with column-extracted DNA...



#9
In this case you only need to do a clean-up, as the DNA has already been extracted. Sample loss depends on the column specifications, but you can maximise recovery by eluting with hot buffer and doing a double elution. In my experience 10–15% loss is normal. In your case it will also depend on how strongly the DNA is bound to the viscous material, which may pass through the column. The best approach is to trial the clean-up on a less precious sample, quantifying with dsDNA-specific reagents before and after, because only the dsDNA will contribute to the final library.
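To put numbers on that trial, compare total mass (concentration × volume) before and after the clean-up; a minimal sketch, with made-up readings:

Code:
# Percent recovery across a clean-up: total ng eluted / total ng loaded.
# The concentrations and volumes below are invented for illustration.
def percent_recovery(conc_in, vol_in, conc_out, vol_out):
    """Concentrations in ng/ul, volumes in ul."""
    return 100.0 * (conc_out * vol_out) / (conc_in * vol_in)

# e.g. 50 ul at 20 ng/ul loaded, eluted in 30 ul reading 28 ng/ul on the Qubit
print(f"{percent_recovery(20.0, 50.0, 28.0, 30.0):.0f}% recovered")  # -> 84%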



#10
Okay, that makes a lot of sense. Thanks so much for your help, everyone; we've been struggling with this for a while, and it's nice to have something else to go on.



#11
Hello,

Did you re-run the undiluted DNA side by side with the diluted DNA as a control?

What do the raw fluorescence readings for the controls look like between the two runs?

Assuming you have a homogeneous mixture of DNA with no clumping, try adding additional Qubit standards. I have seen a fair deal of variation in the past, so I now run 10 standards (2 × each of 0, 25, 50, 75, and 100 ng) and plot the curve myself to calculate my DNA concentrations.
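Plotting it yourself is just a least-squares line through the standards; a minimal sketch with numpy (the fluorescence readings below are invented):

Code:
import numpy as np

# Duplicate standards: ng of DNA per tube and (invented) raw fluorescence
std_ng  = np.array([0, 0, 25, 25, 50, 50, 75, 75, 100, 100])
std_flu = np.array([3, 5, 260, 251, 512, 505, 760, 748, 1010, 990])

# Fit fluorescence = slope * ng + intercept
slope, intercept = np.polyfit(std_ng, std_flu, 1)

def ng_from_fluorescence(f):
    """Invert the fitted standard curve for a sample reading."""
    return (f - intercept) / slope

print(f"{ng_from_fluorescence(430.0):.1f} ng in tube")  # invented sample reading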



#12
Originally posted by Terminator View Post
Hello,

Did you re-run the undiluted DNA side by side with the diluted DNA as a control?

What do the raw fluorescence readings for the controls look like between the two runs?

Assuming you have a homogeneous mixture of DNA with no clumping, try adding additional Qubit standards. I have seen a fair deal of variation in the past, so I now run 10 standards (2 × each of 0, 25, 50, 75, and 100 ng) and plot the curve myself to calculate my DNA concentrations.

I did run undiluted controls a couple of times, but I should start doing it every time. If I run undiluted controls over and over (without changing anything) there is variation of maybe 5–20% or so, which I have no problem with; at least it's qualitatively similar. The larger error comes in when we change the dilution, and especially when we concentrate the DNA to a smaller volume, where it becomes more viscous, so I imagine, as suggested above, that this is a serious source of error.

That is a great idea to run the extra standards across the range and then plot them; thanks!



#13
Originally posted by Myrmex View Post
I did run undiluted controls a couple of times, but I should start doing it every time. If I run undiluted controls over and over (without changing anything) there is variation of maybe 5–20% or so, which I have no problem with; at least it's qualitatively similar. The larger error comes in when we change the dilution, and especially when we concentrate the DNA to a smaller volume, where it becomes more viscous, so I imagine, as suggested above, that this is a serious source of error.

That is a great idea to run the extra standards across the range and then plot them; thanks!

I'm not sure why the Qubit only requires two standards (seems crazy).

I found a post (I don't recall the specific thread) where a user recommended running larger volumes for dilute DNA samples. This may also be worth a shot for reducing variability.

Best of luck!



#14
I have noticed something similar when quantifying gDNA on a Nanodrop: values that varied quite a bit when reading different "drops" from the same tube of gDNA. This was solved by vortexing the DNA for 10 seconds. The idea is that genomic DNA molecules are so large that any given microliter may contain a varying amount of these large molecules. After vortexing, the gDNA is in much smaller fragments, which distributes it more evenly in solution, so every microliter carries a much more similar amount of DNA.

Think of it as grabbing handfuls of sand versus handfuls of medium-sized rocks and weighing them: each handful of sand will weigh very nearly the same, but the rocks will vary much more. I haven't tested the theory about large versus sheared DNA, but we have tested vortexing DNA for 10 seconds before reading on the Nanodrop, and it definitely gives much more consistent readings.
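The sand-versus-rocks intuition is easy to simulate: the same total mass split into fewer, larger pieces gives noisier small aliquots. A rough sketch (the fragment counts and aliquot fraction are invented):

Code:
import numpy as np

rng = np.random.default_rng(0)

def aliquot_cv(n_fragments, n_trials=10_000, aliquot_frac=0.001):
    """Fixed total mass split into n_fragments equal pieces; each piece
    independently lands in a 0.1% aliquot. Return the CV of aliquot mass."""
    k = rng.binomial(n_fragments, aliquot_frac, size=n_trials)
    mass = k / n_fragments                    # aliquot mass (total mass = 1)
    return mass.std() / mass.mean()

print(f"rocks (1e4 intact fragments): CV ~ {aliquot_cv(10_000):.2f}")      # ~0.32
print(f"sand (1e7 sheared fragments): CV ~ {aliquot_cv(10_000_000):.3f}")  # ~0.010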



#15
Originally posted by Terminator View Post
I'm not sure why the Qubit only requires two standards (seems crazy).

PicoGreen's detection response is linear over a wide range, from about 1 ng/ml up to 1000 ng/ml DNA. As long as the fluorometer is calibrated at 0 and at 1000, there is no need for any other concentration in between, or for a full standard curve; it seems a waste of money and time. With correct calibration you only need to multiply the fluorescence-derived value by the dilution factor to calculate the original concentration of the DNA sample.
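In other words, the two standards define a line and everything else is interpolation; a minimal sketch (the raw readings and dilution factor below are invented):

Code:
# Two-point calibration: the 0 and top standards define the line.
F0, F_TOP = 4.0, 2000.0   # invented raw fluorescence of the 0 and 1000 ng/ml standards
TOP_CONC = 1000.0         # ng/ml

def sample_conc(f_sample, dilution_factor=1.0):
    """Interpolate between the two standards, then scale back by the
    dilution applied when the sample went into the assay tube."""
    return (f_sample - F0) / (F_TOP - F0) * TOP_CONC * dilution_factor

# e.g. a sample diluted 1:200 into the assay tube reads 900 units
print(f"{sample_conc(900.0, 200):.0f} ng/ml in the original sample")  # ~89780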

