Our lab is carrying out an initial pilot test of two different RADseq/GBS protocols to see which will work best for us. Both protocols emphasize the importance of normalizing DNA extractions to roughly 10–20 ng/µl (measured on a Qubit or similar) before beginning the library prep. So far this deceptively simple first step has been nearly impossible:
I measure my 48 samples on the Qubit, then individually dilute them to bring them to the same concentration, and when I re-measure, the values are most often nowhere near what I expect. Some are in the ballpark (possibly by coincidence), but many are off by anywhere from 5% to 100%. For example, I will measure four samples that all read around 20 ng/µl, dilute each to half its original concentration, and on re-measuring get values like 3 ng/µl, 9 ng/µl, 19 ng/µl, and 22 ng/µl instead of the expected 10 ng/µl (the dilution arithmetic I'm using is sketched after the list below). Since this problem arose, I have spent the past month trying just about every troubleshooting step I can think of and can't make these nonsensical values go away. I have:
- used the broad-range and high-sensitivity kits.
- ordered a new kit to see if our old one was compromised.
- used a different Qubit at a different facility.
- troubleshot with "cleaner" DNA (e.g., lambda DNA and oligos from biotech companies).
...and the above-mentioned problem still occurs in all of these situations.
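For reference, this is the dilution arithmetic I am using when I "normalize" (a minimal Python sketch of the standard C1V1 = C2V2 calculation; the readings, target concentration, and volumes below are made up for illustration), in case I am misunderstanding something at this step:

```python
def dilution_volumes(c1_ng_per_ul, target_ng_per_ul, final_volume_ul):
    """Return (sample_volume_ul, diluent_volume_ul) needed to hit the target concentration."""
    if c1_ng_per_ul < target_ng_per_ul:
        raise ValueError("sample is already below the target concentration")
    sample_vol = target_ng_per_ul * final_volume_ul / c1_ng_per_ul  # C1*V1 = C2*V2
    return sample_vol, final_volume_ul - sample_vol

# Example: four samples that all read ~20 ng/ul on the Qubit, diluted to 10 ng/ul in 50 ul
for qubit_reading in (20.4, 19.8, 21.1, 20.0):
    dna, water = dilution_volumes(qubit_reading, 10.0, 50.0)
    print(f"{qubit_reading:5.1f} ng/ul -> {dna:4.1f} ul DNA + {water:4.1f} ul water/low-TE")
```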
It would be great to get input from someone who has faced similar problems and hear what they did about it. At this point my instinct is to measure each sample once, trust the number, and move on, but I am very nervous about having over- or under-represented samples in the multiplexed library. I am also very curious to know what is actually happening behind the scenes when a library prep protocol says something like "We normalized all DNA to 20 ng/µl" at the outset, as I am starting to feel like this is basically impossible...
Thanks in advance.