Really dumb question: how do you mathematically convert reads into gigabases and then calculate coverage?
If I have 600 million clusters sequenced at 2×100, is it 600 million × 200 = 120,000 — but is that in gigabases or megabases?
Then do you divide 120,000 Mb / 3,000 Mb (~3 billion bases in the human genome) = 40, so 40x coverage?
I ask because I'm trying to figure out how many clusters/lane I need to get when sequencing a whole genome, and then how many lanes our lab would need to run. Would also be curious what people are doing typically nowadays.
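Spelling out my arithmetic as a quick sketch (the cluster count, read length, and genome size here are just the numbers from my question, not specs for any particular instrument):

```python
# Coverage = total sequenced bases / genome size
clusters = 600e6          # 600 million clusters
reads_per_cluster = 2     # paired-end (2x100)
read_length = 100         # bp per read

total_bases = clusters * reads_per_cluster * read_length
print(f"Total: {total_bases / 1e9:.0f} Gb")      # 120 Gb (i.e. 120,000 Mb)

genome_size = 3e9         # human genome, ~3 billion bp
coverage = total_bases / genome_size
print(f"Coverage: {coverage:.0f}x")              # 40x
```

So 120,000 is in megabases: 600 million clusters at 2×100 gives 120 gigabases, and 120 Gb over a 3 Gb genome is 40x.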