Chad Nusbaum showed an interesting slide during his talk at the recent "Sequencing at the Tipping Point" meeting in San Diego. It plotted Moore's law and the sequencing cost against time. Personally, I found the slide particularly apt, because it brought some hard data to bear on something most of us have noticed. In essence: "my data sets don't cost any more, but my computers are too slow to process them!"
I don't have access to Chad's actual slide, but I found basically the same figure here:
(It is about 1/2 way down the page and is entitled "Baseline information")
Until 2005, the slope of the sequencing cost line (semi-log plot of the falling cost of DNA sequencing) roughly matched that of Moore's law. Then the sequencing cost line turns sharply steeper, pulling rapidly away from the dawdling Moore's Law line.
Moore's Law, to oversimplify, says you will pay half as much money for a computer of equal "power" every X years, where X is somewhere between 1 and 2 years. If you accept my analysis and The Broad's graph, then up until 2005 you could spend roughly the same fraction of your budget on new computers every time you bought a new sequencer. But after 2005, something happened (presumably the arrival of second-generation sequencers), and you needed to spend more on computation. My question is: "how much more?"
To extrapolate further from The Economist's cartoon graph on the page I link to above, I would estimate the pre-2005 "sequencing cost halving time" at about 1.4 years. After 2005 it looks to have dropped to about 0.5 years. Meanwhile, Moore's Law is plotted with a steady halving time of about 1.2 years in the same cartoon.
Please correct me if I have the math wrong, but as long as these relative rates hold, a static sequencing budget buys twice as much sequence every 0.5 years while a dollar buys twice as much compute only every 1.2 years, so it seems one needs to double one's expenditure on computation roughly every 1/(1/0.5 - 1/1.2), or about 0.86 years, to keep pace.
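Here is a quick back-of-the-envelope check of that arithmetic in Python. The halving times are just my eyeballed estimates from the cartoon graph, so treat the numbers as illustrative rather than definitive:

# How fast must compute spending grow to keep pace with a static sequencing
# budget?  All halving times are in years and are rough eyeball estimates
# from the cartoon graph, not measured values.

moore_halving = 1.2          # cost of a fixed amount of compute halves every ~1.2 yr
seq_halving_pre2005 = 1.4    # cost per base halved every ~1.4 yr before 2005
seq_halving_post2005 = 0.5   # ...and every ~0.5 yr after 2005

def compute_budget_doubling_time(seq_halving, moore_halving):
    """Years for the required compute budget to double, assuming a static
    sequencing budget: data per dollar doubles every seq_halving years,
    compute per dollar doubles every moore_halving years, so the budget
    must grow at the difference of the two rates."""
    rate = 1.0 / seq_halving - 1.0 / moore_halving   # net doublings per year
    if rate <= 0:
        return float("inf")   # Moore's Law keeps up; no budget growth needed
    return 1.0 / rate

print(compute_budget_doubling_time(seq_halving_pre2005, moore_halving))    # inf (pre-2005: compute kept pace)
print(compute_budget_doubling_time(seq_halving_post2005, moore_halving))   # ~0.86 yr (post-2005)

With those numbers, the pre-2005 case comes out as "no growth needed" (Moore's Law was, if anything, slightly ahead of sequencing), and the post-2005 case comes out to roughly 0.86 years per doubling of compute spending.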
Comments?
--
Phillip