I'm testing Velvet-Oases on a small data set of about a million reads (average length 200 bp). We'll soon be getting a much larger data set, but the memory requirements for even this small assembly are already crazy.
When I turn on read tracking and run Oases with 15 GB of memory, it maxes out and gets killed at around 60-70% of nodes visited. How could this ever work with 10-100 million reads, when the largest-memory instance I can create has only about 70 GB?
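For ballpark planning, a linear-regression estimate of velvetg RAM usage has circulated on the velvet-users list; the coefficients below are that community-posted fit (in KB), not an official formula, so treat the result as a rough sketch only. Note it does not account for the extra overhead of Oases read tracking, which is part of what bites here.

```python
def velvetg_ram_gb(read_len_bp, genome_size_mb, reads_millions, kmer):
    """Rough velvetg RAM estimate (GB), from the community-posted
    regression on the velvet-users list (coefficients in KB).
    Assumption: this is a ballpark fit, not a guarantee, and it
    excludes Oases/read-tracking overhead."""
    kb = (-109635
          + 18977 * read_len_bp
          + 86326 * genome_size_mb
          + 233353 * reads_millions
          - 51092 * kmer)
    return kb / (1024 ** 2)  # KB -> GB

# Example: 1M reads, 200 bp, hypothetical 50 Mb genome, k=31
print(round(velvetg_ram_gb(200, 50, 1, 31), 1))  # already several GB
```

Even at 1 million reads the estimate lands in the multi-GB range, and the read-count term grows linearly, so scaling to 10-100 million reads without digital normalization or partitioning would be painful on a 70 GB instance.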