Hi folks, just a quick observation and question.
I recently ran TopHat on human-derived mRNA 2×100 bp paired-end files with 12,411,993 reads per mate. I used 5 processors and 20 GB of RAM on our SGE cluster. The job report said the maximum memory used was just under 4.25 GB, which surprised me a little.
So I'd (obviously) like to limit the resources I request on the cluster. Does anyone have a feel for how much memory I should request as a function of the number of processors I give TopHat?
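For what it's worth, here's a rough sketch of how I might cap the request next time, assuming `h_vmem` is enforced per slot on the queue (that's common, but site-dependent; on some SGE configs memory limits are per-job). The `smp` parallel environment name, index, and file paths are placeholders for my actual setup:

```shell
#!/bin/bash
# Hypothetical SGE submission sketch -- flag names (h_vmem, the "smp"
# parallel environment) are site-dependent; check with your cluster admins.
#$ -N tophat_run
#$ -cwd
#$ -pe smp 5            # 5 slots, matching TopHat's -p below
#$ -l h_vmem=1G         # per-slot limit: 5 slots x 1G = 5G total,
                        # a little headroom over the ~4.25G observed

# Placeholder index and read file names, not real paths.
tophat -p 5 -o tophat_out hg19_index reads_1.fastq reads_2.fastq
```

Under per-slot accounting the total allowance is slots × h_vmem, so the idea would be to divide the expected peak (plus headroom) by the number of slots rather than requesting a flat 20 GB.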
Thanks,
Sam