I have a systematic segmentation fault with cufflinks (v2.0.2). It is independent of the dataset analyzed and is triggered by the input parameters --frag-len-mean (-m) and --frag-len-std-dev (-s), presumably because my fragment mean length and SD are unusual.
I've tried requesting lots of memory, but it didn't make a difference, and I also get the same result using cufflinks' test_data.sam with the S.cerevisiae genome, suggesting that insufficient memory is not the issue.
(This is not like the other cufflinks segmentation fault reported, which is at a different step in the run).
The process stops when estimating frag length distribution, right after the "Inspecting bundle ..." output:
Code:
cufflinks -q -u -p 1 -m 800 -s 200 -g $GTF -M $MASK -b $GENOME.fa data.bam
[10:14:40] Loading reference annotation.
[10:14:43] Loading reference annotation.
[10:14:43] Inspecting reads and determining fragment length distribution.
Processed 189391 loci.
> Map Properties:
>       Normalized Map Mass: 52735571.57
>       Raw Map Mass: 52735571.57
Segmentation fault
Using another GTF or changing cufflinks version (I've tried the last three) also didn't make a difference.
Any of the following changes independently allowed cufflinks to run smoothly:
- using -G instead of -g AND not providing a genome (-b)
- removing the -m and -s parameters
- reducing -m and -s to something more "normal", like 500 and 100 respectively.
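As an aside, in case it helps anyone reproduce or sanity-check the -m/-s values involved: the fragment length mean and SD can be estimated directly from the TLEN field (column 9) of the SAM/BAM, without cufflinks. Below is a minimal pure-Python sketch; the sample records are made up purely for illustration, not taken from my data.

```python
import statistics

def frag_len_stats(sam_lines):
    """Estimate fragment length mean/SD from the SAM TLEN field (column 9).

    Only leftmost mates (TLEN > 0) are counted, so each
    fragment contributes exactly once.
    """
    tlens = []
    for line in sam_lines:
        if line.startswith("@"):           # skip header lines
            continue
        fields = line.rstrip("\n").split("\t")
        tlen = int(fields[8])              # TLEN is the 9th SAM column
        if tlen > 0:
            tlens.append(tlen)
    return statistics.mean(tlens), statistics.stdev(tlens)

# Made-up read pairs: TLEN +800/-800 and +810/-810
sample = [
    "@HD\tVN:1.0\tSO:coordinate",
    "r1\t99\tchr1\t100\t60\t50M\t=\t850\t800\tACGT\tIIII",
    "r1\t147\tchr1\t850\t60\t50M\t=\t100\t-800\tACGT\tIIII",
    "r2\t99\tchr1\t200\t60\t50M\t=\t960\t810\tACGT\tIIII",
    "r2\t147\tchr1\t960\t60\t50M\t=\t200\t-810\tACGT\tIIII",
]
mean, sd = frag_len_stats(sample)
print(mean, sd)    # 805.0 and ~7.07
```

For a real BAM, the equivalent would be piping `samtools view` output into the same function, line by line.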
Any help would be greatly appreciated,
Pierre-Luc