Something wrong with bam_merge?
TopHat run (v2.0.6)
Bowtie version: 0.12.7.0
Samtools version: 0.1.18.0
tophat --bowtie1 -o ./SRR486241 --solexa-quals -p 10 -g 1 --no-coverage-search --no-novel-juncs --library-type fr-firststrand -G genes.gtf GRCh37 ESC_1.fastq ESC_2.fastq
Reporting output tracks
[FAILED]
Error running /usr/local/bin/tophat_reports --min-anchor 8 --splice-mismatches 0 --min-report-intron 50 --max-report-intron 500000 --min-isoform-fraction 0.15 --output-dir ./SRR486241/ --max-multihits 1 --max-seg-multihits 10 --segment-length 25 --segment-mismatches 2 --min-closure-exon 100 --min-closure-intron 50 --max-closure-intron 5000 --min-coverage-intron 50 --max-coverage-intron 20000 --min-segment-intron 50 --max-segment-intron 500000 --read-mismatches 2 --read-gap-length 2 --read-edit-dist 2 --read-realign-edit-dist 3 --max-insertion-length 3 --max-deletion-length 3 --bowtie1 -z gzip -p10 --inner-dist-mean 50 --inner-dist-std-dev 20 --gtf-annotations genes.gtf --gtf-juncs ./SRR486241/tmp/genes.juncs --no-closure-search --no-coverage-search --no-microexon-search --solexa-quals --library-type fr-firststrand --sam-header ./SRR486241/tmp/GRCh37_genome.bwt.samheader.sam --report-discordant-pair-alignments --report-mixed-alignments --samtools=/Users/yingtao/Jerry/Tools/samtools/samtools --bowtie2-max-penalty 6 --bowtie2-min-penalty 2 --bowtie2-penalty-for-N 1 --bowtie2-read-gap-open 5 --bowtie2-read-gap-cont 3 --bowtie2-ref-gap-open 5 --bowtie2-ref-gap-cont 3 GRCh37.fa ./SRR486241/junctions.bed ./SRR486241/insertions.bed ./SRR486241/deletions.bed ./SRR486241/fusions.out ./SRR486241/tmp/accepted_hits ./SRR486241/tmp/left_kept_reads.m2g.bam,./SRR486241/tmp/left_kept_reads.m2g_um.mapped.bam,./SRR486241/tmp/left_kept_reads.m2g_um.candidates ./SRR486241/tmp/left_kept_reads.bam ./SRR486241/tmp/right_kept_reads.m2g.bam,./SRR486241/tmp/right_kept_reads.m2g_um.mapped.bam,./SRR486241/tmp/right_kept_reads.m2g_um.candidates ./SRR486241/tmp/right_kept_reads.bam
Error: bam_merge failed to open BAM file ./SRR486241/tmp/right_kept_reads.m2g_um.candidates3.bam
The input FASTQ files are very large: about 200 million paired-end read pairs. When I reran the same command on only the first 10,000 pairs, everything completed without any failure.
So could this error be caused by the input FASTQ files being too large?
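For reference, the subsampling test described above can be reproduced with a short shell sketch. The filenames come from the TopHat command above; the 10,000-pair count and the `test_*.fastq` output names are just illustrative choices (each FASTQ record is 4 lines, so pairs must be converted to line counts):

```shell
# Subsample the first N_PAIRS read pairs from each mate file.
# A FASTQ record is 4 lines, so 10,000 reads = 40,000 lines per file.
N_PAIRS=10000
N_LINES=$(( N_PAIRS * 4 ))
head -n "$N_LINES" ESC_1.fastq > test_1.fastq
head -n "$N_LINES" ESC_2.fastq > test_2.fastq
```

Taking the same number of lines from the head of both mate files keeps the pairs in sync, assuming the two FASTQ files list reads in the same order.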