Greetings.
I have been using Velvet 1.2.10 to create de novo assembled contigs with good results using 100bp paired-end reads.
Now I have more and different data: the reads are longer, and I now trim for quality using Btrim64. As a result, from the two original sets of paired-end reads, 100 and 250bp reads (with 400 and 600 bp insert sizes, respectively), I have the following read files:
Sample1_100bp_R1.fastq.pe
Sample1_100bp_R2.fastq.pe
Sample1_100bp_R1.fastq.se
Sample1_100bp_R2.fastq.se
Sample1_250bp_R1.fastq.pe
Sample1_250bp_R2.fastq.pe
Sample1_250bp_R1.fastq.se
Sample1_250bp_R2.fastq.se
(The single-end reads result when one mate of a pair is discarded during quality trimming.)
I have 3 questions:
Is it possible to combine all of these data for a de novo assembly in Velvet?
Are the 250bp reads still considered short reads or are they long reads?
Is there a problem now that the read-length is no longer consistent due to trimming?
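For reference, this is the kind of invocation I had in mind. It is only a sketch: it assumes Velvet was compiled with CATEGORIES >= 3 (to get a third read channel for the singles) and with MAXKMERLENGTH large enough for the k-mer chosen, and the k-mer value 61 is just a placeholder, not a recommendation.

```shell
# Sketch only: multiple libraries as separate channels in velveth.
# Channel 1: 100bp paired-end reads (400bp insert)
# Channel 2: 250bp paired-end reads (600bp insert)
# Channel 3: all surviving single-end reads pooled together
velveth asm_dir 61 \
  -shortPaired  -fastq -separate Sample1_100bp_R1.fastq.pe Sample1_100bp_R2.fastq.pe \
  -shortPaired2 -fastq -separate Sample1_250bp_R1.fastq.pe Sample1_250bp_R2.fastq.pe \
  -short3 -fastq Sample1_100bp_R1.fastq.se Sample1_100bp_R2.fastq.se \
                 Sample1_250bp_R1.fastq.se Sample1_250bp_R2.fastq.se

# Insert lengths are given per channel at the velvetg stage
velvetg asm_dir -ins_length 400 -ins_length2 600 -exp_cov auto -cov_cutoff auto
```

Would something like this be the right way to combine the libraries, or should the files be organized differently?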
Thanks!