  • divon
    replied
    You might also want to try my program genozip. It usually compresses FASTQ 2x-5x better than gzip/pigz/xz/bzip2.



  • Torst
    replied
    Originally posted by darked89 View Post
    so far we compress a large number of giant FASTQ files using gzip. I just came upon this parallel version
    of gzip called pigz:

    It is about 3.8x faster on a 4-core machine than gzip.
    Before falling for it head first, a simple question: has anybody used it for some time without problems?
    Yes, I've been using it for about a year for compression tasks, e.g. compressing Illumina FASTQ files. On our 8-core SMP system it is about 7.6x faster; gzip definitely seems CPU-bound rather than I/O-bound.
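pigz gets its speed-up by cutting the input into blocks and compressing them on separate threads; a concatenation of gzip members is itself a valid gzip stream, so the pieces can simply be joined. A minimal Python sketch of that idea (not pigz itself — pigz additionally primes each block with the previous 32 KB of input so the compression ratio barely suffers, which this sketch skips):

```python
import gzip
import io
from concurrent.futures import ThreadPoolExecutor


def parallel_gzip(data: bytes, chunk_size: int = 128 * 1024, threads: int = 4) -> bytes:
    """Compress independent chunks in parallel and concatenate the gzip members."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # zlib releases the GIL while compressing, so threads genuinely overlap.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        members = list(pool.map(gzip.compress, chunks))
    return b"".join(members)


def gunzip_all(blob: bytes) -> bytes:
    # GzipFile reads through every member of a multi-member stream.
    with gzip.GzipFile(fileobj=io.BytesIO(blob)) as fh:
        return fh.read()


record = b"@read1\nACGTACGTACGT\n+\nIIIIIIIIIIII\n"
fastq = record * 20000  # ~700 KB of FASTQ-like data
packed = parallel_gzip(fastq)
assert gunzip_all(packed) == fastq  # standard gunzip reads it too
```

Because the output is plain gzip, anything that reads fastq.gz today reads pigz output unchanged — which matches the "no problems after a year" experience above.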



  • lh3
    replied
    There are several very good benchmarks of compression algorithms on various inputs; you can google them. Personally, I prefer gzip because:

    1) It is open source and has good APIs (as well as bzip2/lzop/xz).

    2) It is widely available (as well as bzip2).

    3) It achieves reasonable compression ratio (as well as bzip2).

    4) It is very fast on decompression, several times faster than bzip2.

    The last point makes gzip much more practical than bzip2, because we usually compress once but decompress many times. A few people have asked me why BAM is gzip'ed rather than bzip2'ed, given bzip2's better compression ratio; decompression speed is the main reason.
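The decompression asymmetry is easy to see with Python's stdlib bindings for both formats. Absolute numbers are machine-dependent, but on typical FASTQ-like input gzip inflates several times faster than bzip2:

```python
import bz2
import gzip
import time

# FASTQ-like sample: one record repeated many times (~1.7 MB).
record = b"@read1\nACGTACGTACGTACGTACGTACGTACGT\n+\nIIIIIIIIIIIIIIIIIIIIIIIIIIII\n"
fastq = record * 25000

for name, mod in (("gzip", gzip), ("bzip2", bz2)):
    packed = mod.compress(fastq)
    t0 = time.perf_counter()
    for _ in range(10):
        out = mod.decompress(packed)
    elapsed = time.perf_counter() - t0
    assert out == fastq
    print(f"{name}: {len(packed):>8} bytes compressed, {elapsed:.3f} s for 10 decompressions")
```

For a compress-once, decompress-many workload like BAM, the decompression column is the one that matters.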



  • darked89
    replied
    Originally posted by nilshomer View Post
    Note that there is no speed-up on decompression. For archiving, we use pbzip2, which runs with multiple threads when both compressing and decompressing. The only issue is that only some programs (e.g. aligners) support reading directly from gzip- or bzip2-compressed files; in that case, it is much faster (in the aligner code) to decompress gzip files, since the reading is typically not multi-threaded.
    Thank you. It is always good to know that others are using non-standard compressors without running into problems.

    Once I get back to my office in a week or so, I will try to do some benchmarking of gzip/bzip2 in single- and multi-threaded versions. I was also thinking about adding two other algorithms:

    lzop (the very fast one) http://en.wikipedia.org/wiki/Lzop
    xz (faster than (single-threaded) bzip2 and producing smaller files) http://tukaani.org/xz/
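One reason xz produces smaller files is its much larger match window: gzip/DEFLATE can only reference the previous 32 KB, while xz's dictionary is megabytes, so long-range redundancy (common in sequencing data) is caught. A small demonstration with Python's stdlib lzma module, using repeats placed just beyond gzip's reach:

```python
import gzip
import lzma
import random

random.seed(0)
block = bytes(random.randrange(256) for _ in range(50_000))  # incompressible on its own
data = block * 4  # redundancy at a 50 KB distance, beyond gzip's 32 KB window

gz = gzip.compress(data, 9)
xz = lzma.compress(data, preset=6)

# gzip cannot reference matches farther back than 32 KB, so each repeated
# block costs nearly full price; xz's multi-megabyte dictionary catches them.
print(f"input {len(data)} bytes, gzip {len(gz)} bytes, xz {len(xz)} bytes")
assert len(xz) < len(gz)
assert lzma.decompress(xz) == data
```

The trade-off, as noted in the thread, is xz's much slower compression, which is why it suits archiving more than day-to-day pipelines.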

    It would be good to compile a list of programs able to read various compressed formats. I am aware that ABySS can read gz, bz2, or xz, and there should be a few others happy with fastq.gz.

    Darek



  • nilshomer
    replied
    Originally posted by darked89 View Post
    Hi,

    so far we compress a large number of giant FASTQ files using gzip. I just came upon this parallel version
    of gzip called pigz:


    written by the same guy who wrote zlib and gzip itself:


    It is about 3.8x faster on a 4-core machine than gzip.

    Before falling for it head first, a simple question: has anybody used it for some time without problems?

    Best,

    Darek
    Note that there is no speed-up on decompression. For archiving, we use pbzip2, which runs with multiple threads when both compressing and decompressing. The only issue is that only some programs (e.g. aligners) support reading directly from gzip- or bzip2-compressed files; in that case, it is much faster (in the aligner code) to decompress gzip files, since the reading is typically not multi-threaded.
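The aligner-side pattern described here is a single thread inflating the stream while parsing records; decompression speed sits directly on the critical path. A sketch of that kind of streaming fastq.gz reader (file names here are temporaries created for the example):

```python
import gzip
import os
import tempfile

# Write a small gzip-compressed FASTQ file to stream from.
records = "".join(f"@read{i}\nACGTACGT\n+\nIIIIIIII\n" for i in range(1000))
fd, path = tempfile.mkstemp(suffix=".fastq.gz")
os.close(fd)
with gzip.open(path, "wt") as fh:
    fh.write(records)

# Aligner-style reader: decompress on the fly, one 4-line record at a time.
# The inflation happens in this single thread, which is why gzip's fast
# decompressor matters more to aligners than any parallel compressor.
n_reads = 0
with gzip.open(path, "rt") as fh:
    while True:
        header = fh.readline()
        if not header:
            break
        seq = fh.readline()
        fh.readline()  # '+' separator line
        qual = fh.readline()
        n_reads += 1
os.remove(path)
print(n_reads)  # 1000
```

A multi-threaded bzip2 decompressor does not help here unless the reader itself is restructured, which matches the observation that gzip input is the faster path inside aligner code.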



  • darked89
    started a topic Fastq data compression with pigz


    Hi,

    so far we compress a large number of giant FASTQ files using gzip. I just came upon this parallel version
    of gzip called pigz:


    written by the same guy who wrote zlib and gzip itself:


    It is about 3.8x faster on a 4-core machine than gzip.

    Before falling for it head first, a simple question: has anybody used it for some time without problems?

    Best,

    Darek
