  • Fastq data compression with pigz

    Hi,

    So far we have been compressing a large number of giant FASTQ files using gzip. I just came upon pigz, a parallel version of gzip written by the same guy who wrote zlib and gzip itself.

    It is about 3.8x faster than gzip on a 4-core machine.

    Before falling for it head first, a simple question: has anybody used it for some time without problems?

    Best,

    Darek
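
    For reference, a minimal sketch of how one might drive pigz from a script; the file name and thread count here are placeholders, and pigz must be installed and on PATH:

    Code:
    import subprocess

    # Compress a FASTQ file in place with pigz; "reads.fastq" is a placeholder.
    # -p <n> sets the number of compression threads.
    subprocess.run(["pigz", "-p", "4", "reads.fastq"], check=True)
    # As with gzip, the input is replaced by reads.fastq.gz.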

  • #2
    Originally posted by darked89
    Note that there is no speed-up on decompression. For archiving, we use pbzip2, which runs with multiple threads when both compressing and decompressing. The only issue is that some programs (e.g. aligners) can read directly from gzip- or bzip2-compressed files; in that case it is much faster (inside the aligner code) to decompress gzip files, since the reading is typically not multi-threaded.
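
    One way around single-threaded reading inside a downstream tool is to decompress in a separate process and stream over a pipe. A sketch, where "some_aligner -" stands in for a hypothetical aligner reading FASTQ from stdin:

    Code:
    import subprocess

    # Decompress with pigz in its own process and pipe the FASTQ downstream,
    # so decompression does not run inside the (single-threaded) reader.
    decomp = subprocess.Popen(["pigz", "-dc", "reads.fastq.gz"],
                              stdout=subprocess.PIPE)
    align = subprocess.Popen(["some_aligner", "-"], stdin=decomp.stdout)
    decomp.stdout.close()  # let pigz get SIGPIPE if the aligner exits early
    align.wait()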



    • #3
      Originally posted by nilshomer
      Thank you. It is always good to know that others are using non-standard compressors without running into problems because of it.

      Once I get back to my office in a week or so I will try to benchmark gzip/bzip2 in single- and multi-threaded versions (a sketch of the setup is below). I was also thinking about adding two other algorithms:

      lzop (the very fast one) http://en.wikipedia.org/wiki/Lzop
      xz (faster than single-threaded bzip2 and producing smaller files) http://tukaani.org/xz/

      It would also be good to compile a list of programs able to read the various compressed formats. I am aware that ABySS can read gz, bz2, or xz, and there should be a few others happy with fastq.gz.

      Darek
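
      A rough sketch of such a benchmark, driving the command-line compressors from Python. The tool list, flags, and input file name are assumptions; each tool must be installed (and gzip recent enough to support -k):

      Code:
      import os
      import subprocess
      import time

      FASTQ = "sample.fastq"  # placeholder input file

      # -k keeps the input, -f overwrites old output; lzop keeps input by default.
      COMPRESSORS = {
          "gzip":   (["gzip", "-k", "-f"], ".gz"),
          "pigz":   (["pigz", "-k", "-f"], ".gz"),
          "bzip2":  (["bzip2", "-k", "-f"], ".bz2"),
          "pbzip2": (["pbzip2", "-k", "-f"], ".bz2"),
          "xz":     (["xz", "-k", "-f"], ".xz"),
          "lzop":   (["lzop", "-f"], ".lzo"),
      }

      for name, (cmd, suffix) in COMPRESSORS.items():
          t0 = time.time()
          subprocess.run(cmd + [FASTQ], check=True)
          elapsed = time.time() - t0
          ratio = os.path.getsize(FASTQ + suffix) / os.path.getsize(FASTQ)
          print(f"{name:7s} {elapsed:8.1f} s   compressed/original = {ratio:.3f}")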



      • #4
        There are several very good benchmarks of compression algorithms on various inputs; you can google them. Personally, I prefer gzip because:

        1) It is open source and has good APIs (as well as bzip2/lzop/xz).

        2) It is widely available (as well as bzip2).

        3) It achieves reasonable compression ratio (as well as bzip2).

        4) It is very fast on decompression, several times faster than bzip2.

        The last point makes gzip much more practical than bzip2, because we usually compress once but decompress many times. A few people have asked me why BAM is gzip'ed rather than bzip2'ed, which would give a better compression ratio; the last point is the main reason.
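
        A toy illustration of that last point, using Python's built-in gzip and bz2 bindings on synthetic FASTQ-like data ("compress once, decompress many times"):

        Code:
        import bz2
        import gzip
        import time

        # Synthetic FASTQ-like record, repeated; real data compresses differently.
        data = b"@read1\nACGTACGTACGT\n+\nIIIIIIIIIIII\n" * 100_000

        for name, blob, decompress in [
            ("gzip", gzip.compress(data), gzip.decompress),
            ("bzip2", bz2.compress(data), bz2.decompress),
        ]:
            t0 = time.time()
            for _ in range(10):  # decompress many times
                decompress(blob)
            print(f"{name}: {len(blob)} bytes, "
                  f"{time.time() - t0:.2f} s for 10 decompressions")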



        • #5
          Originally posted by darked89
          Before falling for it head first, a simple question: has anybody used it for some time without problems?
          Yes, I've been using it for about a year for compression tasks, e.g. compressing Illumina FASTQ files. On our 8-core SMP system it is about 7.6x faster; gzip definitely seems CPU-bound rather than I/O-bound.



          • #6
            You might also want to try my program genozip. It usually compresses FASTQ 2x-5x better than gzip/pigz/xz/bzip2.
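
            For the curious, a minimal round-trip sketch; the command lines follow genozip's documented basic usage, but treat them as assumptions:

            Code:
            import subprocess

            # Assumed basic usage: genozip compresses to <file>.genozip,
            # genounzip restores the original.
            subprocess.run(["genozip", "reads.fastq"], check=True)
            subprocess.run(["genounzip", "reads.fastq.genozip"], check=True)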
