Fastq data compression with pigz


  • Fastq data compression with pigz

    Hi,

    So far we have been compressing a large number of giant FASTQ files with gzip. I just came across a parallel
    version of gzip called pigz:
    http://zlib.net/pigz/

    written by Mark Adler, one of the authors of zlib and gzip:
    http://en.wikipedia.org/wiki/Mark_Adler

    It is about 3.8x faster than gzip on a 4-core machine.

    Before diving in head first, a simple question: has anybody used it for some time without problems?
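
    In case it is useful, here is a minimal sketch of how we would drive it from Python (assuming the pigz binary is on PATH; the thread count and file pattern are just placeholders):

        # Compress every *.fastq in the current directory with pigz.
        # Assumes the pigz binary is installed and on PATH.
        import glob
        import subprocess

        THREADS = 4  # compression threads handed to pigz via -p

        for fastq in glob.glob("*.fastq"):
            # Like gzip, pigz replaces file.fastq with file.fastq.gz
            subprocess.run(["pigz", "-p", str(THREADS), fastq], check=True)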

    Best,

    Darek

  • #2
    Originally posted by darked89 View Post
    Note that there is no speed-up on decompression. For archiving, we use pbzip2, which runs multiple threads for both compression and decompression. The only issue is that some programs (e.g. aligners) support reading directly from gzip- or bzip2-compressed files. In that case it is much faster (in the aligner code) to decompress gzip files, since the reading is typically not multi-threaded.
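
    For illustration, a minimal Python sketch of that direct-reading pattern (the file name is hypothetical; gzip and bz2 are standard-library modules):

        # Read FASTQ directly from a gzip- or bzip2-compressed file,
        # roughly the way many tools do internally.
        import bz2
        import gzip

        def open_fastq(path):
            """Choose a decompressor based on the file extension."""
            if path.endswith(".gz"):
                return gzip.open(path, "rt")
            if path.endswith(".bz2"):
                return bz2.open(path, "rt")
            return open(path, "rt")

        def count_reads(path):
            with open_fastq(path) as fh:
                return sum(1 for _ in fh) // 4  # 4 lines per FASTQ record

        print(count_reads("reads.fastq.gz"))  # hypothetical file name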

    • #3
      Originally posted by nilshomer View Post
      Thank you. It is always good to know that others are using non-standard compressors and are not running into problems because of it.

      Once I get back to my office in a week or so I will try to do some benchmarking of gzip/bzip2 in single- and multi-threaded versions (a rough sketch of the harness I have in mind is below). I was also thinking about adding two other algorithms:

      lzop (the very fast one) http://en.wikipedia.org/wiki/Lzop
      xz (faster than single-threaded bzip2 and producing smaller files) http://tukaani.org/xz/

      It would be good to put together a list of programs able to read the various compressed formats. I am aware that ABySS can read gz, bz2 or xz, and there should be a few others happy with fastq.gz.
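
      Something along these lines, as a rough Python sketch (the input file name is a placeholder, the thread counts are just examples, and tools that are not installed are skipped):

          # Rough benchmark: time each compressor on the same FASTQ file
          # and report wall-clock time and compression ratio.
          import os
          import shutil
          import subprocess
          import time

          FASTQ = "sample.fastq"  # placeholder input file
          COMPRESSORS = {
              "gzip":   ["gzip", "-c"],
              "pigz":   ["pigz", "-c", "-p", "4"],
              "bzip2":  ["bzip2", "-c"],
              "pbzip2": ["pbzip2", "-c", "-p4"],
              "xz":     ["xz", "-c"],
              "lzop":   ["lzop", "-c"],
          }

          original_size = os.path.getsize(FASTQ)
          for name, cmd in COMPRESSORS.items():
              if shutil.which(cmd[0]) is None:
                  print(f"{name}: not installed, skipped")
                  continue
              out_path = f"{FASTQ}.{name}.out"
              with open(FASTQ, "rb") as src, open(out_path, "wb") as dst:
                  start = time.time()
                  subprocess.run(cmd, stdin=src, stdout=dst, check=True)
                  elapsed = time.time() - start
              ratio = os.path.getsize(out_path) / original_size
              print(f"{name}: {elapsed:.1f} s, ratio {ratio:.3f}")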

      Darek

      • #4
        There are several very good benchmarks of compression algorithms on various inputs; you can google them. Personally, I prefer gzip because:

        1) It is open source and has good APIs (as do bzip2/lzop/xz).

        2) It is widely available (as is bzip2).

        3) It achieves a reasonable compression ratio (as does bzip2).

        4) It is very fast on decompression, several times faster than bzip2.

        The last point makes gzip much more practical than bzip2, because we usually compress once but decompress many times. A few people have asked me why BAM is gzip'ed rather than bzip2'ed, which has a better compression ratio. The last point is the main reason.
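
        As a quick illustration of point 4, one can time a full streaming read of the same data compressed both ways; a small Python sketch (the file names are hypothetical):

            # Time a full streaming read of a .gz and a .bz2 copy of the
            # same data, discarding the decompressed output.
            import bz2
            import gzip
            import time

            def time_decompress(opener, path):
                start = time.time()
                with opener(path, "rb") as fh:
                    while fh.read(1 << 20):  # read in 1 MB chunks
                        pass
                return time.time() - start

            print("gzip :", round(time_decompress(gzip.open, "reads.fastq.gz"), 1), "s")
            print("bzip2:", round(time_decompress(bz2.open, "reads.fastq.bz2"), 1), "s")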

        • #5
          Originally posted by darked89 View Post
          Yes, I've been using it for about a year for compression tasks, e.g. compressing Illumina FASTQ files. On our 8-core SMP system it is about 7.6x faster; gzip definitely seems CPU-bound rather than I/O-bound.
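
          A quick way to check the CPU-bound point is to compare wall-clock time with the CPU time the compressor actually uses; a rough Python sketch for a Unix system (the input file name is a placeholder):

              # Compare wall-clock time with the child process's CPU time.
              # If they are close, the compressor is CPU-bound; if the CPU
              # time is much smaller, it is mostly waiting on I/O.
              import os
              import subprocess
              import time

              before = os.times()
              start = time.time()
              with open("reads.fastq", "rb") as src, open(os.devnull, "wb") as sink:
                  subprocess.run(["gzip", "-c"], stdin=src, stdout=sink, check=True)
              wall = time.time() - start
              after = os.times()
              cpu = (after.children_user - before.children_user) \
                  + (after.children_system - before.children_system)
              print(f"wall {wall:.1f} s, child CPU {cpu:.1f} s")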

          • #6
            You might also want to try my program genozip. It usually compresses FASTQ 2x-5x better than gzip/pigz/xz/bzip2.
