  • #16
    Which sonicator did you use for your chromatin? Usually, it is very difficult to achieve such small inserts.

    The sequencing facility is certainly correct that the size distribution is not optimal for general purposes (small inserts generate redundant data). However, for ChIP-seq you only want to count reads and map them as precisely as possible. Shorter fragments give more precise data -- but only if you can rule out sample degradation.
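
    A quick way to see the precision point: if the bound position can sit anywhere inside a fragment, the error of a single fragment's midpoint is spread roughly uniformly over the fragment length, so shorter fragments pin the site down more tightly. Below is a minimal standard-library Python sketch; the site coordinate and the 100/300/600 bp lengths are arbitrary illustration values, not numbers from this thread.

import random
import statistics

SITE = 1_000_000       # true binding-site coordinate (arbitrary illustration value)
N_FRAGMENTS = 10_000   # fragments simulated per length

def midpoint_errors(frag_len, n=N_FRAGMENTS):
    """Distance between each fragment midpoint and SITE, for fragments
    that cover SITE at a uniformly random offset."""
    errors = []
    for _ in range(n):
        start = SITE - random.uniform(0, frag_len)  # fragment [start, start+frag_len] spans SITE
        errors.append(start + frag_len / 2 - SITE)  # midpoint minus true site
    return errors

for frag_len in (100, 300, 600):  # hypothetical insert sizes in bp
    spread = statistics.pstdev(midpoint_errors(frag_len))
    # theory: error uniform over +/- frag_len/2, so sd is about frag_len / sqrt(12)
    print(f"{frag_len:>3} bp fragments -> single-fragment midpoint sd ~ {spread:.0f} bp")

    The spread shrinks roughly as fragment length over sqrt(12), which is why short, tight fragments sharpen ChIP-seq peaks -- provided they are real inserts and not degradation products.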


    Originally posted by KB* View Post
    Dear all,

    Dear @arctan @GenoMax, please help :-) I am lost.....

    I got the answer from the sequencing facility. The libraries were failed because they contain "short fragments". They are not primer-dimers or adapter-hexamers. They take the shortest fragment, subtract 120 bp and get this:
    #15: 152 bp; #18: 87 bp; #19: 110 bp; etc.

    They wrote: "The fragment size is less than 300bp, so there is small fragment in “remarks”."

    I am not sure what they wanted to say.

    We were going to sequence (NovaSeq) with paired-end 150 bp reads. I guess the core is trying to say that I have fragments shorter than the reads. Is that really a problem?
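
    To make the core's arithmetic concrete: they subtract ~120 bp of adapter/index sequence from each library fragment to get the insert size, and an insert shorter than the 150 bp read length means the reads run through into adapter sequence (normally removed by trimming). A minimal Python sketch of that bookkeeping, assuming the ~120 bp figure and the 2x150 bp run mentioned above; the fragment sizes are back-calculated from the insert values the core reported.

ADAPTER_BP = 120   # adapter/index bases added to each insert during library prep
READ_LEN = 150     # per-read length of the planned 2x150 bp NovaSeq run

# insert sizes of the shortest fragments, as reported by the core (library -> bp)
shortest_inserts = {"#15": 152, "#18": 87, "#19": 110}

for lib, insert in shortest_inserts.items():
    fragment = insert + ADAPTER_BP            # back-calculate the library fragment size
    if insert >= READ_LEN:
        note = "both 150 bp reads stay within the insert"
    else:
        note = f"reads run {READ_LEN - insert} bp into adapter (removable by trimming)"
    print(f"{lib}: shortest fragment ~{fragment} bp, insert {insert} bp -> {note}")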



    • #17
      [Attached image: degraded lib2.jpg]
      Originally posted by luc View Post
      Which sonicator did you use for your chromatin?
      Usually, it is very difficult to achieve such small inserts.
      Hi @luc,

      Thank you very much for your question about sonication. It is a digression, but I am glad I can share my experience. I hope somebody might find it interesting/useful.

      Either because our main sonicator (a Diagenode Bioruptor) gradually stopped working, or because I work with "difficult" (hematopoietic) cells, or because my samples were over-crosslinked (I used glycine to stop the crosslinking; I have read that it is better to use Tris), sonication was failing to produce well-fragmented chromatin. Moreover, the immunoprecipitation was always pulling down the longer fragments. I hypothesised that if I over-sheared the chromatin I might be able to precipitate shorter fragments. Therefore, I switched to micrococcal nuclease (MNase). In the end I did not overdigest my chromatin (overdigested = only mono-nucleosomes left), but I found that with MNase I can prepare larger batches of more concentrated chromatin. The MNase-digested nuclei were "opened" by sonication on a Qsonica probe sonicator. It still was not easy to break the nuclei! I believe the probe sonication also shears some chromatin, but the small fragments are most probably products of the MNase digestion.

      Degradation
      I think degradation would be visible on an electropherogram. I am not sure about the core's report -- the trace is too flattened and there is no gel image. I believe the core does check for degradation.
      I think I might have caught one degrading library before. Look at the attached image -- there we see multiple randomly sized fragments and the electropherogram is serrated. However, I do not know for sure.
      The prepared libraries look different, so I hope there is no degradation.
      Last edited by KB*; 04-09-2019, 04:03 AM.



      • #18
        Originally posted by luc View Post
        Which sonicator did you use for your chromatin? Usually, it is very difficult to achieve such small inserts.

        The sequencing facility is certainly correct that the size distribution is not optimal for general purposes (small inserts generate redundant data). However, for ChIP-seq you only want to count reads and map them as precisely as possible. Shorter fragments give more precise data -- but only if you can rule out sample degradation.
        Not sure how the forum works... I tried to reply twice and the answer did not get through :/

        Hi, @luc

        I am glad I can share my experience on chromatin fragmentation. I hope somebody finds it interesting/useful.

        Either because our main sonicator (a Diagenode Bioruptor) gradually stopped working, or because I work with "difficult" (hematopoietic) cells, or because my samples were over-crosslinked (I used glycine to stop the crosslinking; I have read that Tris is better), sonication was failing to produce well-fragmented chromatin.
        Moreover, the immunoprecipitation was always pulling down the longer fragments. I thought that if I over-sheared the chromatin I might be able to precipitate shorter fragments. For this I started to use micrococcal nuclease (MNase). In the end I did not overdigest this chromatin (overdigested = only mono-nucleosomes left), but I learned that I can produce larger volumes of more concentrated chromatin. After digestion the nuclei were "opened up" with a Qsonica probe sonicator. I believe the probe sonication also shears some chromatin, but the short fragments should be due to the MNase treatment.

        Degradation
        I think degradation would be visible on an electropherogram. I am not sure about the core's report -- the trace is too flattened and there are no gel images. I believe the core does check for degradation.
        On the TapeStation I think I once caught a degrading library. It shows multiple randomly sized fragments and the electropherogram is "serrated" (see the attached image, degraded lib2.jpg). The prepared libraries look different, so hopefully there is no degradation.



        • #19
          Thank you very much, @arctan and @GenoMax, for your help.

          I can now only go with NovaSeq.

          I got some more information from the core (below). Could anybody please comment?

          1) The core fails libraries with fragments outside the 200-400 bp range;
          2) They say, "shorter fragments will not be sequenced with 150bp pair end read length, however, if you want to know the sequence of shorter fragments, we could provide you original data without demultiplexing, in that case, shorter fragments sequence will be included..."

          Q1. Do I need the data from reads of fragments shorter than 150 bp? I know people do 50 bp reads. Should I just discard these reads?

          Q2. What insert size should I have aimed for?
          I am using human-derived cell lines, so there is a human reference genome. But... the cell lines were derived from tumor tissue and also mutated during derivation. The cell lines are known to carry multiple chromosomal rearrangements, translocations, duplications, deletions, etc. Therefore, it looks like the input DNA should be sequenced with as few gaps as possible. Hence, 150-300 bp inserts (270-450 bp library fragments) are actually the best. Right now about half of my library fragments are shorter than the lower end of the optimal range, but roughly the other half is within it. I understand that shorter fragments cluster more efficiently.

          Q3. I have nothing to lose with my stock libraries -- all libraries except #15 are safely stored at the core. Should I try to tweak the library size, for example by size-selecting with a 0.7X (or similar) "left-side size selection" [R]? This will lead to some loss of optimally sized library fragments. Is it worth it?
          Will I be able to sequence all of the loaded libraries? If so, it does not matter if some reads are not informative.

          Alternatively, I can do a 0.95X AMPure bead cleanup to remove library fragments below ~100 bp. This should be "safe" in the sense of not decreasing library complexity.

          [R] https://ls.beckmancoulter.co.jp/file...SPRIselect.pdf
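
          One way to weigh the 0.7X left-side selection against the 0.95X cleanup before committing the stock libraries is to estimate what fraction of molecules each cutoff would remove. A minimal Python sketch under stated assumptions: the fragment-size distribution is hypothetical (loosely matching the "about half below the optimal range" description above), and the ~100 bp and ~250 bp cutoffs are rough illustration values, since actual SPRI cutoffs depend on the ratio, bead lot, and handling (see the SPRIselect guide linked above).

import random

random.seed(1)

# Hypothetical fragment-size distribution: about half the molecules near 230 bp and
# half near 330 bp, loosely mimicking "half below the optimal range" described above.
fragments = ([int(random.gauss(230, 30)) for _ in range(5000)]
             + [int(random.gauss(330, 40)) for _ in range(5000)])

def fraction_removed(sizes, cutoff_bp):
    """Fraction of molecules lost if everything below cutoff_bp is removed."""
    return sum(s < cutoff_bp for s in sizes) / len(sizes)

# The cutoffs below are assumptions for illustration only; real SPRI cutoffs depend
# on the exact ratio, beads, and handling.
for label, cutoff in [("0.95X cleanup, assumed ~100 bp cutoff", 100),
                      ("0.7X left-side selection, assumed ~250 bp cutoff", 250)]:
    print(f"{label}: removes ~{fraction_removed(fragments, cutoff):.0%} of molecules")

          With cutoffs in this range, the 0.7X selection discards a sizable share of molecules (including some in-range ones), while the 0.95X cleanup barely touches the library; the real numbers would have to come from the actual traces.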



          • #20
            Hi KB*,

            Thanks -- the nuclease treatment explains the shorter-than-usual fragments.
            You really got creative with the protocol. I hope it works out.

            I am not worried about the library degrading -- we have never encountered such a problem. Your traces indicated that perhaps your sample was degrading before the prep, but the MNase treatment explains that.
