  • ekkogecko
    replied
    Originally posted by RamakrishnanRS View Post
    Plus, even if it were MPI compatible, the PBS script/command may need to be modified with an option or directive to pick an MPI-enabled node.
    Thanks for the recommendation. So far I've been unable to determine whether it is truly MPI compatible. Running without MPI still fails on files of more than 20,000 sequences; I ran a couple of test trials without MPI overnight, to no avail.

  • RamakrishnanRS
    replied
    Plus, even if it were MPI compatible, the PBS script/command may need to be modified with an option or directive to pick an MPI-enabled node.
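    For illustration only (node properties and queue names are site-specific assumptions, so check with the cluster admins), such an option usually goes into the resource request at the top of the PBS script, for example:

        #PBS -l nodes=2:ppn=8:mpi      # ":mpi" is a hypothetical node property marking MPI-enabled nodes
        #PBS -l walltime=12:00:00
        module load openmpi            # module name is an assumption; use whatever MPI stack your site provides
        mpirun -np $PBS_NP <command>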

  • GenoMax
    replied
    @ekkogecko: Are you sure metavelvet is MPI compatible? Have you tried running it without MPI?

  • RamakrishnanRS
    replied
    It does look like a PID. I have not seen numeric identifiers for any other exposed entity in a cluster.
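    For what it's worth, one way to check (assuming you or the admins can reach the execution node) is to look for kernel OOM-killer messages mentioning that number and to pull the Torque record for the job; the job ID 77068 comes from the spool path in your error file:

        # on the execution node: was the process killed by the out-of-memory killer?
        dmesg | grep -iE 'out of memory|killed process 369997'

        # on the head node: Torque's view of the job (may need admin rights or recent logs)
        tracejob 77068
        qstat -f 77068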

  • ekkogecko
    replied
    I'm not sure what "MPI: could not run executable (case #3)" means, but I can send a message to the system admins for clarification.
    As far as storage limits go, there are hundreds of gigabytes free. I've been able to assemble much larger genomes on this system using Ray, so I expect this smaller dataset should be able to run with MetaVelvet (unless the requirements are wildly different).
    I'm not sure whether "369997" is a Torque PID; could you explain how to check that?

  • GenoMax
    replied
    Do you know what "MPI: could not run executable (case #3)" is referring to? Are you running into any other limits (storage, /tmp space)? Is "369997" a torque PID? If it is, can you ask the admins to see why that process was killed?
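    A quick way to check those space limits (the paths below are examples; substitute the filesystem that actually holds your output directory):

        df -h /tmp                # free space on the node's /tmp
        df -h /path/to/output     # free space where $fol lives
        quota -s                  # per-user quotas, if your site enforces them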

  • GenoMax
    replied
    You can cross-post. Many times, when an answer is posted in the other forum, the OP does not come back to this forum to indicate that a solution has been found. The link in post #2 is there as a reference for your Biostars post.

  • ekkogecko
    replied
    Originally posted by GenoMax View Post
    I've cross-posted to increase visibility; is that not allowed?

  • GenoMax
    replied
    Also on Biostars: https://www.biostars.org/p/150101/

  • ekkogecko
    started a topic MetaVelvet caught in an infinite loop?

    MetaVelvet caught in an infinite loop?

    Hello,

    I've been trying to run a MetaVelvet assembly of an environmental DNA sample (short, paired-end, FASTQ, Illumina reads) on a supercomputer cluster, and have been unable to generate output. I am using Velvet 1.2.10 and MetaVelvet-1.1.01. The program runs to completion (in a fraction of a second) on very small files (containing 20,000 sequences each) for both single- and paired-end reads, and produces a functional meta-velvet contigs file. Velveth and velvetg both run successfully on files containing 40,000 sequences each. However, when running meta-velvetg on those results, the program runs until it hits the wall-time limit.

    I am using the following sequence of commands:
    • mpirun -np $PBS_NP ~/bin/velvet-master/velveth $fol 31 -fastq -shortPaired $sequence1 $sequence2
    • mpirun -np $PBS_NP ~/bin/velvet-master/velvetg $fol -read_trkg yes -scaffolding no
    • mpirun -np $PBS_NP ~/MetaVelvet-1.1.01/meta-velvetg $fol -scaffolding no


    Here $sequence1 and $sequence2 point to the two ends of each read pair, and $fol points to the output directory. I have used a different output directory with each run. I typically run the process on 6 nodes, each with eight 2.66 GHz processors and access to some allocation of the 2.62 TB of shared memory. I have tried up to 24 of those nodes, and MetaVelvet has never run to completion, so it does not seem to be a matter of processing power.
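    For comparison, the same pipeline can also be launched without mpirun on a single node. A minimal PBS sketch (it assumes the Velvet binaries were compiled with OPENMP=1 so that OMP_NUM_THREADS is honoured; the resource request is a placeholder):

        #PBS -l nodes=1:ppn=8
        #PBS -l walltime=24:00:00
        cd $PBS_O_WORKDIR
        export OMP_NUM_THREADS=$PBS_NP    # ignored unless Velvet was built with OpenMP support

        ~/bin/velvet-master/velveth $fol 31 -fastq -shortPaired $sequence1 $sequence2
        ~/bin/velvet-master/velvetg $fol -read_trkg yes -scaffolding no
        ~/MetaVelvet-1.1.01/meta-velvetg $fol -scaffolding no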

    When I've killed the process, the error file contained the following:
    MPI: could not run executable (case #3)
    MPI: No details available, no log files found
    /opt/torque/4.2.9/spool/mom_priv/jobs/77068.hokieone.SC: line 34: 369997 Killed
    mpirun -np $PBS_NP /home/slvt16/bin/velvet-master/velveth $fol 31 -fastq $sequence1
    MPI: could not run executable (case #3)
    MPI: No details available, no log files found
    /opt/torque/4.2.9/spool/mom_priv/jobs/77068.hokieone.SC: line 35: 374736 Killed
    mpirun -np $PBS_NP /home/slvt16/bin/velvet-master/velvetg $fol -read_trkg yes -scaffolding no

    The output log shows that the program is hanging on the ------Scafolding------ step, even when run with the '-scaffolding no' flag.

    I would like to run MetaVelvet on paired end files of ~400,000 reads each, eventually. I'm running out of ideas for troubleshooting, so any help would be appreciated!

    EDIT: Cross-Posted to https://www.biostars.org/p/150101/
    Last edited by ekkogecko; 07-08-2015, 11:50 AM. Reason: Sourced Xpost
