  • NKAkers
    replied
    Hi Chandra,

    I've had success with Pathseq so far, but I just hit a snag on a sample. I let it run for over 40 hours, but it never progressed beyond Job 2. I'm wondering if my log file can provide any insight into what happened.

    Thank you!
    Code:
    rmr: cannot remove config: No such file or directory.
    rmr: cannot remove s3config: No such file or directory.
    rmr: cannot remove load: No such file or directory.
    Master data_loader
    12/03/10 20:56:46 WARN streaming.StreamJob: -jobconf option is deprecated, please use -D instead.
    packageJobJar: [/root/mapper_data_compsub.py, /mnt/hadoop/hadoop-unjar449601472672210086/] [] /tmp/streamjob6443148539614277804.jar tmpDir=null
    12/03/10 20:56:47 INFO mapred.FileInputFormat: Total input paths to process : 20
    12/03/10 20:56:47 INFO streaming.StreamJob: getLocalDirs(): [/mnt/hadoop/mapred/local]
    12/03/10 20:56:47 INFO streaming.StreamJob: Running job: job_201203102044_0001
    12/03/10 20:56:47 INFO streaming.StreamJob: To kill this job, run:
    12/03/10 20:56:47 INFO streaming.StreamJob: /usr/local/hadoop-0.19.0/bin/../bin/hadoop job  -Dmapred.job.tracker=hdfs://ip-10-34-46-200.ec2.internal:50002 -kill job_201203102044_0001
    12/03/10 20:56:48 INFO streaming.StreamJob: Tracking URL: http://ip-10-34-46-200.ec2.internal:50030/jobdetails.jsp?jobid=job_201203102044_0001
    12/03/10 20:56:49 INFO streaming.StreamJob:  map 0%  reduce 0%
    12/03/10 20:57:02 INFO streaming.StreamJob:  map 10%  reduce 0%
    12/03/10 20:57:03 INFO streaming.StreamJob:  map 30%  reduce 0%
    12/03/10 20:57:04 INFO streaming.StreamJob:  map 45%  reduce 0%
    12/03/10 20:57:06 INFO streaming.StreamJob:  map 55%  reduce 0%
    12/03/10 20:57:07 INFO streaming.StreamJob:  map 100%  reduce 0%
    12/03/10 22:05:24 INFO streaming.StreamJob: Job complete: job_201203102044_0001
    12/03/10 22:05:24 INFO streaming.StreamJob: Output: load
    
    real	68m38.462s
    user	0m3.772s
    sys	0m0.738s
    Master loader completed
    ERROR: Bucket 'ami-kippsample03job-stat' does not exist
    Bucket 's3://ami-kippsample03job-stat/' removed
    Bucket 's3://ami-kippsample03job-stat/' created
    ERROR: Bucket 'ami-kippsample03job-output' does not exist
    Bucket 's3://ami-kippsample03job-output/' removed
    Bucket 's3://ami-kippsample03job-output/' created
    File s3://ami-kippsample03reads/input1.local saved as '/usr/local/hadoop-0.19.0/input1.local' (97 bytes in 0.1 seconds, 1678.37 B/s)
    File s3://ami-kippsample03reads/input10.local saved as '/usr/local/hadoop-0.19.0/input10.local' (98 bytes in 0.1 seconds, 1848.92 B/s)
    File s3://ami-kippsample03reads/input11.local saved as '/usr/local/hadoop-0.19.0/input11.local' (98 bytes in 0.0 seconds, 2.34 kB/s)
    File s3://ami-kippsample03reads/input12.local saved as '/usr/local/hadoop-0.19.0/input12.local' (98 bytes in 0.1 seconds, 1576.68 B/s)
    File s3://ami-kippsample03reads/input13.local saved as '/usr/local/hadoop-0.19.0/input13.local' (98 bytes in 0.1 seconds, 741.02 B/s)
    File s3://ami-kippsample03reads/input14.local saved as '/usr/local/hadoop-0.19.0/input14.local' (98 bytes in 0.1 seconds, 1012.16 B/s)
    File s3://ami-kippsample03reads/input15.local saved as '/usr/local/hadoop-0.19.0/input15.local' (98 bytes in 0.1 seconds, 1903.09 B/s)
    File s3://ami-kippsample03reads/input16.local saved as '/usr/local/hadoop-0.19.0/input16.local' (98 bytes in 0.0 seconds, 1989.20 B/s)
    File s3://ami-kippsample03reads/input17.local saved as '/usr/local/hadoop-0.19.0/input17.local' (98 bytes in 0.0 seconds, 1997.18 B/s)
    File s3://ami-kippsample03reads/input18.local saved as '/usr/local/hadoop-0.19.0/input18.local' (98 bytes in 0.0 seconds, 2.15 kB/s)
    File s3://ami-kippsample03reads/input19.local saved as '/usr/local/hadoop-0.19.0/input19.local' (98 bytes in 0.0 seconds, 2.40 kB/s)
    File s3://ami-kippsample03reads/input2.local saved as '/usr/local/hadoop-0.19.0/input2.local' (97 bytes in 0.1 seconds, 1263.51 B/s)
    File s3://ami-kippsample03reads/input20.local saved as '/usr/local/hadoop-0.19.0/input20.local' (98 bytes in 0.1 seconds, 1340.78 B/s)
    File s3://ami-kippsample03reads/input21.local saved as '/usr/local/hadoop-0.19.0/input21.local' (98 bytes in 0.1 seconds, 1857.47 B/s)
    File s3://ami-kippsample03reads/input22.local saved as '/usr/local/hadoop-0.19.0/input22.local' (98 bytes in 0.1 seconds, 1100.16 B/s)
    File s3://ami-kippsample03reads/input23.local saved as '/usr/local/hadoop-0.19.0/input23.local' (98 bytes in 0.1 seconds, 1780.13 B/s)
    File s3://ami-kippsample03reads/input24.local saved as '/usr/local/hadoop-0.19.0/input24.local' (98 bytes in 0.1 seconds, 1927.05 B/s)
    File s3://ami-kippsample03reads/input25.local saved as '/usr/local/hadoop-0.19.0/input25.local' (98 bytes in 0.1 seconds, 1430.26 B/s)
    File s3://ami-kippsample03reads/input26.local saved as '/usr/local/hadoop-0.19.0/input26.local' (98 bytes in 0.1 seconds, 1714.27 B/s)
    File s3://ami-kippsample03reads/input27.local saved as '/usr/local/hadoop-0.19.0/input27.local' (98 bytes in 0.0 seconds, 2.18 kB/s)
    File s3://ami-kippsample03reads/input28.local saved as '/usr/local/hadoop-0.19.0/input28.local' (98 bytes in 0.0 seconds, 2.59 kB/s)
    File s3://ami-kippsample03reads/input29.local saved as '/usr/local/hadoop-0.19.0/input29.local' (98 bytes in 0.0 seconds, 2.21 kB/s)
    File s3://ami-kippsample03reads/input3.local saved as '/usr/local/hadoop-0.19.0/input3.local' (97 bytes in 0.0 seconds, 2.20 kB/s)
    File s3://ami-kippsample03reads/input30.local saved as '/usr/local/hadoop-0.19.0/input30.local' (98 bytes in 0.1 seconds, 1836.96 B/s)
    File s3://ami-kippsample03reads/input31.local saved as '/usr/local/hadoop-0.19.0/input31.local' (98 bytes in 0.1 seconds, 1788.36 B/s)
    File s3://ami-kippsample03reads/input32.local saved as '/usr/local/hadoop-0.19.0/input32.local' (98 bytes in 0.0 seconds, 2.07 kB/s)
    File s3://ami-kippsample03reads/input33.local saved as '/usr/local/hadoop-0.19.0/input33.local' (98 bytes in 0.1 seconds, 793.01 B/s)
    File s3://ami-kippsample03reads/input34.local saved as '/usr/local/hadoop-0.19.0/input34.local' (98 bytes in 0.0 seconds, 2.05 kB/s)
    File s3://ami-kippsample03reads/input35.local saved as '/usr/local/hadoop-0.19.0/input35.local' (98 bytes in 0.0 seconds, 2.04 kB/s)
    File s3://ami-kippsample03reads/input36.local saved as '/usr/local/hadoop-0.19.0/input36.local' (98 bytes in 0.0 seconds, 2004.88 B/s)
    File s3://ami-kippsample03reads/input37.local saved as '/usr/local/hadoop-0.19.0/input37.local' (98 bytes in 0.1 seconds, 1686.98 B/s)
    File s3://ami-kippsample03reads/input38.local saved as '/usr/local/hadoop-0.19.0/input38.local' (98 bytes in 0.1 seconds, 1425.54 B/s)
    File s3://ami-kippsample03reads/input39.local saved as '/usr/local/hadoop-0.19.0/input39.local' (98 bytes in 0.0 seconds, 3.35 kB/s)
    File s3://ami-kippsample03reads/input4.local saved as '/usr/local/hadoop-0.19.0/input4.local' (97 bytes in 0.0 seconds, 2.06 kB/s)
    File s3://ami-kippsample03reads/input40.local saved as '/usr/local/hadoop-0.19.0/input40.local' (98 bytes in 0.0 seconds, 2.07 kB/s)
    File s3://ami-kippsample03reads/input41.local saved as '/usr/local/hadoop-0.19.0/input41.local' (98 bytes in 0.0 seconds, 1960.43 B/s)
    File s3://ami-kippsample03reads/input42.local saved as '/usr/local/hadoop-0.19.0/input42.local' (98 bytes in 0.0 seconds, 2.09 kB/s)
    File s3://ami-kippsample03reads/input43.local saved as '/usr/local/hadoop-0.19.0/input43.local' (98 bytes in 0.1 seconds, 1370.40 B/s)
    File s3://ami-kippsample03reads/input44.local saved as '/usr/local/hadoop-0.19.0/input44.local' (98 bytes in 0.3 seconds, 358.54 B/s)
    File s3://ami-kippsample03reads/input45.local saved as '/usr/local/hadoop-0.19.0/input45.local' (98 bytes in 0.0 seconds, 2.85 kB/s)
    File s3://ami-kippsample03reads/input46.local saved as '/usr/local/hadoop-0.19.0/input46.local' (98 bytes in 0.0 seconds, 2013.97 B/s)
    File s3://ami-kippsample03reads/input47.local saved as '/usr/local/hadoop-0.19.0/input47.local' (98 bytes in 0.0 seconds, 3.13 kB/s)
    File s3://ami-kippsample03reads/input48.local saved as '/usr/local/hadoop-0.19.0/input48.local' (98 bytes in 0.1 seconds, 1585.68 B/s)
    File s3://ami-kippsample03reads/input49.local saved as '/usr/local/hadoop-0.19.0/input49.local' (98 bytes in 0.1 seconds, 1626.69 B/s)
    File s3://ami-kippsample03reads/input5.local saved as '/usr/local/hadoop-0.19.0/input5.local' (97 bytes in 0.0 seconds, 2.61 kB/s)
    File s3://ami-kippsample03reads/input50.local saved as '/usr/local/hadoop-0.19.0/input50.local' (98 bytes in 0.0 seconds, 2.09 kB/s)
    File s3://ami-kippsample03reads/input51.local saved as '/usr/local/hadoop-0.19.0/input51.local' (98 bytes in 0.1 seconds, 1158.50 B/s)
    File s3://ami-kippsample03reads/input52.local saved as '/usr/local/hadoop-0.19.0/input52.local' (98 bytes in 0.0 seconds, 2.41 kB/s)
    File s3://ami-kippsample03reads/input53.local saved as '/usr/local/hadoop-0.19.0/input53.local' (98 bytes in 0.0 seconds, 2018.21 B/s)
    File s3://ami-kippsample03reads/input54.local saved as '/usr/local/hadoop-0.19.0/input54.local' (98 bytes in 0.1 seconds, 1402.40 B/s)
    File s3://ami-kippsample03reads/input55.local saved as '/usr/local/hadoop-0.19.0/input55.local' (98 bytes in 0.0 seconds, 2.10 kB/s)
    File s3://ami-kippsample03reads/input56.local saved as '/usr/local/hadoop-0.19.0/input56.local' (98 bytes in 0.0 seconds, 2.37 kB/s)
    File s3://ami-kippsample03reads/input57.local saved as '/usr/local/hadoop-0.19.0/input57.local' (98 bytes in 0.0 seconds, 2.38 kB/s)
    File s3://ami-kippsample03reads/input58.local saved as '/usr/local/hadoop-0.19.0/input58.local' (98 bytes in 0.1 seconds, 1810.96 B/s)
    File s3://ami-kippsample03reads/input59.local saved as '/usr/local/hadoop-0.19.0/input59.local' (98 bytes in 0.0 seconds, 2.03 kB/s)
    File s3://ami-kippsample03reads/input6.local saved as '/usr/local/hadoop-0.19.0/input6.local' (97 bytes in 0.0 seconds, 2.09 kB/s)
    File s3://ami-kippsample03reads/input60.local saved as '/usr/local/hadoop-0.19.0/input60.local' (98 bytes in 0.0 seconds, 2.50 kB/s)
    File s3://ami-kippsample03reads/input7.local saved as '/usr/local/hadoop-0.19.0/input7.local' (97 bytes in 0.0 seconds, 2.15 kB/s)
    File s3://ami-kippsample03reads/input8.local saved as '/usr/local/hadoop-0.19.0/input8.local' (97 bytes in 0.1 seconds, 1397.15 B/s)
    File s3://ami-kippsample03reads/input9.local saved as '/usr/local/hadoop-0.19.0/input9.local' (97 bytes in 0.0 seconds, 2.47 kB/s)
    rmr: cannot remove test: No such file or directory.
    rmr: cannot remove maq: No such file or directory.
    Maq alignments + Duplicate remover
    12/03/10 22:05:39 WARN streaming.StreamJob: -jobconf option is deprecated, please use -D instead.
    packageJobJar: [/root/mapper_maqalignment.py, /root/Sam2Fastq.java, /root/FQone2Fastq.java, /root/Fastq2FQone.java, /root/removeduplicates_new.java, /root/MAQunmapped2FQone.java, /root/MAQunmapped2fastq.java, /mnt/hadoop/hadoop-unjar7647762000067869068/] [] /tmp/streamjob8505619080343769434.jar tmpDir=null
    12/03/10 22:05:40 INFO mapred.FileInputFormat: Total input paths to process : 60
    12/03/10 22:05:40 INFO streaming.StreamJob: getLocalDirs(): [/mnt/hadoop/mapred/local]
    12/03/10 22:05:40 INFO streaming.StreamJob: Running job: job_201203102044_0002
    12/03/10 22:05:40 INFO streaming.StreamJob: To kill this job, run:
    12/03/10 22:05:40 INFO streaming.StreamJob: /usr/local/hadoop-0.19.0/bin/../bin/hadoop job  -Dmapred.job.tracker=hdfs://ip-10-34-46-200.ec2.internal:50002 -kill job_201203102044_0002
    12/03/10 22:05:40 INFO streaming.StreamJob: Tracking URL: http://ip-10-34-46-200.ec2.internal:50030/jobdetails.jsp?jobid=job_201203102044_0002
    12/03/10 22:05:41 INFO streaming.StreamJob:  map 0%  reduce 0%
    12/03/10 22:06:00 INFO streaming.StreamJob:  map 3%  reduce 0%
    12/03/10 22:06:01 INFO streaming.StreamJob:  map 17%  reduce 0%
    12/03/10 22:06:02 INFO streaming.StreamJob:  map 25%  reduce 0%
    12/03/10 22:06:04 INFO streaming.StreamJob:  map 28%  reduce 0%
    12/03/10 22:06:05 INFO streaming.StreamJob:  map 33%  reduce 0%
    12/03/10 22:06:06 INFO streaming.StreamJob:  map 47%  reduce 0%
    12/03/10 22:06:07 INFO streaming.StreamJob:  map 55%  reduce 0%
    12/03/10 22:06:08 INFO streaming.StreamJob:  map 62%  reduce 0%
    12/03/10 22:06:09 INFO streaming.StreamJob:  map 65%  reduce 0%
    12/03/10 22:06:10 INFO streaming.StreamJob:  map 70%  reduce 0%
    12/03/10 22:06:11 INFO streaming.StreamJob:  map 83%  reduce 0%
    12/03/10 22:06:12 INFO streaming.StreamJob:  map 92%  reduce 0%
    12/03/10 22:06:14 INFO streaming.StreamJob:  map 95%  reduce 0%
    12/03/10 22:06:15 INFO streaming.StreamJob:  map 98%  reduce 0%
    12/03/10 22:06:16 INFO streaming.StreamJob:  map 100%  reduce 0%



  • pcs_murali
    replied
    Hi Rts,

    Thank you very much for posting. Please let me know if you can run it without any problems.

    Chandra



  • pcs_murali
    replied
    Hi,

    Pathseq doesn't analyze the data as paired-end data.

    The current version uses Maq as the step-1 aligner and then BLAST as the second-stage aligner.

    Please let me know if you have any questions.

    Thanks
    Chandra



  • NKAkers
    replied
    Hello fellow Pathseq users,

    First, thanks to everyone who posted so far, this thread has been a lifesaver!

    Does anyone know if Pathseq will analyze paired-end data in pairs? In other words, when the pipeline has a pile of unmapped non-human reads, does it align each read by itself, or does it do paired-end alignment with both ends? I suspect it does not do paired-end alignment, because as far as I can tell BLAST doesn't have that capability. However, I'm new to this, and a lot of what this pipeline does behind the scenes is a mystery to me.

    Thanks!



  • zehnderm
    replied
    PathSeq Test Run- Completing too quickly

    Question redacted!
    Last edited by zehnderm; 01-05-2012, 01:56 PM.



  • rts
    replied
    Accessing log file to monitor Pathseq_Launch.com progress

    Hi all,
    In this forum there are numerous posts that show the log file for the Pathseq_Launch.com run on the cluster.

    Here is how you access the log file:
    Go into ~/PathseqDirectory/bin
    ./hadoop-ec2 login test-cluster (test-cluster is whatever you named the cluster in jobs.config)
    Then you will see a log file in the root directory.
    You can view it with vim log.txt
    Last edited by rts; 12-15-2011, 02:04 PM. Reason: clarify



  • rts
    replied
    Figured it out. To advance past this on Ubuntu 11.04, you need to uninstall csh and install tcsh.



  • rts
    replied
    I'm wondering about changing n_para=$# at the top of the script to n_para=$status, as several references mention that changing variable=$? to variable=$status resolves the problem in some Linux environments. I am using Ubuntu 11.04 and am hesitant to change the script without running it by you.


    http://shell.deru.com/~gdt/unix2/lectures/13.shtml (bottom)
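    For what it's worth, `$#` and `$status` name different things in shell scripts: `$#` is the argument count, while `$status` is csh's spelling of the previous command's exit status (`$?` in sh/bash). A minimal POSIX-sh sketch of the distinction (the file names here are illustrative, not taken from the PathSeq scripts):

    ```shell
    #!/bin/sh
    # $# vs $? in POSIX sh; csh spells the exit status $status.
    set -- reads.fq1 reads.fq2   # simulate two positional arguments
    echo "n_para=$#"             # argument count -> n_para=2
    false                        # a command that exits non-zero
    echo "status=$?"             # exit status of the last command -> status=1
    ```

    Given that, swapping `$#` for `$status` would change the script's meaning (argument count vs. exit status), so checking with the author first does seem wise.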



  • rts
    replied
    Chandra,
    I figured out question one from above. I reran with javac and saw no apparent difference.
    I still cannot figure out the "Newline in variable name" error.

    It seems that no matter what I do with Preprocessed_Reads.com, I get that message.
    I checked Preprocessed_Reads.com and see no reference to that string.
    I would really appreciate any suggestions you may have.
    I am trying to read up on it: http://tinyurl.com/7lyaqpm

    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ sudo ./Preprocessed_Reads.com /media/f64f8101-5d47-4e9b-b138-977bf462383d/RNAseq_november/111102_TENNISON_0143_BC028AACXX_L2_CGATGT_1_pf.fastq_111102_TENNISON_0143_BC028AACXX_L2_CGATGT_2_pf.fastq_Qualityreads.fq1
    Newline in variable name.
    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ sudo ./Preprocessed_Reads.com
    Newline in variable name.
    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ ./Preprocessed_Reads.com
    Newline in variable name.



  • rts
    replied
    Rerunning QualityFilter.com on the data with javac installed.



  • rts
    replied
    Preprocessed_Reads.com

    Hi Chandra,
    May I run a couple more questions by you?

    1) During the ./QualityFilter.com run, javac was not found. I guess I have the user rather than the developer version of Java, but the fq1 files were complete and the error did not stop the program. Before installing it, I thought I would ask: do I need javac for the quality filter?

    Quality filter processing
    Read 1: /media/f64f8101-5d47-4e9b-b138-977bf462383d/RNAseq_november/111102_TENNISON_0143_BC028AACXX_L2_CGATGT_1_pf.fastq
    Running the Quality filtering with default paramters
    2
    Conversion from Fastq to FQ one
    javac: Command not found.
    175655756 reads in this dataset. Amount of time: 1368.345
    Here is the Quality reads /media/f64f8101-5d47-4e9b-b138-977bf462383d/RNAseq_november/111102_TENNISON_0143_BC028AACXX_L2_CGATGT_1_pf.fastq_111102_TENNISON_0143_BC028AACXX_L2_CGATGT_2_pf.fastq_Qualityreads.fq1
    javac: Command not found.
    Read length: 101bp

    175655756 reads in this dataset.
    175655756 reads have passed the quality filter.
    Amount of time: 1870.882

    2) Do you recognize this error?
    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ sudo ./Preprocessed_Reads.com 111102_TENNISON_0143_BC028AACXX_L2_CGATGT_1_pf.fastq_111102_TENNISON_0143_BC028AACXX_L2_CGATGT_2_pf.fastq_Qualityreads.fq1
    [sudo] password for shared:
    Newline in variable name.
    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$

    I checked to make sure jobs.config looks good, and it seems to:
    # Name of the Job (Sample: testjob)
    NAMEJOB=testjobs

    # S3 Bucket name for Reference Genomes in the Cloud (Sample: ami-ref)
    REFBUCKET=ami-refs

    # Local directory to download the reference genomes (Sample: test)
    LOCALREFDIR=test_refs

    # S3 Bucket name for Illumina sequencing reads in the Cloud (Sample: ami-testreads)
    READSBUCKET=ami-testreads

    My S3 account has ami-pathseqimage1000, ami-refs and ami-refs-tmp

    Thank you for helping us!



  • rts
    replied
    Got it with your suggestions...thanks Chandra!



  • pcs_murali
    replied
    Hi,

    I am trying to simulate the problem on my machine, and I don't see any issues.

    Could you please recheck whether you followed all the points listed in the installation manual?

    If you changed any of the scripts, please download Pathseq once again and then run it without changing the scripts.

    /mnt is in the cloud, not on your local machine. One thing I will suggest: when you download and extract Pathseq, please give R+W permission to all scripts and subfolders in the Pathseq directory.
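    The R+W advice can be applied recursively with chmod. A sketch, demonstrated on a throwaway directory so it runs anywhere; in practice you would point it at the real Pathseq directory (e.g. ~/PathSeq/Pathseq_cloud_V5.2):

    ```shell
    #!/bin/sh
    # Recursively grant read+write to user and group across a directory tree.
    # Capital X adds execute only on directories and already-executable files.
    PATHSEQ_DIR=$(mktemp -d)                 # stand-in for the real Pathseq directory
    mkdir -p "$PATHSEQ_DIR/Build_OwnAMI"
    touch "$PATHSEQ_DIR/create_ownAMI.com"
    chmod -R u+rwX,g+rwX "$PATHSEQ_DIR"
    ls -lR "$PATHSEQ_DIR"
    ```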

    Last part of log printed out:
    Uploaded hadoop-0.19.0-x86_64.part.82
    Uploaded hadoop-0.19.0-x86_64.part.83
    Uploaded hadoop-0.19.0-x86_64.part.84
    Uploaded hadoop-0.19.0-x86_64.part.85
    Uploaded hadoop-0.19.0-x86_64.part.86
    Uploading manifest ...
    Uploaded manifest.
    Bundle upload completed.
    Done
    IMAGE ami-298c4640
    Terminate with: ec2-terminate-instances i-35737856
    INSTANCE i-35737856 running shutting-down
    Creation completed
    rm: cannot remove `install_cross': No such file or directory
    rm: cannot remove `install_rlib': No such file or directory

    Thanks
    Chandra

    Originally posted by rts View Post
    Hi,
    Getting close....!
    But an error I can't seem to figure out....following create_ownAMI command.
    Error out of context:
    bash: /mnt/create-h-i-remote: Permission denied
    Client.InvalidManifest: HTTP 404 (Not Found) response for URL http://s3.amazonaws.com:80/ami-paths....manifest.xml: check your manifest path is correct and in the correct region.



  • rts
    replied
    Almost there w/ installation create_ownAMI

    Hi,
    Getting close....!
    But an error I can't seem to figure out....following create_ownAMI command.
    Error out of context:
    bash: /mnt/create-h-i-remote: Permission denied
    Client.InvalidManifest: HTTP 404 (Not Found) response for URL http://s3.amazonaws.com:80/ami-paths....manifest.xml: check your manifest path is correct and in the correct region.

    I have tried changing permission on /mnt (chmod 777), I have tried changing permissions on ./Build_OwnAMI/create-h-i-remote (chmod 600).
    No files are in the /mnt folder.
    I am trying to install w/o cross_match and repeatmasker for now

    Here is the full output....

    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ ./create_ownAMI.com
    Creation of your Amazon Machine Image
    /home/shared/PathSeq/Pathseq_cloud_V5.2/Build_OwnAMI
    /home/shared/PathSeq/Pathseq_cloud_V5.2/Build_OwnAMI
    ami-5fb67036
    Starting a AMI with ID ami-5fb67036.

    Instance is i-b3f0f9d0.
    Polling server status (ec2-describe-instances i-b3f0f9d0)
    ...........................The server is available at ec2-204-236-213-0.compute-1.amazonaws.com.
    Warning: Permanently added 'ec2-204-236-213-0.compute-1.amazonaws.com,204.236.213.0' (RSA) to the list of known hosts.
    Copying scripts.
    cluster.config 100% 1979 1.9KB/s 00:00
    env.sh 100% 3983 3.9KB/s 00:00
    create-h-i-remote 100% 3531 3.5KB/s 00:00
    ec2-run-user-data 100% 1763 1.7KB/s 00:00
    N
    pk-HURMYZGNG74DWNEPWDVDFAJTWSF7O575.pem 100% 924 0.9KB/s 00:00
    cert-HURMYZGNG74DWNEPWDVDFAJTWSF7O575.pem 100% 916 0.9KB/s 00:00
    bash: /mnt/create-h-i-remote: Permission denied
    Client.InvalidManifest: HTTP 404 (Not Found) response for URL http://s3.amazonaws.com:80/ami-paths....manifest.xml: check your manifest path is correct and in the correct region.
    Terminate with: ec2-terminate-instances i-b3f0f9d0
    INSTANCE i-b3f0f9d0 running shutting-down
    Creation completed
    rm: cannot remove `install_cross': No such file or directory
    rm: cannot remove `install_rlib': No such file or directory

    When I try with sudo, I get an EC2_HOME-not-set error from ec2-run-instances.

    shared@redwoodpath:~/PathSeq/Pathseq_cloud_V5.2$ sudo ./create_ownAMI.com
    Creation of your Amazon Machine Image
    /home/shared/PathSeq/Pathseq_cloud_V5.2/Build_OwnAMI
    /home/shared/PathSeq/Pathseq_cloud_V5.2/Build_OwnAMI
    ami-5fb67036
    Starting a AMI with ID ami-5fb67036.

    /home/shared/ec2-api-tools-1.5.0.0/bin/ec2-run-instances: line 9: EC2_HOME: EC2_HOME is not set
    Instance is .
    Polling server status (ec2-describe-instances )
    ./home/shared/ec2-api-tools-1.5.0.0/bin/ec2-describe-instances: line 9: EC2_HOME: EC2_HOME is not set
    ./home/shared/ec2-api-tools-1.5.0.0/bin/ec2-describe-instances: line 9: EC2_HOME: EC2_HOME is not set
    ^X./home/shared/ec2-api-tools-1.5.0.0/bin/ec2-describe-instances: line 9: EC2_HOME: EC2_HOME is not set

    I followed the instructions for my .bashrc and have also tried setting the EC2_HOME variable manually with export EC2_HOME=/home/shared/ec2-api-tools-1.5.0.0 before running sudo create_ownAMI.com
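    One likely culprit here (my observation, not confirmed in the thread): plain `sudo` resets the environment by default, so a variable exported in your shell never reaches the script. Passing the variable explicitly on the sudo command line, or using `sudo -E`, carries it through. A sketch using `env -i` to mimic sudo's cleaned environment:

    ```shell
    #!/bin/sh
    # Plain sudo typically resets the environment (env_reset), so an exported
    # EC2_HOME never reaches the script. `env -i` mimics that cleaned environment.
    export EC2_HOME=/home/shared/ec2-api-tools-1.5.0.0
    env -i sh -c 'echo "under reset: EC2_HOME=${EC2_HOME:-unset}"'
    # Passing the variable explicitly (cf. sudo EC2_HOME=... ./script, or sudo -E):
    env -i EC2_HOME="$EC2_HOME" sh -c 'echo "passed through: EC2_HOME=$EC2_HOME"'
    ```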

    Thanks for any guidance you may have.
    rts



  • pcs_murali
    replied
    Hi

    Thank you very much for your interest.

    You don't have to create the IMAGEBUCKET manually; create_ownAMI.com creates the S3 bucket for you.

    Please make the IMAGEBUCKET name as unique as possible.

    Please let me know if you have trouble with the Pathseq installation / runs.

    Thanks

    Originally posted by rts View Post
    Hi All,
    I am learning Amazon Cloud for the PathSeq pipeline.
    From the installation manual and reading I have gotten fairly far.
    But I am struggling with the IMAGEBUCKET variable in the cluster.config file.
    I have an S3 bucket named redwoodpath and an instance with an AMI ID, but I do not understand how to give an S3 bucket an "ami-name" (where the Hadoop AMI is stored) as shown in the installation manual.

    # The Amazon S3 bucket where the Hadoop AMI is stored.(Edit)
    # so you can store it in a bucket you own.
    IMAGEBUCKET=ami-pathseqimageXXXXXXXXXXXXXX

    I discussed this with another PathSeq user who is struggling and they too were unsure and said "I'm actually having the same problem with naming the IMAGEBUCKET. I just put random letters for the "X's" assuming that it just needs to be unique."

    Any help would be much appreciated.

    thanks,
    rts

