  • GenoMax
    replied
Do not forget to provision some way of doing regular backups of the data you are going to be working on.

    When you are using a local computer it is easy to overlook this important need (BTW: I second geneticform's recommendation in post #8 to see if you can hop onto some shared infrastructure, though that may not be an option in your case). A backup will potentially save you hours of frustration down the road.

    A backup solution can be as simple as an external drive (or two, if you want to be thorough) with regular backups via scripts/programs. It sounds like you are not going to deal with more than a couple of TB of data, so this should be sufficient.
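    Such a script can be a couple of lines of shell run from cron. This is a minimal sketch; the source and destination paths are hypothetical examples, not anything from this thread:

    ```shell
#!/bin/sh
# Mirror a working data directory to an external drive with rsync.
# SRC and DEST are example paths; adjust to your own layout.
SRC="$HOME/ngs_data/"          # trailing slash: copy contents, not the dir itself
DEST="/mnt/backup/ngs_data/"

# -a preserves permissions and timestamps; --delete makes DEST an exact
# mirror of SRC (drop it if the backup should retain deleted files).
rsync -a --delete "$SRC" "$DEST"
    ```

    Running it nightly from cron covers the "regular" part; a second drive rotated off-site covers the "thorough" part.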



  • adaptivegenome
    replied
    Originally posted by yumtaoist
    If you don't care about efficiency, a computer with four CPU cores is enough, but memory is very important: 16 GB of RAM is necessary. What's more, you can set aside another 16 GB of hard disk as swap; I think this will solve the crash problem.

    Actually, the hardware you need is tied closely to the software you are using; some software states the hardware it requires.

    Sometimes renting time at a supercomputer center is also a good idea. You will know your requirements after a period of trial.

    Additionally, hard disk space is also very important. Generally speaking, 1 TB is the baseline; 2 TB or more is better.
    I agree. People often jump into developing computer infrastructure when it might be more cost-effective and efficient to use someone else's existing system. If you develop a long-term need, you can always acquire the hardware at a later point.



  • Giorgio C
    replied
    Thank you, yumtaoist!



  • yumtaoist
    replied
    If you don't care about efficiency, a computer with four CPU cores is enough, but memory is very important: 16 GB of RAM is necessary. What's more, you can set aside another 16 GB of hard disk as swap; I think this will solve the crash problem.

    Actually, the hardware you need is tied closely to the software you are using; some software states the hardware it requires.

    Sometimes renting time at a supercomputer center is also a good idea. You will know your requirements after a period of trial.

    Additionally, hard disk space is also very important. Generally speaking, 1 TB is the baseline; 2 TB or more is better.
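    On Linux, that extra swap can be a plain file rather than a dedicated partition. A sketch (requires root; `/swapfile` is just a conventional example path):

    ```shell
#!/bin/sh
# Create and enable a 16 GB swap file on Linux (run as root).
fallocate -l 16G /swapfile   # or: dd if=/dev/zero of=/swapfile bs=1M count=16384
chmod 600 /swapfile          # swap files must not be world-readable
mkswap /swapfile             # format the file as swap space
swapon /swapfile             # enable it immediately
free -h                      # confirm the extra swap shows up
    ```

    Keep in mind that swap only prevents hard out-of-memory crashes; once a job starts paging heavily it can slow by orders of magnitude, so swap is a safety net, not a substitute for RAM.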
    Last edited by yumtaoist; 01-03-2012, 07:05 AM.



  • Giorgio C
    replied
    Hi guys, thank you all for your answers!

    In detail: I need this new computer to work only on 454 data, mostly in the context of transcriptomics, metagenomics, ChIP-seq, and amplicon sequencing. Right now I am working with a "home PC": an Intel dual core with 4 GB of RAM. It is fine, but in some cases it crashes with out-of-memory errors (for example, during a big local BLAST or a multiple alignment).

    Can you suggest some brands to look at with the best specifications for my purpose?

    Thanks as always for your precious help!

    Giorgio



  • severin
    replied
    Example System

    In my facility, we have a 7-node system where each node has 32 CPUs and 256 GB of RAM. For most problems, 16 cores and 128 GB of RAM are sufficient to work on the biological question of interest and get an answer in a couple of hours to a couple of days. For a genome assembly larger than bacterial, expect to use at least 32 cores and 128 GB of RAM for 2 days to run one assembly with one set of parameters; if you want to try multiple parameter sets, you will need twice as much.

    If you are getting this for a department with 4-5 people using it non-stop, you will want at least as many nodes as there are people, and that would be the lower end. This all assumes you are working on datasets the size of a two-condition comparison (stress vs. control, etc.) or perhaps a small time/tissue series of 5-10 points, all with 2-3 replicates. If you start thinking bigger (1000 Genomes Project, 10,000 vertebrate genomes, screening populations of individuals), you will need considerably more computational power.

    As previous posts have mentioned, you can handle most smaller problems on a smaller system; bacteria can be assembled on a laptop.



  • ulz_peter
    replied
    Hi Giorgio,

    Like Jordi said, it depends. If your focus is on targeted resequencing, there are some cheap solutions (maybe a quad core for parallel alignment and some 16 GB of RAM or so). Depending on the amount of data you are aiming to produce, you should consider long-term storage in the TB range.

    If you're planning to do de novo sequencing you'll need a lot more RAM (depending on the size of the genome you're going to sequence).

    So to give you concrete advice we would need some more details on what you're planning to do...



  • jordi
    replied
    Wow! I think you'll have plenty of answers to that question. As usual, one could say: it depends! Note that even NCBI had budget constraints in maintaining the SRA repository... I mean, if your department is planning a lot of 454 runs or multiple Illumina lanes, you should have a lot of free space; on the other hand, a single analysis might only take up around 6 GB.
    Moreover, if you are going to do parallel processing (e.g. grid BLAST: http://www.springerlink.com/content/2rlxkun1cxalxrjd/), you will need a Portable Batch System (PBS) to schedule jobs across processors.
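    For instance, each chunk of a split query file can be wrapped in a small PBS job script and submitted with `qsub`. A minimal sketch, in which the job name, resource numbers, and BLAST file names are all hypothetical:

    ```shell
#!/bin/sh
#PBS -N blast_chunk                  # job name
#PBS -l nodes=1:ppn=4                # one node, four processors
#PBS -l mem=8gb,walltime=12:00:00    # memory and wall-time limits

cd "$PBS_O_WORKDIR"                  # start in the directory qsub was run from
# Legacy NCBI blastall: -p program, -d database, -i query, -o output,
# -a number of processors (matching ppn above).
blastall -p blastn -d nt -i chunk_01.fasta -o chunk_01.out -a 4
    ```

    Submitting one such script per chunk lets the scheduler pack the jobs across whatever processors are free.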
    Regarding RAM needs, I've run the Newbler assembly program on a 4 GB RAM computer.
    Hope this helps.



  • Computer requirements for bioinformatics analysis

    Hi all,

    my department wants to buy a new computer for bioinformatics analysis of next-generation sequencing data. Can you suggest something good, along with the key requirements (CPU, RAM, etc.)?

    Thank you very much
