Storage Optimization


Solving Cornell’s Storage Problems

Posted in Storage by storageoptimization on February 5, 2009
Great news for Ocarina Networks today — we're working with the Cornell Center for Advanced Computing (CAC) and DataDirect Networks (DDN) to perform extensive data compression testing on a diverse array of research applications. The goal here is to address a problem they (and so many other research institutions) are facing — the exponential growth of online data and complex file types that have to be stored in such a way that they're readily accessible.

For Cornell, one of the biggest pain points was the storage of genomics files. Genomics research is accelerating at a dizzying rate, and it turns out to be a very image-intensive field. Here's one way to think about it: when J. Craig Venter and his team first sequenced the human genome, it took up 2GB of storage. Nowadays, a single molecular sequence can generate 100GB of data per HOUR. That's not to say that all of it needs to be stored every time, but you get the idea. All around the world, genome sequencers are spitting out files as researchers race to find cures for life-threatening diseases such as cancer, Alzheimer's, and heart disease. If they don't get storage under control, the pace of genomic research could actually slow down.
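To get a rough feel for why compression matters here, consider that DNA sequence data uses a tiny alphabet (A, C, G, T), so even a general-purpose compressor shrinks it substantially. The sketch below uses Python's standard-library gzip on a synthetic sequence — it's purely illustrative, not Ocarina's technology or Cornell's actual data, and real genomics formats (FASTQ, BAM, and the like) carry headers and quality scores that behave differently.

```python
import gzip
import random

# Synthetic stand-in for sequence data: 1 MB of random bases.
# A fixed seed keeps the run repeatable.
random.seed(42)
sequence = "".join(random.choice("ACGT") for _ in range(1_000_000))

raw = sequence.encode("ascii")
compressed = gzip.compress(raw)

# Four symbols need only 2 bits each, but ASCII spends 8 bits per base,
# so a general-purpose compressor recovers much of that waste.
print(f"raw:        {len(raw):>9,} bytes")
print(f"compressed: {len(compressed):>9,} bytes")
print(f"ratio:      {len(raw) / len(compressed):.1f}x")
```

Purpose-built, content-aware compressors — the kind being tested in this collaboration — can do considerably better than this generic baseline by exploiting the structure of each file type.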

For more on our work with Cornell and DataDirect Networks, go here. And if you're interested in the full download on Ocarina's work with life sciences storage, go to this page for access to our white paper, "Coping with the Explosion of Data in Life Sciences Research."
