I just watched the $6 billion experiment in which CERN is going to run its Large Hadron Collider (fondly called the LHC by its multinational teams of scientists). Then I saw this El Reg article:
Based at Oxford University, WTCHG is part of a world-wide collaborative programme which is researching the genetic causes of diabetes, obesity and other common ailments. Its own compute resources include a 120-node Linux cluster and 25 core servers, plus a Fibre Channel SAN.
It also has a server that now hosts four 21TB Nexsan SATABeast arrays, mirrored for a total of 42TB, said Dr Tim Bardsley, WTCHG's IT manager. It manages its storage using DataCore's SANmelody software, which allows users to access data via iSCSI and Fibre Channel.
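The capacity figure checks out with a quick back-of-the-envelope calculation: four 21TB arrays give 84TB raw, and mirroring keeps a redundant copy of everything, halving the usable total. A minimal sketch (decimal TB, as the article uses):

```python
# Usable capacity of WTCHG's mirrored Nexsan setup, per the figures above.
raw_tb = 4 * 21          # four SATABeast arrays, 21 TB each = 84 TB raw
usable_tb = raw_tb // 2  # mirroring stores every byte twice

print(usable_tb)         # 42
```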
and compared to that, CERN's LHC is expected to generate 2 petabytes of data per second! A bit more about the LHC:
Cern is building the Large Hadron Collider (LHC) particle accelerator to study subatomic matter and the forces that hold it together.
- The LHC is a 27km ring being built 100m beneath Switzerland and France. It will go live in 2008 (though it has been delayed to May 2008)
- The ring will be cooled to -271C, just 2C above absolute zero
- Particles will circle the LHC about 11,000 times per second
- The collision between particle beams will simulate what happened in the first millionth of a millionth of a second after the Big Bang
- Particles will collide 800 million times a second
- Two petabytes of data will be generated every second - a petabyte is 1,000,000,000,000,000 bytes
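Two of those figures are easy to sanity-check with simple arithmetic: how fast the LHC would fill WTCHG's entire 42TB of mirrored storage at 2PB/s, and the lap rate of near-light-speed protons around a 27km ring. A rough sketch using decimal units, as in the list above:

```python
# Rough sanity checks on the LHC figures quoted above (decimal units).
petabyte = 1_000_000_000_000_000   # bytes, per the definition in the list
terabyte = 1_000_000_000_000       # bytes

lhc_rate = 2 * petabyte            # bytes of data generated per second
wtchg_capacity = 42 * terabyte     # WTCHG's mirrored storage, in bytes

# Time for the LHC to fill all of WTCHG's storage:
seconds_to_fill = wtchg_capacity / lhc_rate
print(seconds_to_fill)             # 0.021 -- about 21 milliseconds

# Laps per second for protons at (nearly) the speed of light:
c = 299_792_458                    # speed of light, m/s
ring = 27_000                      # ring circumference, m
print(round(c / ring))             # roughly 11,000 laps per second
```

So the whole genetics cluster's storage would hold about 21 milliseconds' worth of raw LHC output, which puts the two scales in perspective.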
Read more about CERN and the LHC, and about the El Reg news on WTCHG.