The Biggest Big Data of All Time


Imagine generating so much data that each day’s worth would need the processing power of one hundred million PCs to handle, and would take two million years to play back on an iPod*.

The Square Kilometre Array (SKA), http://www.skatelescope.org/, is about to do exactly that, by scanning the deep universe for radio emissions carrying information about conditions in a sliver of time after the Big Bang. Once online, the radio telescope will collect 268 billion billion gigabytes of data each year, which it will filter down to about 7 billion billion gigabytes for analysis. To put that in perspective, the entire Internet’s data traffic for one year is 3.5 billion billion gigabytes: each year, the SKA will analyze double the amount of data carried by the Internet. Outside classified (“black box”) sectors, it is the biggest data and computation project on Earth, and it is a global effort. It has not yet been announced whether any of the data will be made available to the public’s hard drives for distributed processing of routine tasks, as was done for the SETI@home project.
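To see what those quoted figures imply, here is a quick back-of-the-envelope sketch. The volumes are simply the numbers cited above; the conversion to bytes is illustrative, not an official SKA specification:

```python
# Rough arithmetic on the data volumes quoted above (illustrative only).

GB = 10**9  # bytes per gigabyte (decimal convention)

collected_per_year = 268e18 * GB   # 268 billion billion GB collected
analysed_per_year  = 7e18 * GB     # ~7 billion billion GB kept for analysis
internet_per_year  = 3.5e18 * GB   # quoted annual Internet traffic

reduction_factor = collected_per_year / analysed_per_year
fraction_kept = analysed_per_year / collected_per_year
vs_internet = analysed_per_year / internet_per_year

print(f"Reduction factor (collected / analysed): {reduction_factor:.0f}x")
print(f"Fraction of raw data kept: {fraction_kept:.1%}")
print(f"Analysed data vs annual Internet traffic: {vs_internet:.1f}x")
```

Run as written, this works out to roughly a 38x reduction (about 2.6% of the raw stream kept), and an analysed volume twice the quoted annual Internet traffic, matching the comparison above.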

For data junkies, the excitement is not in the size of data gathered or analyzed, but in the filters and algorithms used to get from the former to the latter.
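As a toy illustration of that kind of filtering (a sketch only, not the SKA’s actual pipeline, which the post does not describe), a simple threshold filter that keeps only samples well above an estimated noise floor shows how a raw stream can shrink to a tiny analysable fraction:

```python
import numpy as np

def threshold_filter(samples: np.ndarray, sigma: float = 5.0) -> np.ndarray:
    """Keep only samples more than `sigma` standard deviations above the mean.

    A crude stand-in for the kind of filtering that reduces a huge raw
    stream to a much smaller set of candidate signals for analysis.
    """
    noise_floor = samples.mean() + sigma * samples.std()
    return samples[samples > noise_floor]

rng = np.random.default_rng(0)
raw = rng.normal(size=1_000_000)   # mostly noise
raw[::50_000] += 10.0              # a handful of injected "signals"

kept = threshold_filter(raw)
print(f"kept {kept.size} of {raw.size} samples "
      f"({kept.size / raw.size:.4%} of the raw stream)")
```

Here almost the entire million-sample stream is discarded and only the few injected spikes survive; the interesting design work is in choosing filters that throw away that much data without losing the signals you care about.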

*https://www.skatelescope.org/amazingfacts/

