
World Catches On to the Exaflood

Researchers Martin Hilbert and Priscila Lopez add to the growing literature on the data explosion (what we long ago termed the “exaflood”) with a study of analog and digital information storage, transmission, and computation from 1986 through 2007. They found that in 2007 the world was able to store 290 exabytes, communicate almost 2 zettabytes, and compute around 6.4 exa-instructions per second on general-purpose computers. The numbers have gotten much, much larger since then. Here’s the Science paper (subscription), which appears alongside an entire special issue, “Dealing With Data,” and here’s a graphic from the Washington Post:
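A quick back-of-the-envelope conversion puts those 2007 figures in perspective. This sketch assumes plain decimal SI prefixes (exa = 10^18, zetta = 10^21); the study's own accounting is more involved, so treat these as order-of-magnitude checks only:

```python
# Rough scale of the 2007 figures from the Hilbert-Lopez study.
# Decimal SI prefixes are an assumption here; the paper's methodology
# (compression-normalized bytes) is more involved.
EXA = 10**18
ZETTA = 10**21

stored_bytes = 290 * EXA        # 290 exabytes stored worldwide
communicated_bytes = 2 * ZETTA  # ~2 zettabytes communicated

# A zettabyte is a thousand exabytes, so transmission dwarfed storage:
ratio = communicated_bytes / stored_bytes
print(f"Communicated vs. stored: about {ratio:.1f}x")  # ~6.9x
```

In other words, by this rough measure the world transmitted roughly seven times more information in 2007 than it could store.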

(Thanks to @AdamThierer for flagging the WashPost article.)