Being very bored waiting all day for a
passenger, I elected to crunch some more numbers:
4TB drives are available now, so 1.25 billion drives would be required to store 5ZB.
Each drive is about 146 x 101.6 x 19 mm, or roughly 0.00028 cubic meters (about 282 cm³). Total volume would be around 352,000 cubic meters, roughly the volume of a large container ship.
Each drive costs about $180, using the low end of newegg prices, so about 225 billion dollars.
And all this doesn't include anything but bare drives. You need to mount them, power them, and cool them, which is no easy task at that scale. Figure somewhere in the 10 GW range to power the drives alone, plus the rest of the computing hardware required to actually try and use that much data. And then you have to dissipate all that heat.
Finding a supplier for over a billion hard drives may also prove a challenge, as this would be about two years' worth of global production spent entirely on these 4TB models. (Obviously the bulk of drives produced are not 4TB.)
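The arithmetic above can be sanity-checked with a few lines of Python. Note the per-drive power draw (8 W) and global production rate (~600 million drives/year) are my own ballpark assumptions, not figures from the post:

```python
# Back-of-the-envelope check of the 5 ZB storage figures.
ZB = 10**21  # bytes in a zettabyte
TB = 10**12  # bytes in a terabyte

drives = 5 * ZB / (4 * TB)                # number of 4TB drives to hold 5 ZB
drive_volume_m3 = 0.146 * 0.1016 * 0.019  # 146 x 101.6 x 19 mm, in meters
total_volume_m3 = drives * drive_volume_m3
total_cost_usd = drives * 180             # $180 per drive, low-end price

watts_per_drive = 8                       # assumed active draw per drive
total_power_gw = drives * watts_per_drive / 1e9

drives_per_year = 600e6                   # assumed global HDD production rate
years_of_production = drives / drives_per_year

print(f"{drives:.3g} drives")                 # ~1.25 billion
print(f"{total_volume_m3:,.0f} m^3")          # ~352,000 m^3
print(f"${total_cost_usd / 1e9:.0f} billion") # ~$225 billion
print(f"{total_power_gw:.0f} GW for drives alone")
print(f"{years_of_production:.1f} years of production")
```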
I'm going off memory here from a few years ago working with a data center.
First, there were two utility feeds into the data center, grid-isolated so that if one went down the other would be there to pick up the slack.
There were massive 2.7-megawatt generators for redundancy. Huge tanks of diesel were kept on site, with tank heaters to keep the water out of the fuel. Diesel is quite hygroscopic.
Cooling was accomplished with multiple 700-ton Trane centrifugal water chillers that sent 42-degree water out to 30-ton chilled-water-cooled AC units. A "ton" of cooling moves about 400 CFM of air.
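For reference, a "ton" of refrigeration is defined as 12,000 BTU/hr, and 400 CFM is the common rule-of-thumb airflow per ton. A quick sketch of what those units mean for the equipment described:

```python
# Rough cooling-capacity numbers for the gear described above.
BTU_PER_TON = 12_000  # one ton of refrigeration = 12,000 BTU/hr
CFM_PER_TON = 400     # rule-of-thumb airflow per ton of cooling

chiller_tons = 700    # each Trane centrifugal chiller
ac_tons = 30          # each chilled-water AC unit

chiller_btu_hr = chiller_tons * BTU_PER_TON  # heat removal per chiller
chiller_kw = chiller_btu_hr / 3412           # BTU/hr to kilowatts
ac_cfm = ac_tons * CFM_PER_TON               # airflow per AC unit

print(f"{chiller_btu_hr:,} BTU/hr ({chiller_kw:,.0f} kW) per chiller")
print(f"{ac_cfm:,} CFM per AC unit")
```

So each chiller was moving roughly 2.5 MW of heat, which lines up with the scale of the electrical load described below.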
Rows of servers were run off of PDUs, and everything was fed back to a massive switchgear that would perform an instant switchover in case of a power failure.
Each data center ran a continuous 2,000-amp load on the utility, at 480 V three-phase stepped down through transformers at the PDUs. The AC ran on 480 V three-phase power too. That's a massive amount of power; use the three-phase power formula to figure the watts.
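For a balanced three-phase load, watts come from P = √3 × V × I × PF rather than plain Ohm's law. Assuming unity power factor for simplicity (real loads would run somewhat lower):

```python
import math

# Three-phase power for the load described: 2,000 A continuous at 480 V.
v_line = 480        # line-to-line voltage (V)
current = 2_000     # continuous load (A)
power_factor = 1.0  # assumed for this sketch

watts = math.sqrt(3) * v_line * current * power_factor
print(f"{watts / 1e6:.2f} MW")  # ~1.66 MW per data center
```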
The UPS facility was massive, filled with racks and racks of $900 6-volt batteries good for a few minutes of emergency power. It also had to be cooled.
There was also co-location space, and a team of 24/7 techs working to keep each server up and running.
Those data farms I worked with can't hold a candle to the Utah facility.