There’s no denying that data demand is increasing at an alarming rate.
Then there’s Facebook, which currently stores in the neighborhood of 100 petabytes of data from photos and videos.
Worldwide, consumers gobbled up 322 petabytes of data per day in 2010. Today, that number has reached 499 petabytes.
And although demand is already testing current data storage limits, this new technology makes today’s capacity constraints seem minor…
On Tap for a Major Data Initiative
IBM (NYSE: IBM) has teamed up with Astron (the Netherlands Institute of Radio Astronomy) to develop a new telescope, dubbed the Square Kilometer Array (SKA).
This radio telescope is scheduled to go online in 2024, and it will be the largest and most sensitive in the world. IBM’s role over the next five years is to figure out how to handle the massive amount of data the telescope will produce: one exabyte, each day.
That’s equal to two times the world’s daily data consumption mentioned above.
Keeping all that data flowing smoothly isn’t the only problem, either. IBM also needs to figure out how to deal with the insane amount of energy the telescope’s data deluge would consume.
You see, some datacenter servers currently deliver two terabytes (0.002 petabytes) of memory. And energy is already a big issue. “Energy costs are the fastest-rising cost element in the data center portfolio… Removing a single x86 server from a data center will result in savings of more than $400 a year in energy costs alone,” according to Gartner analyst Rakesh Kumar.
So if removing 0.002 petabytes’ worth of server capacity saves $400 a year, one exabyte works out to $204.8 million!
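Here’s a quick sketch of how that figure falls out of the article’s own assumptions: 2 TB (0.002 PB) per server, $400 saved per server removed, and binary prefixes for the exabyte-to-petabyte step (1 EB = 1,024 PB), which is the only way the math lands on $204.8 million.

```python
# Back-of-the-envelope check of the $204.8 million savings figure.
# Assumptions from the article: one x86 server = 2 TB of memory,
# $400/year saved per server removed (Gartner), 1 PB = 1,000 TB,
# and 1 EB = 1,024 PB (binary prefix).

TB_PER_SERVER = 2
TB_PER_PB = 1_000
PB_PER_EB = 1_024
DOLLARS_SAVED_PER_SERVER = 400

# How many 2 TB servers would one exabyte of data require?
servers_per_exabyte = PB_PER_EB * TB_PER_PB // TB_PER_SERVER  # 512,000

# Annual energy savings if all of them could be eliminated
annual_savings = servers_per_exabyte * DOLLARS_SAVED_PER_SERVER

print(f"{servers_per_exabyte:,} servers -> ${annual_savings:,} per year")
# 512,000 servers -> $204,800,000 per year
```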
“We need to be very creative… If we were to use standard servers of today, we’d need millions of them. They would use so much space and so much energy that we couldn’t afford to build the machines, let alone operate them,” IBM researcher Ronald Luijten told Mashable.
Now, if IBM figures out a way to solve this dilemma, such data processing technology would likely be implemented around the world.
But either way, the company’s involvement in this project shows that – although it’s over 100 years old now – it’s still the go-to company for institutions in need of innovative data solutions.
IBM’s Broader Mission
Recall back in July, I wrote about IBM’s work with the Coriell Institute for Medical Research, which runs the world’s largest library of living genetic samples. Essentially, this “biobank” of 4.5 million human cell samples was having trouble analyzing data efficiently.
As I said then, “a person’s genome is made up of two million points of data – or 1.5 GB – about how much space an average movie download requires. And with 4.5 million samples, Coriell’s basically shouldering 225 times more data than Netflix’s (Nasdaq: NFLX) entire streaming movie library.”
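The quoted comparison can be sanity-checked with the same rough arithmetic: 4.5 million samples at about 1.5 GB apiece, with the “225 times” claim implying a Netflix library of roughly 30 TB at the time. (The 30 TB figure is derived here, not stated in the article.)

```python
# Hedged check of the Coriell vs. Netflix comparison, using the
# article's own numbers: ~1.5 GB per genome, 4.5 million samples,
# and a total said to be 225x Netflix's streaming library.

GB_PER_GENOME = 1.5
SAMPLES = 4_500_000
NETFLIX_MULTIPLE = 225

total_gb = GB_PER_GENOME * SAMPLES          # 6,750,000 GB, i.e. 6.75 PB
implied_netflix_gb = total_gb / NETFLIX_MULTIPLE  # ~30,000 GB (~30 TB)

print(f"Coriell total: {total_gb:,.0f} GB "
      f"(implied Netflix library: {implied_netflix_gb:,.0f} GB)")
# Coriell total: 6,750,000 GB (implied Netflix library: 30,000 GB)
```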
But with new software and hardware from IBM, Coriell is now able to successfully analyze and manage each sample.
Ultimately, IBM’s work with Coriell – and now with the SKA telescope – ties in with CEO Samuel Palmisano’s plans for transformation: “As IBM begins its second century, we continue a process of transformation, positioning the company to lead in the future.”
In other words, IBM is dead set on discovering ways to avoid becoming an obsolete dinosaur.
Clearly, the plan is working. For investors, too: the company’s stock has jumped 14% since I mentioned IBM’s work with Coriell.
And as long as the company stays ahead of the pack in this increasingly data-driven world, shares should continue their upward trajectory.