Next week I will be attending the next iteration of the XLDB workshop, the event series organized around eXtremely Large Database applications.
With hundreds of petabytes of information waiting to be captured and analyzed, new concepts are required to scale today's platforms by one to three orders of magnitude.
Today we 'limit' ourselves to 'only' capturing 40TB/day of incremental incoming data, but next-generation requirements demand much more detailed collection of event data. 100TB/day is already on the horizon, giving us just 10 days of history per petabyte. With deep historical requirements of 3+ years of information, data volume growth will outpace Moore's Law. And I would not be surprised if this time next year we are thinking about how to deal with 250TB/day — the writing is on the wall.
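The arithmetic behind those figures can be sketched in a few lines. This is a back-of-the-envelope illustration using decimal units (1 PB = 1000 TB, an assumption on my part), not a sizing exercise:

```python
# Back-of-the-envelope storage math for the ingest rates discussed above.
# Decimal units assumed: 1 PB = 1000 TB.

def days_of_history_per_pb(ingest_tb_per_day: float) -> float:
    """How many days of history fit into one petabyte at a given ingest rate."""
    return 1000 / ingest_tb_per_day

def total_volume_pb(ingest_tb_per_day: float, years: float) -> float:
    """Total raw volume accumulated over the retention window, in PB."""
    return ingest_tb_per_day * 365 * years / 1000

print(days_of_history_per_pb(100))  # 10.0 days per PB at 100 TB/day
print(total_volume_pb(100, 3))      # 109.5 PB for 3 years of retention
```

At 250TB/day the same three-year window grows to roughly 274 PB, which makes the scaling pressure plain.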
Improvements in processing power per CPU, along with advances in memory and storage, are not going to be able to make up for the exponential growth of data processing requirements.