The New York Times' Bits section has a nice article on Jeff Hawkins and his ideas on real-time big-data analysis. (NYT) Here's the crux:
“It only makes sense to look at old data if you think the world doesn’t change,” said Mr. Hawkins. “You don’t remember the specific muscles you just used to pick up a coffee cup, or all the words you heard this morning; you might remember some of the ideas.”
If no data needs to be saved long-term and real-time streams can supply all the information that is needed, a big part of the tech industry has a problem. Data storage companies like EMC and Hewlett-Packard thrive on storing massive amounts of data cheaply. Data analysis companies, including Microsoft, I.B.M., and SAS, fetch that data and crunch the history to find patterns. They and others rely on both traditional relational databases from Oracle and newer "unstructured" systems like Hadoop.
Much of this will be a relic within a few years, according to Mr. Hawkins. “Hadoop won’t go away, but it will manage a lot less stuff,” he said in an interview at Numenta’s headquarters in Redwood City, Calif. “Querying databases won’t matter as much, as people worry instead about millions of streams of real-time data.” In a sensor-rich world of data feeds, he is saying, we will model ourselves more closely on the constant change that is the real world.
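To make the contrast concrete, here is a minimal sketch of the streaming mindset Hawkins describes. This is not Numenta's actual HTM algorithm; it just illustrates the general idea with a standard online technique (Welford's algorithm): each reading is absorbed into a small running summary and then discarded, so anomalies can be flagged in real time without ever querying a database of history. The sensor readings are hypothetical.

```python
class StreamingAnomalyDetector:
    """Flags readings far from the running mean, using Welford's
    online algorithm so memory use stays constant no matter how
    many readings stream in -- no stored history required."""

    def __init__(self, threshold=3.0):
        self.n = 0           # readings seen so far
        self.mean = 0.0      # running mean
        self.m2 = 0.0        # running sum of squared deviations
        self.threshold = threshold

    def update(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        anomalous = False
        if self.n > 1:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) > self.threshold * std:
                anomalous = True
        # Welford's update: the summary absorbs x, then x is discarded.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous


# Hypothetical sensor feed: steady readings, then a spike.
detector = StreamingAnomalyDetector(threshold=3.0)
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 25.0]
flags = [detector.update(r) for r in readings]
# Only the final spike is flagged; none of the raw readings were stored.
```

The design choice is the point: the detector's state is three numbers, whether it has seen eight readings or eight billion, which is exactly the property that makes "millions of streams" tractable without a Hadoop-scale archive behind them.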
If true, this would be a paradigm shift.