Live from IBM Edge2015: powering the future
If Monday’s sessions were focused on building and powering the experiences users are demanding today, Tuesday was about looking ahead. IBM’s Ken Keverian kicked off the day by stating that the three biggest technological developments of the past 100 years have been the transistor, the Internet, and deep data analytics. And of those three, the last may be the one with the most untapped potential.
Many organizations still lack the means to extract meaningful insights from the mountains of data they are generating. It takes a combination of raw computing power and powerful analytics engines to parse data quickly, and Ken said that increasingly companies are opting to forgo the investment in hardware/software completely, and simply outsource the work. In other words, they’re realizing that the insight is what’s valuable, not the means to create it.
SVP of Research Arvind Krishna gave some insights into the future of hardware development. He explained that three forces are combining to create a computing bottleneck. First, data is being created faster than it can be processed. Second, physical disk I/O speeds have been declining as capacity increases. And third, processor clock speeds have stalled. In fact, just this week Gordon Moore said that in the next 5-10 years, the law that bears his name will no longer apply. If companies are to address that mountain of data, something else needs to happen besides cramming more transistors onto a silicon chip.
Arvind proposed several ways that this challenge could be met. The first is by moving away from the model where a processor accesses data in memory or long-term storage, and instead moving processing power to the places where the data resides. By putting the processing power closer to the data, at multiple levels, it will be possible to increase processing speed and power by a factor of 100 or more. He predicted that by the end of the current decade, it will be possible to compute at exaflop scale with relative ease, giving scientists and researchers the ability to model the human body down to the cellular level.
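To make the idea concrete, here is a toy sketch (not IBM's architecture; all class and function names are hypothetical) contrasting the traditional model, where every record crosses the bus to reach the processor, with the near-data model, where a small function travels to the storage node and only a tiny result travels back:

```python
from typing import Callable, List


class StorageNode:
    """Simulates a storage tier that can run a small function locally."""

    def __init__(self, records: List[int]):
        self.records = records

    def pull_all(self) -> List[int]:
        # Traditional model: ship every record over the "bus" to the CPU.
        return list(self.records)

    def push_compute(self, fn: Callable[[List[int]], int]) -> int:
        # Near-data model: ship the function to the node and return only
        # the small result, avoiding the bulk transfer entirely.
        return fn(self.records)


nodes = [StorageNode([1, 2, 3]), StorageNode([4, 5, 6])]

# Centralized: all six records cross the bottleneck before any work happens.
central_total = sum(sum(n.pull_all()) for n in nodes)

# Near-data: each node computes its own partial sum; only two integers travel.
near_data_total = sum(n.push_compute(sum) for n in nodes)

assert central_total == near_data_total == 21
```

The answers are identical; what changes is how much data has to move, which is exactly the bottleneck Arvind described.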
Another possibility is the eventual development of a quantum computer, which IBM has been working on for many years. While still more theory than reality, quantum computing would represent a massive leap forward into a new era of computing technology.
Moving beyond hardware entirely, it could be possible to teach computers how to “think,” reducing the need for human analysis and decision making. As an example, IBM researchers are teaching their Watson program how to reason. The hope is that Watson can eventually analyze data or problems, and then devise possible options or solutions from which human operators can select. The system could also be trained to identify anomalies in data and react to them independently.
It’s all a lot to think about, and it’s easy to ask why we would really need this much computing power. Tomorrow, I’ll discuss some specific use cases that the medical industry is working on that will put it to good use.