For a non-gambler, I seem to spend a lot of time in Las Vegas. But instead of hitting the roulette table, this week I’m immersed in IBM’s vision for the future of cognitive computing here at World of Watson. Why is this so exciting to me? The short answer is that this event is at the crossroads of some of my major passions: using data virtualization on mainframes to accelerate business transformation. The longer answer is that after years of people grumbling about ETL, we are finally about to put it in a boat, set the boat on fire, and send it off into the sea.
That may seem a little harsh – but only a little. That’s because ETL (Extract, Transform, and Load) is built on an outmoded paradigm: pull data off the source systems, transform it, and load it into a separate store before any analysis can even begin. Needless to say, this costs time and resources and makes real-time analytics impossible. And in a world dominated by on-demand computing, that’s just not good enough. Can you imagine waiting hours or even days to transfer money between your accounts from your phone?
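To see why that paradigm struggles, here is a minimal sketch of a toy ETL pipeline. The schemas, table names, and the trivial rounding “transform” are all hypothetical, and SQLite stands in for the real source and warehouse systems – the point is only that the analytics store holds a copy, and the copy goes stale the moment the source changes:

```python
import sqlite3

# Toy source system: an operational database (hypothetical schema).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

# Separate analytics store that ETL must keep in sync.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders_copy (id INTEGER, amount REAL)")

# Extract: pull every row off the source system.
rows = source.execute("SELECT id, amount FROM orders").fetchall()
# Transform: here, just a trivial rounding step.
rows = [(i, round(a)) for i, a in rows]
# Load: write the copy into the warehouse.
warehouse.executemany("INSERT INTO orders_copy VALUES (?, ?)", rows)

# The copy is already stale the moment the source takes a new order.
source.execute("INSERT INTO orders VALUES (3, 99.9)")
print(warehouse.execute("SELECT COUNT(*) FROM orders_copy").fetchone()[0])  # 2, not 3
```

Every query against the warehouse answers from that snapshot, so “real time” is only as real as the last batch run.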
All around me here in Vegas are examples of cognitive computing, and several of them rely on data being copied from multiple heterogeneous systems. It doesn’t have to be this way – organizations building cognitive systems can use data virtualization (DV) to access that data in place, without copying it. Luckily, I also found examples here that are doing exactly this.
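For contrast, here is a minimal sketch of the virtualized approach. Again the schemas and names (a “CRM” and an “ERP” source) are hypothetical, and SQLite’s `ATTACH DATABASE` stands in for a real federation layer – but the shape is the point: the query joins the two sources where they live, with no extract step and no copy, so the answer always reflects the live data:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
crm_path = os.path.join(tmp, "crm.db")
erp_path = os.path.join(tmp, "erp.db")

# Two independent systems of record (hypothetical schemas).
crm = sqlite3.connect(crm_path)
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme')")
crm.commit()
crm.close()

erp = sqlite3.connect(erp_path)
erp.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
erp.execute("INSERT INTO invoices VALUES (1, 125.0)")
erp.commit()
erp.close()

# "Virtualization" layer: attach both sources and join them in place.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE ? AS crm", (crm_path,))
hub.execute("ATTACH DATABASE ? AS erp", (erp_path,))
row = hub.execute(
    "SELECT c.name, i.amount FROM crm.customers c "
    "JOIN erp.invoices i ON i.customer_id = c.id"
).fetchone()
print(row)  # ('Acme', 125.0)
```

A real DV product does far more (optimization, security, heterogeneous sources), but the contrast with the batch-copy pipeline above is the essence of the argument.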
One of the things that I’m most proud of is that Rocket is playing a leading role in making DV a reality. We’re working with IBM on a number of products for the mainframe that use DV, and our footprint in this field is growing every day. So if you’re using ETL to feed your analytical systems, the day is coming when you’re going to have to make a choice: do you set the boat on fire at one end or both?