Live from SHARE 2017: Machine Learning and the Modern Mainframe – Why You Should Care


Throughout human history, people have used machines to automate their tasks. The industrial revolution, for example, was all about automation. During this period, engineers and industrialists designed and built new machines that could perform repetitive, often tedious tasks at unprecedented speed and scale. Another great period of innovation, the computer revolution of the 20th century, was likewise about automation: machines could perform calculations and operate on data at a speed humans couldn’t possibly match.

What’s the point of this history lesson? The bottom line is that automation powers innovation, and organizations that are serious about innovation should be proactive in figuring out how to automate tasks. One case in point is machine learning. You’ll hear the term defined in a variety of ways, but one way I like to think about it is as the process of automating automation. Computers already automate all kinds of tasks; the question driving machine learning is, “How might we teach computers to teach themselves so that they no longer need human supervision?”

The idea that machines might automate themselves is a pretty exciting proposition, and the business applications are numerous. But what’s the mainframe’s role in all this? The answer has to do with data. Organizations that rely on mainframes for mission-critical tasks (such as transaction processing) are constantly creating volumes of data that would have been inconceivable a decade ago. That data holds all kinds of useful signals that can be mined for analytics and machine learning. Unfortunately, mainframe data has often been challenging to access because of formatting and data transformation issues. Today, however, there are technologies that run on the mainframe itself and give data scientists real-time access to this gold mine of data without the usual roadblocks.
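To make that last point concrete, here is a minimal sketch of what feeding mainframe data into a machine learning workflow can look like once the access problem is solved. Everything in it is illustrative rather than tied to any particular mainframe product: it assumes the transaction records have already been landed as a CSV file, and the column names (amount, hour, is_fraud), the fraud-detection framing, and the choice of pandas and scikit-learn are all assumptions made for the sake of the example.

# Hypothetical example: train a simple fraud classifier on exported
# mainframe transaction records. Column names are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load the exported transaction records (assumed CSV extract).
df = pd.read_csv("transactions.csv")

# Features and label are hypothetical; substitute whatever fields
# your transaction records actually contain.
X = df[["amount", "hour"]]
y = df["is_fraud"]

# Hold out 20% of the records to evaluate the model on unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a standard off-the-shelf classifier.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("Holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

Note that the modeling step itself is the easy part; the real-time access technologies mentioned above matter because they replace the stale CSV extract in this sketch with current data straight from the system of record.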

So, don’t let those insights go to waste; start exploring how you might incorporate your mainframe data into your machine learning initiatives.

Azeem Ahmed
Azeem Ahmed joined Rocket Software in 2003 as a Software Engineer after graduating from UT Austin. Over the course of the past fifteen years, he has held many engineering and management roles. In his current role as Chief Technologist for Cloud, Azeem helps lead Rocket R&D around Hybrid & Public Cloud, is responsible for Rocket’s Cloud strategy, and leads the Emerging Tech Research Group, which focuses on Big Data Analytics & Cloud. Azeem was also instrumental in conceiving the Rocket.Build program, an internal hackathon for Rocket engineers, and he continues to direct that program every year.
