• May 17, 2022

Adapting to the new normal: Remote work and the mainframe

Work from home has become an essential business proposition, and not just for today; it’s something we’re going to live with for many years. As a result, we need to rethink how we connect to the mainframe.

Accessing the mainframe

First things first, a connection is necessary. One option is the Rocket BlueZone Web 3270 emulator. As a web-based offering, no special software needs to be installed on the end-user device. It can run on a corporate PC or an end user’s home device to make the connection, and that connection is securely encrypted back to the hosting server. A limited-time free offer is available.

Rocket BlueZone Web

Securing the mainframe
But will it be who you expect accessing your systems? Within your work environment, you might have a private network and badge access to your buildings, so you feel comfortable about who is accessing your mainframe IT systems. It’s an entirely different world when those accesses come from an unsecured network or someone’s home computing device. You can install a virtual private network between the work-from-home computer and your mainframe, but what you don’t know is who is actually using that “work from home” computer. One way to ensure you are getting the right person is to deploy the IBM Z Multi-Factor Authentication product. It considers something you know, something you have and something you are during the authentication of that user.
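To illustrate the idea of combining factors (this is a minimal sketch, not how IBM Z MFA works internally; all names and parameters here are illustrative), the Python below only grants access when two independent factors both pass: a password (something you know) and an RFC 6238 time-based one-time code from a token or phone app (something you have):

```python
import hashlib
import hmac
import struct
import time

def totp(secret, t=None, step=30, digits=6):
    """RFC 6238 time-based one-time password -- the "something you have"."""
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def authenticate(password, otp, *, stored_hash, salt, secret, now=None):
    """Grant access only when BOTH factors check out."""
    knows = hmac.compare_digest(                     # something you know
        hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000),
        stored_hash)
    has = hmac.compare_digest(totp(secret, t=now), otp)  # something you have
    return knows and has
```

A real deployment would add “something you are” (biometrics) and keep secrets in hardware; the point of the sketch is simply that a stolen password alone no longer opens the door.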

You also want to ensure that all network connections are encrypted. z/OS Encryption Readiness Technology (zERT) is a free function within the z/OS Communications Server. It will tell you about any ports that might be left unencrypted in your network. Lock down your network from outside hackers looking for weaknesses and opportunities to breach the system.
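zERT draws on SMF records cut by the TCP/IP stack; as a rough sketch of the reporting idea only (the record shape below is an assumption, not zERT’s actual format), this takes observed connections tagged with their cryptographic protection and flags any port that carried even one unprotected session:

```python
from collections import defaultdict

def unprotected_ports(records):
    """records: iterable of (local_port, protection) pairs, where protection
    is a label such as "TLS1.3", "SSH" or "None" (no recognized encryption).
    Returns the sorted list of ports with at least one unprotected connection."""
    seen = defaultdict(set)
    for port, protection in records:
        seen[port].add(protection)
    return sorted(port for port, kinds in seen.items() if "None" in kinds)
```

Note that a port that is “mostly TLS” still gets flagged if a single cleartext session slipped through, which is exactly the kind of weakness an outside attacker hunts for.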

Assuming you know who is accessing your systems, you need to ensure that they can only do what they are allowed to do. The Guardium S-TAP offerings for IMS, Db2 and data sets will alert you, based on rules you’ve set up, to people who are trying to access data they are not authorized to see.

Who’s accessing the mainframe?

A wide variety of people might access the mainframe, and each of these personas will touch critical data. As a result, a business needs to know who they are and what data has been accessed. The IBM Z MFA and Guardium alert capabilities should apply to all users of the mainframe, not just the privileged ones.

Monitoring and automating systems
Many mainframe businesses have seen a phenomenal increase in transaction volumes for retail sales, shipping and cashless financial transactions. Businesses need to ensure that their systems remain capable of running in a highly available, high-volume transaction processing environment, reacting automatically, or autonomically, to changes in system demand.

They can improve database maintenance to ensure space is available in all critical databases in order to stay highly available. Transaction and database monitoring can quickly address alerts, aggregating information across middleware. New streaming services that visualize the data provide more creative views than traditional 3270-based green-screen solutions do today.

Some of the products that help in this area are the IBM IMS and Db2 database solution packs, IMS Connect Extensions, the IBM performance analysis tools for Db2, IMS and CICS, the Transaction Analysis Workbench and, finally, the suite of IBM OMEGAMON offerings.

Earlier, I mentioned streaming. Using JSON formats, products stream output to open-source tools like Elastic, Grafana, Kibana and Splunk. This provides valuable graphics in a web browser, making it more intuitive to understand system issues and to identify problems faster and more easily without specialized skills. The difficulty with deploying these tools is that a business has to stand up a container on another platform. With z/OS Container Extensions (zCX), available in z/OS 2.4, these open-source containers built for Linux on Z can run within z/OS. This simplifies deployment and skill requirements.
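As a toy illustration of that JSON streaming (the field names below are made up for the example, not any product’s actual schema), monitoring events can be flattened into newline-delimited JSON, the bulk format that tools like Elastic and Splunk commonly ingest:

```python
import json

def to_ndjson(events):
    """Render monitoring events as newline-delimited JSON, one record per line."""
    lines = []
    for e in events:
        record = {
            "@timestamp": e["ts"],        # ISO-8601 string from the collector
            "subsystem": e["subsystem"],  # e.g. "CICS", "IMS", "Db2"
            "metric": e["metric"],
            "value": e["value"],
        }
        lines.append(json.dumps(record, separators=(",", ":")))
    return "\n".join(lines)
```

Once the records land in the visualization tool, the dashboards and alerting come for free; the mainframe side only has to keep emitting well-formed lines.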

Is your mainframe truly hybrid?

By definition, the mainframe will be hybrid because it must leverage a PC, mobile device, ATM, point-of-sale terminal or other device as its front-end user interface. But how does it work with other servers? As IT professionals, we tend to know a lot about the current mainframe environment.

But there’s a sea of other servers that could be deployed within a cloud environment from Amazon, Microsoft Azure, Google or IBM’s cloud. And in many businesses, these cloud solutions are managed independently from the mainframe. And likely, the mainframe is what it is and has been for many years, while the cloud is capturing the opportunities for new business and new applications.

Typical cloud environment

Let’s look deeper into that cloud environment. The first thing is hosted data; no development environment exists without its data. That data needs to come from someplace. In fact, it could come from a collection of different data sources and be known as a Data Lake within the cloud.

As businesses need to develop new applications, they might have a DevOps environment that will clone some of that data to build or modify a next generation application.

And finally, the business will have an environment to run those applications in a production environment and access the data that they’re hosting in their Lake.

A Mainframe Island?

Now let’s look a little deeper at the mainframe. A typical business has multiple logical partitions within a single mainframe, plus a mainframe in another location for disaster recovery. They spread the work across these environments for a highly available execution environment.

But they also throw data “over the wall” into that cloud Data Lake. It could be as simple as the File Transfer Protocol (FTP) or a data replication offering.

Making the mainframe truly hybrid

If you look at these two islands of computing, the business may claim they are hybrid, but the reality is they are independent of each other. In this case, we need to go fishing in the cloud environment to look for opportunities to build a truly hybrid IT environment.

Make Db2 available at the speed of developers

A good place to start is to set up a hybrid DevOps environment. IBM has its own tools, as well as open source tools, that enable z/OS to be a full participant within a hybrid DevOps environment. The IBM Db2 DevOps Experience enables a business to make Db2 available at the speed of the developers.

Cloud-hosted front ends to the IBM Z

There’s no reason that PCs, cellular and other devices can’t continue to be the system of engagement or launch point to applications hosted on the IBM Z. In this example, it’s an analytics application using the database capabilities available on the IBM Z but accessed from a cloud environment.

Understanding the end user needs and the new engagement interface that a business wants to deploy is critical to understanding what parts are necessary to make the mainframe a critical part of a hybrid deployment strategy.


That’s a quick tour of some new opportunities a business can pursue with its remote workforce:

  1. Make the connection to the IBM Z.
  2. Ensure it remains the most secure deployment environment.
  3. Autonomically manage it and reduce the effort necessary to maintain a highly available environment.
  4. Create an IT infrastructure that leverages the best platforms versus making it an either-or decision on specific platforms.

Start leveraging new ways to improve your remote work environment!



Jim Porell

I am a Solutions Architect at Rocket Software, focusing on pre- and post-sales technical assistance for Rocket-developed products from IBM. Prior to joining Rocket, I was an independent consultant and retired IBM Distinguished Engineer. I held various roles as Chief Architect of IBM’s mainframe software and led business development and marketing of security and application development for the mainframe. My last IBM role was Chief Business Architect for Federal Sales. I held a TS/SCI clearance for the US government, was a member of the US Secret Service Electronic Crimes Taskforce in Chicago and co-authored several security books. I’ve done cybersecurity forensic work at a number of retail, financial and government agencies and created a methodology for interviewing customers to avoid security breaches at large enterprises. I have over 40 years of experience working in information technology.

