What is mainframe modernization? It depends on who you ask.
The mainframe is an interesting beast. In fact, just as the term mainframe modernization means different things to different people, so does the term mainframe by itself. The core of the mainframe includes the operating system, applications, and access interfaces.
Let’s take a standard mainframe environment. The mainframe is running CICS or IMS. Users access the applications using a terminal emulator, also referred to as a “green screen.” If you talk to a system architect, they might use the term mainframe modernization to mean converting back-end applications to be more modern. If you talk to an end user, they will tell you it means an easy-to-use graphical interface. Those are two completely different visions.
Most large organizations’ IT departments take the architect’s view — modernizing the back-end mainframe. This could consist of converting COBOL code to Java, or running COBOL applications under a Linux environment, or converting the database from IMS to DB2 to make it easier to integrate with external applications. All of this takes time, possibly years, and lots of money. So, you have to ask the question: “What’s the goal?”
Hide in plain sight
If the goal is to reduce costs, you should look at the back end to find ways to modernize and reduce MIPS costs. However, if your goal is to improve productivity, usability, and overall cost of operations, you should consider updating the front end (user interface) first. To do that, you need a layer of abstraction. That’s a software layer that understands how to communicate with two different layers while hiding them from each other. As an example, you could have a modern Web-based user interface that communicates with legacy “green-screen” applications. The UI doesn’t know it’s communicating with “green screens,” and the “green screens” don’t know they’re being converted to a Web interface.
Here’s how you might implement this solution. You want to first design your user interface without regard for the mainframe’s communications protocol or architecture. Simply design your user interface using standard UX practices. Now, you need a layer of abstraction — a layer that knows how to talk to the mainframe, using the standard native protocol, and communicate with the new user interface. If you decide, as most companies do, to create a Web-based user interface, you would use the standard Web architecture used by all your internal Web applications. Maybe you use J2EE or Microsoft .NET, or maybe your standard is Struts or Ruby on Rails. Whatever it is, just design your Web application the same way you would if it were using a standard relational database.
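The abstraction-layer idea can be sketched in a few lines of code. This is an illustrative Python sketch, not OpenConnect code; the interface, class, and method names are all made up to show the shape of the design.

```python
from abc import ABC, abstractmethod

class ClaimsBackend(ABC):
    """Abstraction layer: the UI is written against this interface only."""
    @abstractmethod
    def get_claim(self, claim_id: str) -> dict: ...

class GreenScreenBackend(ClaimsBackend):
    """Talks to the legacy 3270 application (hidden from the UI)."""
    def get_claim(self, claim_id: str) -> dict:
        # A real implementation would drive TN3270 navigation here;
        # canned data is returned just to show the shape of the result.
        return {"claim_id": claim_id, "status": "OPEN", "source": "3270"}

class ModernBackend(ClaimsBackend):
    """Talks to a modernized service once the back end is rewritten."""
    def get_claim(self, claim_id: str) -> dict:
        return {"claim_id": claim_id, "status": "OPEN", "source": "REST"}

def render_claim(backend: ClaimsBackend, claim_id: str) -> str:
    # The UI never knows which backend it is talking to.
    claim = backend.get_claim(claim_id)
    return f"Claim {claim['claim_id']}: {claim['status']}"

print(render_claim(GreenScreenBackend(), "C-1001"))  # Claim C-1001: OPEN
print(render_claim(ModernBackend(), "C-1001"))       # same output, new backend
```

Because `render_claim` depends only on the interface, the same UI code works unchanged whether it is fed by screen scraping today or a modernized service tomorrow.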
Can we talk, mainframe?
Now you need a way to communicate with the mainframe “green screens.” There are several options available. OpenConnect has one called ConnectiQ. It’s a server-based platform that communicates directly with the mainframe, using the TN3270 protocol. It consumes the raw 3270 packets and interprets them in memory. It then converts that information into standard Web services that can be called and consumed by your Web application. The user interface doesn’t know it’s talking to an old legacy mainframe application, and the mainframe application doesn’t know it’s communicating with a modern Web-based user interface. In other words, you have a layer of abstraction; get it? In this case, the layer of abstraction is ConnectiQ. The Web application simply communicates with ConnectiQ, using standard Web services.
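To make the “consumes the raw 3270 packets and interprets them in memory” step concrete, here is a simplified sketch of screen scraping. This is not how ConnectiQ is actually implemented — its internals aren’t public — and the field positions are invented, but the idea is the same: fields on a 3270 screen live at fixed row/column positions, so a layer of abstraction can map them to named values and hand them to a Web service as structured data.

```python
SCREEN_WIDTH = 80  # a classic 3270 screen is 24 rows of 80 columns

# (row, col, length) for each field -- positions are hypothetical
FIELD_MAP = {
    "account": (2, 10, 8),
    "name":    (3, 10, 20),
    "balance": (4, 10, 10),
}

def extract_fields(screen: str, field_map: dict) -> dict:
    """Turn a raw screen buffer into a dict suitable for a JSON/Web-service payload."""
    rows = [screen[i:i + SCREEN_WIDTH] for i in range(0, len(screen), SCREEN_WIDTH)]
    return {
        name: rows[row][col:col + length].strip()
        for name, (row, col, length) in field_map.items()
    }

# Build a fake 24x80 screen buffer with values at the mapped positions.
buf = [" " * SCREEN_WIDTH] * 24
buf[2] = "ACCT:     12345678".ljust(SCREEN_WIDTH)
buf[3] = "NAME:     JANE DOE".ljust(SCREEN_WIDTH)
buf[4] = "BAL:      102.50".ljust(SCREEN_WIDTH)
screen = "".join(buf)

print(extract_fields(screen, FIELD_MAP))
# {'account': '12345678', 'name': 'JANE DOE', 'balance': '102.50'}
```

The real work in a product like ConnectiQ is handling the TN3270 protocol, navigation, and error cases, but the payoff is exactly this: screen positions in, named fields out.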
Why is the layer of abstraction so important? Because, once you have the new UI in place, you can start working on modernizing the back-end applications, databases, and network interfaces. Again, that could take years, but you already have a nice, intuitive, easy-to-use, easy-to-learn user interface which should be saving money; so maybe the urgency isn’t quite as great.
The key is: once the back end is modernized, you simply change your layer of abstraction so it now communicates with the new back end. Users don’t see any changes — other than, perhaps, better performance. In the example above, you would remove ConnectiQ and use different Web service calls in your Web applications. This approach also allows you to modernize a little at a time. You can move one application, or function, at a time and have a mixture of Web service calls to ConnectiQ and the new mainframe interface. Again, the users don’t know the difference. They’re just using a nice, new, modern interface.
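That phased, function-by-function migration can be sketched as a simple routing table inside the abstraction layer. The function and handler names below are hypothetical; the point is that flipping one table entry moves one business function to the new back end without touching the UI.

```python
# Hypothetical backends for one business function.
def legacy_handler(claim_id: str) -> dict:
    # Would call the screen-scraping Web service (e.g. ConnectiQ) here.
    return {"claim_id": claim_id, "via": "legacy"}

def modern_handler(claim_id: str) -> dict:
    # Would call the modernized back-end service here.
    return {"claim_id": claim_id, "via": "modern"}

# Migration table: flip entries one at a time as functions are modernized.
ROUTES = {
    "lookup_claim": modern_handler,   # already migrated
    "submit_claim": legacy_handler,   # still on the mainframe path
}

def call(function_name: str, *args) -> dict:
    """The UI calls by function name; the abstraction layer picks the backend."""
    return ROUTES[function_name](*args)

print(call("lookup_claim", "C-1001")["via"])  # modern
print(call("submit_claim", "C-1001")["via"])  # legacy
```

A mixture like this can run for as long as the migration takes; the users only ever see the one new interface.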
The benefits of this approach are numerous — including improved user productivity, simpler and faster training to bring people onboard more quickly, better and more timely customer support, and access from any Web browser so work-from-home users don’t need any special software.
Let’s also talk about a very serious problem related to mainframe terminal emulators: macros!
Almost all terminal emulators provide a macro interface that allows a user to press a “Record” key and record a set of keystrokes. They can then play back the recording at any time. For example, a common problem is copying data from one screen onto another one. The macro would be recorded to navigate to the first screen, copy specific fields, navigate back to the original screen, and paste the data into that screen.
This option saves a lot of time, and users can create their own macros. However, that’s actually the problem. You end up with hundreds or even thousands of user macros running. Most aren’t written correctly. Macros are not “intelligent,” and have no way to know if they’re working correctly. If a user hits the “Play” button while on the wrong screen, the macro is still going to try to run. In some situations, this can actually cause harm.
So, why am I talking about macros in a mainframe modernization blog post? Simple: if you modernize your user interface, you can provide the same benefits that macros provide without the risk of users creating their own, possibly harmful macros. You now have control.
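The difference between a blind keystroke macro and a centralized automation can be shown in a short sketch. This is illustrative only — the screen and action names are invented — but it captures the point: each server-side automation declares which screen it expects, and it refuses to run anywhere else, which a recorded macro cannot do.

```python
# A server-side automation, registered once and shared by all users.
def copy_member_address(session: dict) -> str:
    # Would drive the actual copy/paste between screens here.
    return "address copied"

AUTOMATIONS = {
    "copy_member_address": {
        "expected_screen": "MEMBER_DETAIL",  # hypothetical screen name
        "action": copy_member_address,
    },
}

def run_automation(name: str, session: dict) -> str:
    auto = AUTOMATIONS[name]
    # Unlike a recorded macro, this checks context before doing anything.
    if session["current_screen"] != auto["expected_screen"]:
        raise RuntimeError(f"{name} requires screen {auto['expected_screen']}")
    return auto["action"](session)

print(run_automation("copy_member_address", {"current_screen": "MEMBER_DETAIL"}))
# address copied
```

Run the same automation from the wrong screen and it raises an error instead of blindly replaying keystrokes — that’s the control a centralized UI gives you.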
Users will ask for changes that help them be more productive. That’s great, and now you have a centralized way to provide those changes. The best part is that all users benefit, not just the one person who knew how to create a macro.
Front first, back last
Here’s the bottom line. Please your users first to improve quality, reduce costs, and improve employee satisfaction. Then, look for ways to modernize your back end. But remember, you don’t have to modernize your back end after you modernize the front end. If users are more productive and operations costs are down, you already may have achieved the results you sought. Also, you already know how to maintain the back end. So why introduce new technology, processes, and procedures if you don’t have to?
Kevin Culliton is OpenConnect’s Vice President of Product Management and Services. He’s responsible for developing products and capabilities that solve targeted market problems for global enterprises. Culliton has over 30 years of experience in the technology sector and over 20 years of experience providing enterprise software solutions to Fortune 500 companies. He works with many of America’s largest health insurance companies (one of OpenConnect’s key markets), and is an expert in automation and in analyzing processes for automation. Culliton oversees both OpenConnect’s premier mainframe robotic process automation (RPA) solution — which complies with strict industry standards and is used to automate complex processes — and OpenConnect’s operational analytics solutions. He is currently working with current and future OpenConnect customers to provide the next generation of analytics and automation solutions for back-office operations.