If you use a bank account, the healthcare system, government services or insurance, then there’s a pretty good chance your transactions were processed on mainframes. In fact, many of the Global 2000 companies still rely on this technology.
“If you have huge amounts of data that can’t be let offsite for regulatory reasons, you probably need something that looks like a mainframe,” said Mike Loukides, who is the VP of Emerging Tech Content at O’Reilly Media. “Terabytes are easy now, but if you’re storing high resolution medical imagery, you’re talking petabytes fairly quickly.”
For example, IBM, the leader in the mainframe space, saw growth in this business last year. The IBM Z platform continues to see innovation, such as cloud-native development capabilities and strong improvements in processing power.
“The IBM Z business isn’t going anywhere,” said Ross Mauri, who is the general manager of IBM Z. “In fact, since announcing IBM z15 in September 2019, 75% of the top 20 global banks are using the platform. We’re also seeing growth being driven by Linux, and Red Hat OpenShift on IBM Z. Installed Linux MIPS increased 55% from 2Q2019 to 2Q2020, while we have more than 100 clients ready to get started with Red Hat OpenShift. Finally, COVID-19 has unearthed a renewed focus on IBM Z, to help keep the world’s financial trading, retail transactions, insurance claims processing, healthcare IT, and more afloat. IBM Z clients activated a total of nearly 4x more general-purpose capacity on demand in 2Q 2020 compared to 2Q 2019.”
The Need For Innovation
There are definitely nagging issues with mainframes, though. “Development cycles are slow—typically counted in quarters or years,” said Jedidiah Yueh, who is the founder and CEO of Delphix. “Tech giants, in contrast, release new software thousands of times a year. It’s like a toddler running a race against Usain Bolt.”
In other words, this poses a big problem for larger companies that need to fend off disruptors and find ways to better cater to customer needs. So what can be done? Well, first of all, a wholesale replacement of mainframes is probably not a viable option. It would be extremely expensive, time-consuming and far from risk-free.
“Using serverless, a certain subset of mainframe tasks can have their operating budgets shrunk to almost nothing,” said Nočnica Fee, who is the developer advocate at New Relic. “A few stories have floated around the conference circuit of old services hosted on AS/400 mainframes replaced to great effect. However, these stories tend to have several things in common. For example, the task being performed was periodic but brief. When evaluating all records in a database every night for notifications or status updates, the data layer was readily available for a specific cloud function. That is, security and data sovereignty concerns were already addressed. Also, the mainframe in question had no other purpose, meaning the moving of this periodic task meant the whole system could be retired.”
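The pattern Fee describes, a brief periodic job whose data layer is already reachable from the cloud, maps naturally onto a scheduled serverless function. Here is a minimal sketch in Python; the record shape, staleness rule, and handler name are hypothetical stand-ins, not details from the article:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical nightly job: scan all records, flag those that need a
# status update, and emit notifications. In production this logic would
# run as a cron-triggered cloud function against a real database.

STALE_AFTER = timedelta(days=30)

def find_stale_records(records, now=None):
    """Return the IDs of records not updated within STALE_AFTER."""
    now = now or datetime.now(timezone.utc)
    return [r["id"] for r in records if now - r["updated_at"] > STALE_AFTER]

def nightly_handler(event, context, fetch=None, notify=print):
    """Entry point in the shape of an AWS Lambda handler.

    `fetch` and `notify` are injected here so the data layer and the
    notification channel can be swapped without touching the job logic.
    """
    records = fetch() if fetch else []
    stale = find_stale_records(records)
    for record_id in stale:
        notify(f"record {record_id} needs review")
    return {"checked": len(records), "flagged": len(stale)}
```

Because the task runs briefly once a night, the function incurs compute cost only for the minutes it actually executes, which is the budget shrinkage Fee describes.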
For the most part, the strategy is likely to involve a hybrid model. It will be about making strategic compromises.
“One of our customers in the financial services industry shifted their mainframe outlook from cost reduction to create a long-term strategy for the mainframe,” said John McKenny, who is the Senior Vice President and General Manager of ZSolutions at BMC. “Realizing the platform’s strength in resiliency, they added a new mainframe for their environment and expanded overall capacity to handle their business-critical applications, while leveraging the cloud to support their front-end applications. They also invested in modernizing their mainframe development toolsets to attract new programmers and make it easier for them to develop on the platform.”
Gil Peleg, who is the founder and CEO of Model9, agrees with this. His company develops technology to migrate mainframe data to any cloud or on-premises storage platform.
“Considering that innovation happens today in the cloud first, both ‘Big Iron’ and public cloud providers operating at ‘hyperscale’ have an important role to play. We will continue to see innovation around hybrid cloud models that feel more like a mashup of monolithic and distributed architectures than a dramatic changing of the guard. The public cloud footprint is no longer limited to a handful of data center regions. The new cloud frontier is expanding to cover the network’s edge with solutions like AWS Outposts and Azure Stack that bring the cloud closer to the mainframe. The future will favor solutions that facilitate the adoption of these hybrid models.”
All in all, the mainframe world will continue much as it has. If anything, it will probably grow as hybrid approaches show results.
“We like to say that the mainframe, legacy, or back-office applications hold an accumulation of 30 to 40 years of business process and regulatory compliance evolution that is near impossible to replace,” said Lenley Hensarling, who is the chief strategy officer at Aerospike. “Why would you? Put your money in driving new capabilities that tie into those systems and add real value in terms of customer satisfaction, customer understanding, and increased efficiency in sales, supply chain, and product innovation.”
Tom (@ttaulli) is an advisor/board member to startups and the author of Artificial Intelligence Basics: A Non-Technical Introduction, The Robotic Process Automation Handbook: A Guide to Implementing RPA Systems and Implementing AI Systems: Transform Your Business in 6 Steps. He also has developed various online courses, such as for the COBOL and Python programming languages.