How a US bank modernized its mainframe applications with IBM Consulting and Microsoft Azure
As organizations strive to stay ahead of the curve in today’s fast-paced digital landscape, mainframe application modernization has emerged as a critical component of any digital transformation strategy.
In this blog, we’ll discuss the example of a US bank that embarked on a journey to modernize its mainframe applications. This strategic project has helped it transform into a more modern, flexible and agile business. In looking at the ways in which it approached the problem, you’ll gain insights into the opportunities and challenges presented by mainframe modernization and some of the strategies that can help your organization achieve success.
The bank is headquartered in the Midwest, with branches in large regional cities. It offers a full range of personal and commercial banking services, including checking and savings accounts, loans, credit cards, mortgages and wealth management services. The bank has a strong focus on innovation, using technology to enhance the customer experience and streamline banking processes. Recently, the bank has introduced a range of digital tools, including mobile deposit, online account opening and personalized financial management tools, and is in the process of building a market-leading investment and trading platform.
The bank’s digital architecture is built around a powerful mainframe system that serves as the central hub for all banking operations. The mainframe is responsible for processing transactions, managing customer accounts and providing the core functionality that powers the bank’s services. To ensure high availability and scalability, the mainframe is supported by a cluster of servers that work together to handle the bank’s computing needs. This architecture allows the bank to process a high volume of transactions without sacrificing performance or reliability.
In addition to its mainframe, the bank has a strong relationship with Microsoft and leverages Microsoft Azure cloud platform to extend its IT infrastructure. Azure provides it with a scalable, secure and cost-effective way to deploy new applications and services, while also providing seamless integration with the bank’s existing systems. By using Azure, the bank can quickly respond to changing market conditions and customer needs, while also reducing its IT costs and improving its overall operational efficiency. The bank’s close relationship with Microsoft also provides access to cutting-edge technologies and expertise, allowing the bank to stay at the forefront of the financial services industry.
Improving the bank’s backup strategy with Azure
The bank relies heavily on its mainframe systems to store and process vast amounts of data. As the storage administrator, Matthew has noticed that storage costs, particularly backup costs, have grown significantly over recent months. In a recent conversation with the chief information officer (CIO), he learned that demand for storage will grow further as the bank develops and releases its new investment and trading platform. He is increasingly concerned about the need for more storage in the bank’s data centers, and about the cost of backup, given the stricter regulatory storage and backup requirements that will come with the new trading platform.
The bank has a strong partnership with Microsoft, and Matthew has been looking at different options for backup. He recently became aware of the IBM® Cloud Tape Connector for z/OS® (CTC), which would allow the bank to easily transfer tape data to Azure Object Storage, taking advantage of the cloud’s scalability and cost-effectiveness. He particularly liked the fact that the Tape Connector requires no complicated gateway hardware to purchase, configure and maintain; as a software-delivered service, it connects seamlessly with the bank’s Azure environment. Matthew ran several tests and was pleased with the results: he was able to back up data, restore it and test disaster recovery using an efficient, cross-region, distributed architecture for data backup.
With a cost-effective, scalable and reliable backup strategy now in place, the bank was ready to move forward with its plans to develop and release the new internet banking services.
Improving developer productivity on the mainframe
Meet Kate. Kate is a new developer at the bank who is working with Tom to develop a new internet banking application. Tom asks Kate to update the customer balance inquiry module to provide the customer with balance information across multiple accounts. She wants to start from the current balance inquiry module in the existing system, but she is not very familiar with COBOL. Kate can use the IBM watsonx Code Assistant™ for Z (WCA4Z) refactoring assistant to visualize the existing COBOL code and its dependencies and identify the module where the balance inquiry resides. WCA4Z can then help her pull together all the relevant code artifacts. Once the required source components are identified, Kate uses generative AI (gen AI) to convert the COBOL code to object-oriented Java.
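To illustrate the outcome, here is a minimal sketch of what a converted, extended balance-inquiry module might look like in object-oriented Java. The class and field names are hypothetical for illustration, not actual WCA4Z output:

```java
import java.math.BigDecimal;
import java.util.List;

// Hypothetical shape of the converted module; names are illustrative only.
public class BalanceInquiry {

    // Each COBOL account record becomes a small immutable Java type
    public record Account(String accountId, BigDecimal balance) {}

    // The new business requirement: report the balance across multiple accounts
    public static BigDecimal totalBalance(List<Account> accounts) {
        return accounts.stream()
                .map(Account::balance)
                .reduce(BigDecimal.ZERO, BigDecimal::add);
    }

    public static void main(String[] args) {
        List<Account> accounts = List.of(
                new Account("CHK-1001", new BigDecimal("125.50")),
                new Account("SAV-2002", new BigDecimal("980.00")));
        System.out.println("Total balance: " + totalBalance(accounts));
    }
}
```

BigDecimal is used rather than floating point so that monetary amounts keep exact decimal precision, mirroring the behavior of COBOL’s packed-decimal arithmetic.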
Kate extends the Java code with the new functionality required by the business and now wants to help ensure proper integration with the rest of her mainframe application environment. She commits her new and updated code to a Git-based repository, such as Azure Repos or GitHub. To keep deployments consistent and fast, the repository is configured to trigger Azure DevOps pipeline builds, which automate the deployment operations. The code is checked for quality, compiled and placed in an artifact repository for future deployment.
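A pipeline of this kind might be declared in an azure-pipelines.yml file along the following lines. This is a minimal sketch; the task names, paths and artifact name are assumptions for illustration, not the bank’s actual configuration:

```yaml
# Hypothetical Azure DevOps pipeline: build on every push to main,
# run quality checks, and publish the result as a deployable artifact.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  # Compile the Java application and run its tests and quality checks
  - task: Maven@3
    inputs:
      mavenPomFile: pom.xml
      goals: verify

  # Store the build output in an artifact repository for future deployment
  - publish: $(System.DefaultWorkingDirectory)/target
    artifact: banking-app
```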
Adopting cloud-native development
Fast forward a few years. Kate is now a cloud-native developer who has been building cutting-edge applications for years. Using the latest tools and technologies, she has helped the bank lead the industry in offering new and innovative services to customers. From her previous experience at the bank, she knows that mainframes still host core systems. She and her team consider this platform to be legacy technology. In a recent conversation, the topic of migration came up, and the focus was on when, not if, her team would be tasked with moving key applications and services to the cloud.
Recently, she was asked by the CIO to work on a new application, which represented the next step forward in the bank’s investment and trading platform. The CIO outlined several key requirements: the application needed to be available on any device, easily handle high volumes of transactions and provide fast, almost real-time processing. It would also need extremely low latency and would have to give users detailed insights based on the bank’s rich information, along with a detailed analysis of portfolio and individual stock performance. Her leadership team was insistent that time was of the essence if the bank was to compete in the new markets the platform would unlock, and that the development team would need to move quickly.
During the planning phase, Kate and her team were made aware of a mainframe application that hosted and maintained much of the information she needed for the new platform. Initially, the team thought that migrating this application to the cloud would be the best approach. However, after some initial investigations, they realized that migrating the complex and highly regulated application would take many months if not years.
Kate and her team decide that it would be much more cost-effective and faster to leave the mainframe application as it is. They decide to work with the mainframe development team to build a set of REST (representational state transfer) APIs that allow the Azure application to interface with the mainframe and securely request information. This approach lets her team focus its time and effort on customer value. They follow these steps:
- Kate and her team identify the information they need and work with the mainframe team to identify the APIs that will allow them to access the necessary data and business logic.
- The mainframe team creates a set of RESTful APIs and uses a tool, such as IBM API Connect® or IBM z/OS® Connect, to create API definitions and generate API documentation and client code.
- They set up an API Gateway in Azure API Management to configure an entry point for the Azure application to access the mainframe APIs. Kate and her team get started on the Azure application using the generated API documentation and client code provided to make API calls to the mainframe system.
- Because Secure by Design is a critical part of the bank’s approach to development, the team helps ensure that an identity solution is put in place from the beginning. This strategy enables the team to handle authentication and authorization correctly and to ensure that only authorized users and applications, such as the bank’s new investment and trading platform, can access the mainframe system through the APIs.
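From the Azure application’s side, the steps above can be sketched in Java. The snippet below shows how a client might build a request to the mainframe APIs through the API Management gateway; the gateway URL, resource path and credential values are hypothetical, for illustration only:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

public class MainframeApiClient {

    // Hypothetical API Management gateway endpoint fronting the mainframe APIs
    static final String GATEWAY = "https://apim-bank.azure-api.net/core-banking";

    // Build a GET request for a customer's balances. Authentication combines an
    // API Management subscription key with a bearer token from the identity solution.
    static HttpRequest balanceRequest(String customerId, String subscriptionKey, String bearerToken) {
        return HttpRequest.newBuilder()
                .uri(URI.create(GATEWAY + "/customers/" + customerId + "/balances"))
                .header("Ocp-Apim-Subscription-Key", subscriptionKey)
                .header("Authorization", "Bearer " + bearerToken)
                .timeout(Duration.ofSeconds(5))
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = balanceRequest("C-1001", "demo-key", "demo-token");
        System.out.println(request.method() + " " + request.uri());
    }
}
```

The request would then be sent with java.net.http.HttpClient; the API Management gateway validates the subscription key and token before routing the call through to the mainframe’s REST endpoint.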
After several weeks of development, Kate’s team completes the integration between the Azure application and the mainframe system. They test the application thoroughly to help ensure that it works as expected, communicates with the mainframe correctly, and that the mainframe system can handle the increased load. The result is a modern web application that provides customers with a seamless user experience while leveraging the power of the mainframe system. Customers can interact with the application without realizing that they are accessing resources on the mainframe.
Integrating an Azure application with a mainframe system using APIs requires careful planning, collaboration between teams, and a solid understanding of both Azure and mainframe technologies. By using APIs to bridge the gap between the old and the new, developers like Kate can create modern applications that offer the best of both worlds, leveraging the power of the mainframe while providing a seamless user experience.
Using Azure Logic Apps
After her team had built the new platform, a team member told Kate about Azure Logic Apps and how it could potentially help her automate and streamline some of the workflows in the new platform. Curious about this possibility, she decided to give it a try. She started by creating a logic app that would automate the process of transferring data between the mainframe and a cloud-based storage system. Then, she used the Azure Logic Apps designer to create a visual representation of the workflow, which made it easy to understand and modify.
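As a sketch, a Logic Apps workflow definition for this kind of scheduled transfer might look like the following. The schedule, endpoint and connection names are assumptions for illustration, not the bank’s actual workflow:

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "Hourly_schedule": {
      "type": "Recurrence",
      "recurrence": { "frequency": "Hour", "interval": 1 }
    }
  },
  "actions": {
    "Fetch_mainframe_extract": {
      "type": "Http",
      "inputs": {
        "method": "GET",
        "uri": "https://apim-bank.azure-api.net/core-banking/extracts/latest"
      }
    },
    "Copy_to_blob_storage": {
      "type": "ApiConnection",
      "runAfter": { "Fetch_mainframe_extract": [ "Succeeded" ] },
      "inputs": {
        "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
        "method": "post",
        "path": "/datasets/default/files",
        "queries": { "folderPath": "/mainframe-extracts", "name": "extract.json" },
        "body": "@body('Fetch_mainframe_extract')"
      }
    }
  },
  "outputs": {}
}
```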
Next, Kate used Azure Logic Apps connectors to integrate the mainframe with other services, such as Azure Functions and Azure API Management. This allowed her to extend the capabilities of the mainframe and create new functionality that wasn’t possible before. She was able to create a web API that allowed customers to access their account information and perform transactions in real time.
In a discussion with Tom on the mainframe team, she also suggested that Azure Logic Apps could be used to help automate the testing and deployment of mainframe applications. Together, they helped to create a continuous integration and continuous deployment (CI/CD) pipeline that automatically tested and deployed the new code whenever changes were made. This saved Tom’s team a significant amount of time and reduced the risk of errors.
As a result of these changes, the user interface was now modern and intuitive, and the processes were automated and efficient. Customers were able to access their account information and perform transactions in near real-time, and the bank saw a significant increase in customer satisfaction and efficiency.
Azure Logic Apps provides another option for developers, allowing them to modernize their mainframe applications without expensive hardware and software upgrades.
Working together as a team
In today’s fast-paced world of software development, it’s not uncommon for multiple teams to work on separate projects that depend on each other. However, without proper coordination, these projects can often suffer from miscommunication, delays, and ultimately, failure. Mainframe code development and release into production has been historically much slower than Azure due to complexity, regulation and the need for formal change control—whereas in an Azure world, things can and do change on a rapid basis.
Consider this scenario. Kate and Tom each have different deadlines for the new investment and trading platform, with new features and capabilities to be added to the front-end application. They also have to make some corresponding changes to the back-end data processing and interfaces (APIs) on the mainframe side to allow the front end to gain access to the data. It’s important that these changes are aligned as deadlines are tight and there’s a need to avoid unnecessary testing and rework.
To avoid the challenges with interdependencies and differences in deployment schedules, Kate and Tom create a shared backlog of work, which includes all the tasks and milestones for the mainframe and the front-end Azure application. They prioritize the tasks based on their importance and dependencies and assign specific deadlines to each one. They also establish a system of continuous integration and delivery, where they would regularly share their work and provide feedback to each other.
In adopting DevOps processes across both cloud native and mainframe applications and viewing the success of the project as a shared commitment, Kate and Tom were able to:
- Improve communication: Regular meetings and shared milestones helped Kate and Tom stay on the same page. They could communicate effectively and adjust as needed.
- Increase efficiency: By prioritizing tasks and aligning their efforts, Kate and Tom were able to complete their projects more efficiently. They could focus on the most critical tasks and avoid wasting time on nonessential work.
- Reduce risk: Continuous integration and delivery allowed Kate and Tom to identify and fix issues early on. They could catch bugs and conflicts before they became major problems.
- Improve quality: By collaborating and sharing their work, Kate and Tom were able to deliver high-quality projects. They could review each other’s work and provide feedback, which helped them improve the overall quality of their projects.
Kate and Tom’s story highlights the importance of DevOps processes and collaboration in software development. It also shows the importance of using common or standardized tooling to “break down silos” between development teams, especially with solutions that cross platforms. By aligning their milestones and efforts, they were able to deliver a successful project that worked seamlessly. Their experience shows that when developers work together and communicate effectively, they can overcome even the most challenging obstacles and deliver high-quality software that meets the needs of their customers.
Accelerating mainframe application modernization
IBM Consulting® can help you design a hybrid cloud strategy with a single integrated operating model, which includes common agile practices and interoperability of applications between IBM® Z® and Microsoft Azure. IBM places applications and data on the right platform and helps keep them secured, encrypted and resilient. It’s an agile, seamlessly integrated hybrid cloud platform with IBM Z at the core. IBM Consulting works to accelerate mainframe application modernization, develop Azure applications, and drive IT automation with IBM Z and Azure. The joint approach allows businesses to innovate faster, maximize investment value, and reduce the need for specialized skills.
Learn how you can unlock the full potential of your mainframe and drive business success. Discover the benefits of mainframe modernization and schedule a conversation with our experts to explore how IBM can help you achieve your goals.
Learn more about mainframe modernization