What to consider when modernising mainframes

Before I outline key considerations in any mainframe modernisation strategy, let’s confirm why companies must consider modernising their mainframe.

The straight answer to this question is that many businesses are now implementing mainframe modernisation strategies because doing so lets them take advantage of the latest innovations in the cloud without disrupting business-critical mainframe processing and applications.

Enterprises have, for decades, relied on mainframe computers to deliver mission-critical applications. Government agencies, healthcare organisations and businesses of all kinds – particularly in finance and insurance – use mainframes to manage their most sensitive and valuable data.

Here’s a statistic that supports the case for modernisation: 69% of IT decision-makers report that the inflexibility of the mainframe limits the IT department’s ability to innovate. That should be enough to prompt companies to drive their tech teams to come up with a strategy. But there are a few things to put on the radar first.

Mainframes are far from a stagnant technology: research shows that the global mainframe market was valued at $2,094 million in 2017 and is projected to reach $2,907 million by 2025, a compound annual growth rate of 4.3%.

So, mainframes aren’t going anywhere soon because of their high performance, reliability and security. As you should by now understand, the market for them is growing. But unlocking insights from these business workhorses comes at a cost.

Forward-thinking companies want to leverage today's most advanced analytics platforms, as well as affordable, scalable cloud services. Modernising legacy systems is an essential step towards achieving that goal. For example, providing a 360-degree view of customers to the front-line support team requires real-time data replication from the mainframe and other source systems.

Things to consider

I think I have more than justified why companies need to modernise the mainframe, so now let’s look at the issues to consider when embarking on that journey.

The first is data quality. When IMS and VSAM data sources form part of a data replication project, their quality should be evaluated in advance, and best practice dictates that copybooks and related supporting files contain an accurate description of the underlying data.
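
As a rough illustration, here is a minimal Python sketch of the kind of pre-replication check this implies. The field names, offsets and widths are hypothetical stand-ins for what a real COBOL copybook would describe; it is a sketch of the idea, not a production validator.

```python
# Minimal sketch: validating fixed-width, VSAM-style records against a
# copybook-derived layout before replication. Field names and widths are
# hypothetical; a real project would parse the actual COBOL copybook.

from datetime import datetime

# (field name, start offset, length) taken from an illustrative copybook
LAYOUT = [
    ("customer_id", 0, 10),
    ("account_type", 10, 2),
    ("open_date", 12, 8),    # YYYYMMDD
    ("balance", 20, 12),     # numeric, two implied decimal places
]

def parse_record(raw: str) -> dict:
    """Slice a fixed-width record into named fields."""
    return {name: raw[start:start + length] for name, start, length in LAYOUT}

def quality_issues(rec: dict) -> list[str]:
    """Flag basic data-quality problems before the record is replicated."""
    issues = []
    if not rec["customer_id"].strip():
        issues.append("missing customer_id")
    try:
        datetime.strptime(rec["open_date"], "%Y%m%d")
    except ValueError:
        issues.append(f"invalid open_date: {rec['open_date']!r}")
    if not rec["balance"].strip().lstrip("-").isdigit():
        issues.append(f"non-numeric balance: {rec['balance']!r}")
    return issues

# Usage: a well-formed record passes, a malformed one is flagged.
good = "CUST000001CH20210315000000012550"
bad  = "          CH2021031X0000000125AB"
print(quality_issues(parse_record(good)))  # []
print(quality_issues(parse_record(bad)))   # three issues flagged
```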

Democratisation of data means making all data, including combinations of it, accessible to all users within a governance framework that provides security without limiting agility.

It’s important to remember that data has limited value if only a small number of highly technical people can access, understand and utilise it.

The next matter to factor into the modernisation strategy is devising a secure, enterprise-scale repository and catalogue of all the data assets the business has available for analytics. This gives data consumers a single go-to destination to find, understand and gain insights from all enterprise data sources.

Such a repository should also include data preparation and metadata tools that streamline the transformation of raw data into analytics-ready assets.
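
To make the idea concrete, here is a minimal, hypothetical sketch of the metadata such a catalogue might record for a single asset. The fields shown are illustrative; real catalogue products expose far richer models and governance controls.

```python
# Minimal sketch of a catalogue entry for one data asset. The fields
# (owner, source system, sensitivity, refresh cadence) are illustrative
# of the metadata an enterprise catalogue typically records.

from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    name: str                 # business-friendly asset name
    source_system: str        # e.g. "DB2 z/OS", "VSAM", "IMS"
    owner: str                # accountable data owner
    sensitivity: str          # governance classification
    refresh: str              # how current the data is kept
    tags: list[str] = field(default_factory=list)

catalogue: dict[str, CatalogueEntry] = {}

def register(entry: CatalogueEntry) -> None:
    """Add an asset so data consumers can find and understand it."""
    catalogue[entry.name] = entry

register(CatalogueEntry(
    name="customer_accounts",
    source_system="DB2 z/OS",
    owner="retail-banking-data",
    sensitivity="confidential",
    refresh="continuous (change data capture)",
    tags=["customer-360", "analytics-ready"],
))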

Data integration (DI) can help to unlock the value of data in legacy sources, including DB2 z/OS, IMS and VSAM.

DI can continuously ingest incremental datasets from many transactional sources into data lake and data warehouse environments, delivering up-to-date data through enterprise-class, log-based change data capture.
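
To illustrate the pattern, here is a minimal Python sketch of log-based CDC fan-out: each change is captured once from the source log and delivered to multiple targets. The change-event shape and the target writers are illustrative stand-ins, not any specific product’s API.

```python
# Minimal sketch of the "capture once, deliver to many targets" pattern
# behind log-based change data capture (CDC). A real implementation would
# read the database log through a CDC tool rather than this generator.

import json
from typing import Callable, Iterable

ChangeEvent = dict  # e.g. {"table": ..., "op": "INSERT"|"UPDATE"|"DELETE", "row": {...}}

def read_change_log() -> Iterable[ChangeEvent]:
    """Stand-in for a CDC reader tailing the source system's log."""
    yield {"table": "ACCOUNTS", "op": "UPDATE", "row": {"id": 42, "balance": 1250}}
    yield {"table": "ORDERS", "op": "INSERT", "row": {"id": 7, "status": "NEW"}}

def to_data_lake(event: ChangeEvent) -> None:
    print("lake <-", json.dumps(event))   # e.g. append to cloud object storage

def to_warehouse(event: ChangeEvent) -> None:
    print("dwh  <-", json.dumps(event))   # e.g. merge into a warehouse table

TARGETS: list[Callable[[ChangeEvent], None]] = [to_data_lake, to_warehouse]

# Each change is captured once from the log and fanned out to every target,
# so no target issues its own direct (MIPS-consuming) queries at the source.
for change in read_change_log():
    for deliver in TARGETS:
        deliver(change)
```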

Companies can increase business agility and flexibility by aligning IT and business operations, and enable ‘re-platforming’ by migrating legacy data to innovative cloud alternatives. They can minimise the impact on production systems and reduce costly mainframe resource consumption by eliminating direct queries and capturing changes once while delivering to multiple targets.

Moreover, an automated, no-code approach to data pipeline creation saves time, gives better visibility of the data landscape through enterprise-wide, secure and governed catalogues, and safely democratises data across every line of business.

Back-office IT workload and line-of-business support burdens can be reduced while maintaining flexibility and much-needed data security.

The top mainframe DI challenges

When enterprises attempt to integrate their mainframe data into a broader data environment, they often run into a few common roadblocks, namely:

Batch file transfer: Scheduled scripts or mainframe jobs extract data from the ‘big iron’ and write the results into large files that must be transferred over the network. Because the data is not delivered in real time, it is often stale by the time it reaches its destination.

Direct database query: Businesses seeking to integrate mainframes into a broader analytical environment tend to take a brute-force approach, but each new query consumes additional processing capacity and adds expensive MIPS (millions of instructions per second) charges to the monthly bill.

Real-time data streaming: To achieve this, data must be moved immediately whenever changes occur. Without the correct DI architecture, supporting the broad, deep and fast analysis businesses require today takes a significant amount of manual tuning.

Just to recap, large corporations have relied on mainframes for half a century to manage their most valuable and sensitive data.

From order processing to financial transactions, production and inventory control through to payroll, mainframes continue to support mission-critical applications.

As a result, mainframe data must be integrated into modern, data-driven, analytical business processes and the environments that support them.

Moreover, organisations cannot take the brute-force approach because they need to unlock the value of the mainframe data without increasing MIPS consumption – upon which mainframe billing systems are based.

So, how can companies affordably leverage mainframe data continuously for business analysis?

Here’s one way: offload mainframe data to modern data lake platforms such as Apache Hadoop, Azure Data Lake Storage (ADLS Gen2) or the Databricks Unified Data Analytics Platform, which open up new analytics possibilities and insights.

Integrating with these new environments requires a fresh approach that keeps data current and available, without adding complexity or prohibitive costs.
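
As a rough sketch of what landing offloaded data in such a platform can look like, the snippet below writes replicated records as date-partitioned Parquet files using pandas and pyarrow. The local path stands in for a cloud location such as an ADLS Gen2 container, which would need the appropriate filesystem driver configured; the rows and column names are illustrative only.

```python
# Minimal sketch of landing offloaded mainframe records in a data lake as
# columnar Parquet files, partitioned by load date. Requires pandas and
# pyarrow; the local path stands in for a cloud location such as ADLS Gen2.

import pandas as pd

# Records already replicated off the mainframe (illustrative rows).
records = [
    {"customer_id": "CUST000001", "account_type": "CH", "balance": 125.50},
    {"customer_id": "CUST000002", "account_type": "SV", "balance": 980.00},
]

df = pd.DataFrame(records)
df["load_date"] = "2024-01-15"   # partition column for incremental loads

# Writes load_date=2024-01-15/part-*.parquet under the target directory,
# ready for query engines such as Spark or Databricks to read.
df.to_parquet("datalake/customer_accounts", partition_cols=["load_date"])
```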
