Is there a future for Analytics on IBM Z?

In a series of blogs written by Aymeric Affouard, Guillaume Arnould, Khadija Souissi, and Leif Pedersen, we will share our point of view on where analytics is heading and whether IBM Z has a role to play in analytics in the future.

 

One label that IBM Z has carried for many years is that it is not a platform for running analytics workloads; it is known for being able to run only transactional workloads. Another label is that the platform is very expensive and unable to support today’s modernized workloads such as mobile, web, APIs, and intelligent applications that take advantage of AI – or, more specifically, the part of AI better known as machine learning.

 

Let’s first look at what influences the direction applications are taking. Today’s IT landscape is changing very fast, and being able to adjust the IT landscape, applications, business models, and so on is a key performance parameter. The companies that master the art of adjusting quickly and creating new business models are the ones that will lead their industries and grow their business faster than their competitors. Companies that react slowly to adopting new technology and creating new business models, on the other hand, are left behind and have problems retaining and growing their business.

 

Being relevant to your customers is another key influencer. Today’s customers expect campaigns, special recommendations, and suggestions to be relevant and of interest to them; if they are not, customers prefer not to receive them at all. When customers get annoyed by irrelevant offers, there is a risk they will move their business elsewhere. In other words, companies that can create a personalized experience for their customers, by using the information and knowledge they already have about the customer, are in a better position to build a trusted relationship and retain the customer.

 

Another key influencer is that customers today use many different kinds of technology to communicate with companies: mobile phones, tablets, computers, and so on. Real face-to-face time with customers is less than it was 10 years ago, and it will continue to shrink. This creates the challenge of using that face time correctly and optimally, and it creates a need to keep the knowledge about each customer up to date.

 

Most of today’s transactional workload still runs on the mainframe – let’s call this the System of Record. Most of today’s analytics workload runs off the mainframe – let’s call this the System of Insight. Many companies also engage with their customers via social media and other systems, which may run in the cloud, on mobile devices, and so on – let’s call these the Systems of Engagement.

Companies that can utilize the information from all three types of systems – Systems of Record, Systems of Insight, and Systems of Engagement – are able to create new business opportunities and to lead their industry.

Having separate systems across different platforms and technologies creates dependencies. One of them is making the data available for analytics processing at the speed the business needs – and, just as importantly, making the output of the analytics processing available to the transactional workload at that same speed. It also creates dependencies around compliance and around the ability to change the data being exchanged between the databases and analytics systems on the different platforms.

 

Let’s look at a common analytics architecture for large companies running their transactional workload on the mainframe. Over many years, they have built an analytics architecture in which the transactional data is copied from the mainframe to an analytics environment on a distributed system. This could be a specialized analytics system like Netezza or Teradata, a relational database like Microsoft SQL Server or Oracle, or simply a data lake based on some variation of, for example, Hadoop. Let’s call this first analytics environment Level 1.

On Analytics Level 1, the data latency is near real time, e.g. from a few minutes to 12 hours old. This is where the most time-critical analytics workloads run. Analytics Level 1 also provides data to Analytics Level 2, a less time-critical analytics level consisting of data marts, data cubes, and so on, where the latency of the data can be, for example, more than 12 or 24 hours. The data needed on Analytics Level 2 can also be copied directly from the mainframe. It is very common to have multiple analytics levels; some companies have more than 15 levels, where data is copied from either the mainframe or one of the other analytics levels.
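To make the latency picture concrete, here is a minimal Python sketch of how the age of the data accumulates with every copy hop. All level names and latency figures are hypothetical, chosen only to illustrate the idea:

    # Each entry: level -> (the source it is copied from, copy latency in hours).
    # Names and numbers are illustrative, not from any specific installation.
    COPY_TOPOLOGY = {
        "level1_warehouse": ("level0_mainframe", 0.25),  # near-real-time feed
        "level2_datamart":  ("level1_warehouse", 24.0),  # daily batch copy
        "level3_cube":      ("level2_datamart",  24.0),  # another daily copy
    }

    def end_to_end_latency(level):
        """Worst-case age, in hours, of data at `level` relative to Level 0."""
        total = 0.0
        while level in COPY_TOPOLOGY:
            source, hop = COPY_TOPOLOGY[level]
            total += hop
            level = source
        return total

    for level in COPY_TOPOLOGY:
        print(f"{level}: data may be up to {end_to_end_latency(level)} hours old")

Every additional level adds its own copy latency, which is why workloads that need fresher data keep migrating toward Level 1 or the source itself.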

 

This creates a challenge in keeping track of where data comes from and ensuring that the data is valid. Controlling and tracking who can access the data is another challenge. It also creates dependencies when changes to the original data sources are needed: every place the data source has been copied to most likely needs to be changed as well, and at the same time as the original data source. When changing a data source, it may also be necessary to change all of the ETL processing that the data source is involved in. These changes to the ETL processes can be on any analytics level the data source has been copied to, independent of which level it was copied from.
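The change-propagation problem can be sketched the same way: given the copy topology, compute everything downstream that must change along with a source. Again, all names are hypothetical:

    # copies[x] lists the targets that x is copied (or ETL-transformed) into.
    copies = {
        "level0_mainframe": ["level1_warehouse", "level2_datamart"],
        "level1_warehouse": ["level2_datamart", "level2_sandbox"],
        "level2_datamart":  ["level3_cube"],
    }

    def impacted(source):
        """Transitive closure of copy targets: everything that must change too."""
        seen, stack = set(), [source]
        while stack:
            for target in copies.get(stack.pop(), []):
                if target not in seen:
                    seen.add(target)
                    stack.append(target)
        return seen

    print(sorted(impacted("level0_mainframe")))
    # ['level1_warehouse', 'level2_datamart', 'level2_sandbox', 'level3_cube']

A single change on Level 0 ripples through every copy and every ETL job in between, which is exactly the coordination burden described above.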

 

Being in control of, and maintaining, analytics environments across multiple levels on different platforms can be very time consuming, and it slows down innovation: developing new applications, business, and services, and gaining better insight into the business through analytics.

 

Being able to adapt to business changes and to bring new, innovative solutions to market faster creates a need to explore more analytics insight sooner and with less latency in the data used for analytics processing. This has created the need to move analytics workloads from a lower analytics level to a higher one – e.g. from Analytics Level 2 to Level 1, or even to Level 0, the transactional system itself – to meet the business demand.

Many companies are considering how they can make their legacy applications more intelligent by applying analytics insight and knowledge to their transactional workload running on the mainframe. To really benefit from adding intelligence to legacy applications, it is important that it adds a new layer of relevant knowledge to the application. If it only adds old or out-of-date information to the transaction, it may create more problems than not using that analytics information at all. Exploiting new capabilities in legacy applications can help create new business, such as up-selling and cross-selling, and lower costs by minimizing the need for human involvement in each transaction.

 

Hybrid Transactional/Analytical Processing (HTAP) is one of the most important newer technologies in the analytics space. HTAP makes your transactional data available for analytics processing as soon as the data appears in the database running on the System of Record. This makes it possible to do real-time analytics on the transactional data as soon as it is stored in the database, which opens the option of using real-time analytics in your legacy applications – making them more intelligent and able to take advantage of Artificial Intelligence (AI) and machine learning technologies.
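As a sketch of what HTAP enables in practice, the following Python fragment uses IBM’s ibm_db driver to insert a transaction and immediately run an analytical aggregate over the same table, with no ETL step in between. The connection string, table, and column names are hypothetical:

    import ibm_db  # IBM's Python driver for Db2

    # Hypothetical connection details; adjust for your environment.
    conn = ibm_db.connect(
        "DATABASE=DB2PROD;HOSTNAME=zhost.example.com;PORT=446;"
        "PROTOCOL=TCPIP;UID=appuser;PWD=secret", "", "")

    # Transactional side: record a new payment as part of normal OLTP work.
    stmt = ibm_db.prepare(
        conn, "INSERT INTO PAYMENTS (CUST_ID, AMOUNT, PAID_AT) "
              "VALUES (?, ?, CURRENT TIMESTAMP)")
    ibm_db.execute(stmt, (4711, 129.95))

    # Analytical side: aggregate over the same table right away. With HTAP,
    # the freshly committed row is already visible to analytics processing.
    result = ibm_db.exec_immediate(
        conn, "SELECT CUST_ID, SUM(AMOUNT) AS TOTAL FROM PAYMENTS "
              "WHERE PAID_AT > CURRENT TIMESTAMP - 30 DAYS "
              "GROUP BY CUST_ID ORDER BY TOTAL DESC "
              "FETCH FIRST 10 ROWS ONLY")
    row = ibm_db.fetch_assoc(result)
    while row:
        print(row["CUST_ID"], row["TOTAL"])
        row = ibm_db.fetch_assoc(result)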

 

Let’s look at how many of these challenges and issues can be addressed by utilizing some of the analytics solutions available on the mainframe. One of the most important components that make it possible to mix transactional and analytics workloads on the mainframe is the Db2 Analytics Accelerator, an analytics engine that plugs into Db2 for z/OS. Db2 for z/OS and the Db2 Analytics Accelerator support HTAP, which means that as soon as data is available in Db2, it is also available in the Db2 Analytics Accelerator for analytics processing.

The Db2 Analytics Accelerator provides analytics processing capabilities that have traditionally not been practical to run on the mainframe due to their very high resource consumption.
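Whether an individual query is routed to the accelerator can be influenced per session through the CURRENT QUERY ACCELERATION special register of Db2 for z/OS. A minimal sketch, reusing the hypothetical connection and table from the previous example:

    import ibm_db

    # Hypothetical connection details, as in the earlier sketch.
    conn = ibm_db.connect(
        "DATABASE=DB2PROD;HOSTNAME=zhost.example.com;PORT=446;"
        "PROTOCOL=TCPIP;UID=appuser;PWD=secret", "", "")

    # Ask Db2 for z/OS to route eligible queries to the Db2 Analytics
    # Accelerator (other documented settings include NONE, ELIGIBLE, ALL,
    # and ENABLE WITH FAILBACK).
    ibm_db.exec_immediate(conn, "SET CURRENT QUERY ACCELERATION = ENABLE")

    # A scan-heavy analytical query (hypothetical table) of the kind that
    # would be too resource-intensive to run unaccelerated.
    result = ibm_db.exec_immediate(
        conn, "SELECT CUST_ID, COUNT(*) AS TXNS, SUM(AMOUNT) AS TOTAL "
              "FROM PAYMENTS GROUP BY CUST_ID")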

This Blog continues here:
