
In-Database AI Client Experiences with Db2 for z/OS + Demo 

Don't miss this live webcast on 2 November 2021 at 11:00 AM EST
Tom Ramey will highlight some of the key challenges facing Db2 for z/OS clients and explain how AI is a breakthrough technology that, when applied to Db2 for z/OS performance management and resiliency, can have a major impact. Tom will be joined by Benny van Straten from Rabobank and Tom Beavin from IBM. Tom Beavin will share Db2 AI use cases and host a live Db2 AI demo. Rabobank is a Dutch multinational banking and financial services company; Benny will share first-hand experiences and lessons learnt around Db2 AI for z/OS and the power of in-database AI.

What will you learn by attending this webcast?

  • Hear first-hand client feedback and experiences
  • Learn how Db2 AI improves SQL performance using machine learning based on unique patterns found when executing queries in a production environment
  • Learn how Db2 AI automatically detects SQL access path performance regressions and automatically restores the performance back to its optimal level
  • Learn how Db2 AI automatically stabilizes dynamic queries with their optimal access path, reducing prepare overhead
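Db2 AI's internal models are not public, but the access path regression detection described above can be illustrated generically. The sketch below is a plain threshold rule, not IBM's actual algorithm, and all names are illustrative: it flags a query whose recent mean elapsed time has drifted well above its historical baseline.

```python
def detect_regression(baseline_times, recent_times, factor=2.0):
    """Flag a performance regression when the mean of recent query
    elapsed times exceeds `factor` times the historical baseline mean.
    A deliberately simple stand-in for the product's learned models."""
    baseline_mean = sum(baseline_times) / len(baseline_times)
    recent_mean = sum(recent_times) / len(recent_times)
    return recent_mean > factor * baseline_mean
```

In a real system, the response to such a flag would be to fall back to the previously known-good access path, which is the "automatically restores" behavior the bullet describes.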


Tom Ramey, IBM WW Director, Data and AI on IBM Z

Benny van Straten, IT Specialist/DB2, Rabobank

Akiko Hoshikawa, IBM Distinguished Engineer

Tom Beavin, IBM Db2 AI for z/OS Development, Machine Learning and Optimization


Read more…

Business transformation and agility are critical in today’s fast-moving world. Clients demand insights faster, more often, and with greater accuracy, based on up-to-the-second transactional data that is frequently housed in mainframe systems. The dependency on technology and the rate at which information is created and consumed continue to grow exponentially. Applications must adapt rapidly to dynamic digital business models and environments. All this drives the need for flexible IT infrastructures built on a rock-solid foundation that delivers consistent dependability, performance, and security in the face of rapid change.

Our z Systems portfolio is a perfect match for delivering on these needs. It is well recognized that mainframe hardware and software technology deliver unmatched levels of quality, reliability, security, and scalability. On top of this, z/OS clients have historically moved their systems forward conservatively, which has further helped to build the mainframe’s reputation for rock-solid stability, essential for the most demanding business-critical workloads. A corollary is that the mainframe is often perceived as a stagnant platform that cannot move quickly and cannot support the agile or DevOps needs of modern applications. Nothing, of course, could be further from the truth. However, challenges do remain.

DB2 for z/OS, the mainframe’s flagship relational database product, is changing to a continuous delivery (CD) model to help further address these challenges. With CD, DB2 will deliver new features to the market faster, and in increments that will be much easier for customers to consume. Let’s take a closer look.

DB2 12 for z/OS is the latest release of DB2 and is currently in “beta” testing, or ESP (Early Support Program) testing as we call it. DB2 12 will deliver many new features for mobile, cloud, and analytics applications, while also bringing many new innovations to market to improve performance, availability, and security.

DB2 new versions have followed a historical pattern of about every three years or so, and our customers have grown comfortable with this cadence over the years. However, upgrading to a new DB2 release can be a major effort for customers. In the past, we have introduced innovations such as online rolling version upgrades (for data sharing), and APPLCOMPAT. These features allow IT groups to upgrade in a more streamlined way without having to take outages or involve application groups. But, nonetheless, a DB2 version upgrade can still be a cumbersome project. As a result of this, coupled with the conservative nature of mainframe environments, some customers don’t implement new DB2 versions until several years after GA of the product. With our traditional three-year delivery cycle, along with the version upgrade delays, it can be five to six years or more between the time that we complete the development of a new feature until the point at which that feature actually becomes available on a live system. In today’s fast changing world, this is no longer sufficient.

With CD, we will deliver new features continuously, as they become ready, on future DB2 releases. This will allow application developers and DBAs to access important new features more quickly without having to wait five to six years or more for the next DB2 version.

How will this be done so that it’s consumable and non-disruptive for customer environments?  Our approach to CD must meet or exceed the quality and stability requirements that the z Systems customer base demands, while making the delivery of new features consumable. Customers will receive defect fixes and new features in the same stream. They will be able to apply their DB2 maintenance upgrades just like they always have, including the ability to roll in maintenance upgrades across their data sharing groups while keeping the databases continuously available. In fact, this should become much easier because there will be fewer APARs with ++HOLD actions caused by toleration APARs (the new “function level” concept will ensure that necessary maintenance is applied across the DB2 group, therefore removing the need for many of these existing ++HOLD actions).

What is different is that the new maintenance will contain new features that are initially dormant. The customer can choose when to activate the new features via a new system activation command. APPLCOMPAT controls will be provided to ensure that applications remain stable and to allow for controlled exposure to the newly activated features. We will provide easy-to-access documentation on which new features are included in which function levels. We will work closely with ISVs to ensure that upcoming features are effectively communicated ahead of time so that the overall DB2 ecosystem remains stable as DB2 changes are incorporated.
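Concretely, the activation and compatibility controls described above surface as a Db2 command and bind options. A rough sketch follows; the specific function levels and the package name are illustrative, not a recommendation:

```sql
-- Activate a new function level across the Db2 subsystem or data
-- sharing group (issued as a Db2 command; level shown is illustrative):
-ACTIVATE FUNCTION LEVEL (V12R1M501)

-- Pin an application to a known behavior level at bind time, so that
-- activating new function does not change its behavior:
REBIND PACKAGE(COLL1.PKG1) APPLCOMPAT(V12R1M500)

-- Dynamic SQL can likewise control its compatibility level:
SET CURRENT APPLICATION COMPATIBILITY = 'V12R1M500';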

We see the road ahead for DB2 for z/OS as an exciting and rewarding journey for both IBM and its customers. Agile, quality-focused development will allow us to continuously deliver robust, production-ready features to DB2 users much more rapidly than we could in the past, enhancing the vitality of the DB2 product and greatly easing the task of DB2 upgrades for customers.

For more information, register for our live webcast with Q&A, which we will be hosting on 27 September 2016 at 11:00 AM EST. This webcast will also be available on replay.

Read more…

IBM DB2 for z/OS in the age of cloud computing


Reliability, availability, security and mobility




• Highly virtualized server that supports mixed workloads

• Self-service capabilities in private, membership or hybrid cloud environments

• Division of support responsibilities

• Customized implementations to suit your business

• Platform foundation services for cloud use cases


Download this paper and learn:

What is cloud computing, and what is driving this market trend?


What is data as a service, and what are the drivers?


Why DB2 for z/OS?


Click here to download the full paper – over 700 downloads of this top-performing asset.

Direct link to the whitepaper, without registration: DB2%20for%20zOS%20Age%20of%20the%20Cloud%20IMW14820USEN.pdf

Read more…


May 20th LIVE WEBCAST! Big data drives “big agriculture” at John Deere with IBM DB2 for z/OS   


Register now:



• Exploiting big data opportunities from telematics data

• Reducing CPU and enhancing performance

• Planning for and designing DB2 applications


• Surekha Parekh, Worldwide Program Director, DB2 for z/OS

• Bryan Paulsen, Technology Architect, John Deere

Broadcast date:

May 20, 2014, 11:00 a.m. EDT / 3:00 p.m. GMT / 4:00 p.m. BST

Developed for:

IT and enterprise architects and managers; application developers and managers; database administrators and managers; system analysts

Technical level:

Intermediate – advanced

Big data impacts every industry, from financial markets to agriculture and beyond. John Deere, one of the world’s largest suppliers of agricultural machinery and supplies, gathers telematics data from sensors on its machinery while in use in the field. The sheer volume and velocity of this data caused John Deere to redesign one of the applications that manages this information in order to meet business and scalability demands.

Join us for this complimentary webcast as we explore big data from an agricultural perspective. You’ll learn how the John Deere application addressed the challenges of big data using IBM® DB2® for z/OS® within its solution. The session also covers some of the key capabilities of DB2 for z/OS used in the solution such as page size, clustering, partitioning, sequence objects and APPEND. Attendees will also experience John Deere’s application suite of products through two short videos.
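As a rough illustration of the DB2 for z/OS capabilities listed above, the DDL sketch below shows range partitioning, insert-at-end behavior with APPEND, and a cached sequence object. The table, columns and partition ranges are invented for the sketch, not taken from John Deere's design:

```sql
-- Illustrative DDL only; names and ranges are made up.
CREATE TABLE TELEMATICS.READINGS
      (MACHINE_ID   BIGINT        NOT NULL,
       READING_TS   TIMESTAMP     NOT NULL,
       PAYLOAD      VARCHAR(2000))
  PARTITION BY RANGE (READING_TS)
      (PARTITION 1 ENDING AT ('2014-01-01-00.00.00.000000'),
       PARTITION 2 ENDING AT ('2014-07-01-00.00.00.000000'))
  APPEND YES;   -- inserts go at the end, skipping clustering-order placement

-- A sequence with a large cache reduces contention when many
-- concurrent inserts need new identifiers:
CREATE SEQUENCE TELEMATICS.READING_SEQ AS BIGINT CACHE 1000;
```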

Join us after the webcast for a live question-and-answer session. This webcast will also be available for replay after the event.

Read more…


DB2 for z/OS is the preferred System of Record for mission-critical enterprise data. With the advent of the cloud paradigm, enterprise customers are looking for ways in which they can easily and securely expose DB2 for z/OS data to emerging Systems of Engagement. This has led to a hybrid cloud model, whereby public cloud offerings such as Bluemix need to access DB2 for z/OS data hosted on premises.

With the growing popularity of Bluemix, customers want to quickly expose DB2 for z/OS data to Bluemix developers as APIs and allow seamless integration of cloud and on-prem data. The IBM Cloud Integration service for Bluemix has made that possible by adding native support for DB2 for z/OS as an enterprise endpoint. With the Cloud Integration service on Bluemix, you can not only connect to on-prem DB2 for z/OS to browse and move data, but also create REST APIs for DB2 for z/OS CRUD operations that can be invoked easily by Bluemix applications.

To allow connectivity to DB2 for z/OS running behind a firewall, you need to install and configure an on-premises WebSphere Cast Iron Secure Connector behind the firewall such that it has direct access to DB2. The Secure Connector sets up a tunnel between the company’s secure network and Bluemix applications. Once you are connected to on-prem DB2 for z/OS using the Secure Connector, you can sync data between on-premises DB2 and a cloud database using the Data Sync capability. You can also choose not to move any data, but simply expose REST APIs for Bluemix users to access DB2 data residing on prem.
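The actual Cloud Integration API is not documented here, so the following minimal sketch only illustrates the general shape of CRUD-over-REST that such an API exposes. The verb mapping and the `/db2/...` path scheme are assumptions for illustration, not the service's real endpoints:

```python
# Hypothetical sketch: the resource paths and verb mapping below are
# illustrative assumptions, not the documented Cloud Integration API.

CRUD_VERBS = {"create": "POST", "read": "GET", "update": "PUT", "delete": "DELETE"}

def crud_request(operation, table, key=None):
    """Map a CRUD operation on a DB2 table to an HTTP verb and a
    resource path, the way a REST client for such an API might."""
    path = f"/db2/{table}" + (f"/{key}" if key is not None else "")
    return CRUD_VERBS[operation], path
```

For example, a read of row 42 in an ACCOUNTS table would map to `("GET", "/db2/ACCOUNTS/42")`, which a Bluemix application could then issue through the tunnel the Secure Connector establishes.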

You can find step-by-step instructions to access on-prem DB2 data from Bluemix at

Read more…

The IBM System z platform is known for its scalability and its unmatched security. Nonetheless, we still need to monitor the who, what, why, when, where and how of protecting information. Big data will drive increased compliance requirements as the range of data sources expands to support decision making. All of these are subject to audit, compliance, regulation and more. Taking a proactive approach could help keep incidents from becoming headline breaches. Read this paper and learn the full capabilities of the InfoSphere Guardium portfolio for IBM System z - and why there is no excuse for data breaches any longer.

I think you will enjoy this great new white paper from Ernie Mancill of IBM - our resident DB2 for z/OS security expert.

Read more…

Information has never been so critical to running a business. Organizations are having to leverage new and existing sources of information in more innovative ways than ever before – and the volume of data is growing exponentially. As the mainframe contains so much business-critical data stored in DB2 for z/OS, it becomes a primary resource for today’s business analytics and decision making. The openness of the platform enables integration with other sources of data, and its market-leading qualities of service lend themselves to becoming an information hub for big data initiatives. Join the webcast and listen to Carl Olofson, IDC analyst, share his vast knowledge and experience of the platform, the ever-increasing dependence of large enterprise customers on it, and how it is positioned and being used to deliver in the brave new world of big data. Mark Simmonds will also highlight the information management portfolio roadmap for System z as it pertains to big data.

Register to receive "The Mainframe as a Big Data and Analytics Platform", a complimentary paper written by Carl Olofson, IDC.

Read more…

High volume, high velocity, high variety – these are just a few of the information challenges that businesses face today and in the near future. Big data is still a hot topic for both IT and business leaders who recognize it as an innovative and cost-effective way of gaining deeper insight into the business and driving better decision-making. While running out of data to analyze is not a problem, too much data from too many sources could be.

The longevity and qualities of service offered by the mainframe have made it an information hub, with vast amounts of data – from transactions and in multi-terabyte data warehouses – that can feed valuable business analytics. Join us for this complimentary webcast and hear an analyst's viewpoint.

Read more…

I decided I needed to blog on big data - particularly on how the IBM System z platform and DB2 for z/OS fit into this new emerging world of "I want to know everything about everything". For so long, organizations have focused on getting more out of the core data they already have - most of which is highly structured data, rich in content (I'm talking about the record/transaction-based data stored in databases, used by home-grown and packaged applications). It's trusted, you know where it comes from, and you understand the provenance behind it. It's estimated that 95% of Fortune 1000 companies store some of their data on System z because of its integrity and its ability to store, secure and process data, to scale, and to be resilient. But then there's all that "other stuff" - someone else's problem (now I'm referring to emails, social media, machine/sensor data, time series, geospatial data and so forth - "differently structured" data). But organizations started to realize just how valuable this "other stuff" could be. It could potentially provide different insights and perspectives on who a customer is, what their real needs and wants are, and how they "feel" about a service, a product or your company.

Over the coming months I'm going to take you on my journey as I discover more about the realities of big data and what it means for businesses, governments and - you and me - the people. Let me start by stating this: yes, it's a paradigm, a strategy, but do you suddenly stop what you're doing today and start "big data" projects? Eh... no. In all likelihood you are probably doing it already - particularly if you have the System z platform. So what is big data, I hear you ask? In a nutshell, it's your ability to process, integrate and understand data from anywhere that is relevant to your business. Of course, it's not without its challenges. So why do it? Because when it comes to doing business, big data augments and expands on what you already know about the market, a product, a customer, and so on. The more we know, the better we can manage risk and costs and identify opportunities for growth - that's big data in a nutshell.

So stay with me as I show why DB2 for z/OS and System z can be integral to the success of big data in the enterprise and what we are doing to help you make big data a reality. Think BIG. Think z.

Read more…