
IDUG North America 2022 | Boston, MA | July 11-14  REGISTER NOW

 


 

Registration is Open Now!

IDUG's 2022 North America Db2 Tech Conference will be held at the Sheraton Boston on July 11-14, 2022, with some optional pre-conference events on July 10, 2022.

This year’s North America event is a hybrid conference: you can register to attend on-site or to participate online. Register now for the conference, then (if you will be participating on-site) reserve your lodging at the Sheraton Boston!

Conference Sponsorships Available... 

Connect with your customers at IDUG 2022 NA! Click here for the sponsor prospectus that explains exhibit and marketing opportunities to share your expertise with our audience. IDUG will be bringing together the global Db2 community for four days of in-person exploration and collaboration in July. Reach out to us at cyndi@teamycc.com to learn about all our opportunities.

Schedule & Speakers Announcements Coming Soon...

IDUG’s North America Conference Planning Committee (CPC) is finalizing the schedule and speakers for IDUG 2022 NA. If you have any questions about the schedule or speakers, please reach out to CPC member Chris Muncan at chris.muncan@gmail.com.

 

Read more…

Tablespaces: where exactly is my data?

By Mark Gillis

This should be pretty straightforward: you can look in the SYSCAT.TABLES system catalogue view and find references for the data (TBSPACE), indexes (INDEX_TBSPACE) and large objects (LONG_TBSPACE). But DB2 throws a few curveballs here: partitioned tables (where the data, indexes and LOBs can be in multiple tablespaces), particular types of index that don’t seem to be in the catalogues at first glance, and so on. Let’s see if we can put something together that shows the complete picture.
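As a rough starting point (a hedged sketch against the Db2 LUW catalogue views, not Mark's actual script), queries along these lines pull the basic tablespace picture together; the partitioned-table and index quirks are where his article adds the real value:

-- Base, index and LOB tablespaces as recorded against each table
SELECT TABSCHEMA, TABNAME, TBSPACE, INDEX_TBSPACE, LONG_TBSPACE
FROM   SYSCAT.TABLES
WHERE  TYPE = 'T' AND TABSCHEMA NOT LIKE 'SYS%';

-- Where each index actually lives (catches indexes that ignore INDEX IN)
SELECT I.TABSCHEMA, I.TABNAME, I.INDNAME, TS.TBSPACE
FROM   SYSCAT.INDEXES I
JOIN   SYSCAT.TABLESPACES TS ON TS.TBSPACEID = I.TBSPACEID
WHERE  I.TABSCHEMA NOT LIKE 'SYS%';

-- Range-partitioned tables: one row per data partition
SELECT DP.TABSCHEMA, DP.TABNAME, DP.DATAPARTITIONNAME, TS.TBSPACE
FROM   SYSCAT.DATAPARTITIONS DP
JOIN   SYSCAT.TABLESPACES TS ON TS.TBSPACEID = DP.TBSPACEID
WHERE  DP.TABSCHEMA NOT LIKE 'SYS%';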

 

Overview

I’ve got a little database with all sorts of weird and wacky objects. It’s tiny in terms of volume, but it includes row- and column-organized tables, range-partitioned tables, MQTs and a bunch of other stuff. Some tables have the full “INDEX IN … LONG IN …” tablespace definitions but don’t actually use them; some have none, or only some, of the tablespace directives. I want to be able to see the full picture, so how do I go about that?

Click here to find out. 


 

Read more…

In part one of this blog, James Cockayne looked at what might happen to a DB2 database whose data was encrypted in a ransomware attack.

In part two, James shares four dos and four don’ts to help protect DB2 databases from ransomware attacks.

https://www.triton.co.uk/ransomware-and-the-db2-database-part-two/


Read more…

By Mark Gillis

There are easily accessible means of checking what your stored procedure needs in the way of dependent objects (SYSCAT.ROUTINEDEP, basically). So what if you find one, or a number of, stored procedures marked as needing a REBIND and then, when you do that rebind, you get an SQL0440 indicating that “something” is missing? How do you go about checking that situation out? Find out here
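As a first pass (a hedged sketch, not the approach from Mark's article), the catalogue can at least tell you which routines are flagged and what they think they depend on; anything listed that has since been dropped or changed is a likely culprit for the SQL0440:

-- Routines Db2 no longer considers valid (candidates for a rebind/revalidate)
SELECT ROUTINESCHEMA, ROUTINENAME, SPECIFICNAME, VALID
FROM   SYSCAT.ROUTINES
WHERE  VALID <> 'Y';

-- The recorded dependencies of those routines
SELECT D.ROUTINESCHEMA, D.SPECIFICNAME, D.BTYPE, D.BSCHEMA, D.BNAME
FROM   SYSCAT.ROUTINEDEP D
JOIN   SYSCAT.ROUTINES   R
       ON  R.ROUTINESCHEMA = D.ROUTINESCHEMA
       AND R.SPECIFICNAME  = D.SPECIFICNAME
WHERE  R.VALID <> 'Y'
ORDER BY D.ROUTINESCHEMA, D.SPECIFICNAME;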


Read more…

By James Cockayne

By now I’m sure everyone has heard of the malicious practice known as ransomware attacks, where miscreants break into a corporate network and encrypt data before demanding huge sums of money to provide a method to decrypt that data and make it accessible again.  The attacks tend to be insidious – sometimes the attacker is in the network for months before they gain access to the systems they are interested in, and they are known to target backup servers as well as the primary systems to cause maximum inconvenience to the target organisation. 

Find out what an attack on a DB2 Database would look like. Continue reading James Cockayne's latest blog. 

https://www.triton.co.uk/ransomware-and-the-db2-database-part-1/

 

 

Read more…

In-Database AI Client Experiences with Db2 for z/OS + Demo 

Don't miss this live webcast on 2nd November 2021 at 11:00 AM EST.
Tom Ramey will highlight some of the key challenges facing Db2 for z/OS clients and explain how AI is a breakthrough technology that, when applied to Db2 for z/OS performance management and resiliency, can have a major impact. Tom will be joined by Benny Van Straten from Rabobank and Tom Beavin from IBM. Tom Beavin will share Db2 AI use cases and host a live Db2 AI demo. Rabobank is a Dutch multinational banking and financial services company; Benny will share first-hand experiences and lessons learnt around Db2 AI for z/OS and the power of in-database AI.

What will you learn by attending this webcast?

  • Hear first-hand client feedback and experiences
  • Learn how Db2 AI improves SQL performance using machine learning based on unique patterns found when executing the queries in a production environment
  • Learn how Db2 AI automatically detects SQL access path performance regressions and automatically restores performance to its optimal level
  • Learn how Db2 AI automatically stabilizes dynamic queries with their optimal access path, reducing prepare overhead


Tom Ramey IBM WW Director, Data and AI on IBM Z

Benny van Straten IT Specialist/DB2 Rabobank

Akiko Hoshikawa IBM Distinguished Engineer

Tom Beavin IBM Db2 AI for z/OS Development Machine Learning and Optimization

 

Read more…

In part two of DB2 12 for z/OS, DRDA Applications and Application Compatibility, Gareth Copplestone-Jones provides guidance on implementing server-side configuration.

Server-side configuration

When considering how to manage Application Compatibility – APPLCOMPAT – for your distributed applications which use the NULLID packages, the main alternative to client-side configuration (discussed in the previous article) is server-side, or DB2-side, configuration. Although not without its challenges, the advantage of server-side configuration is that much of the necessary configuration is done in one place, using system profiles. Continue reading part two
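To give a flavour of what that looks like (a hedged sketch based on the Db2 for z/OS profile tables, not an example taken from Gareth's article; the exact filtering columns and values are precisely what his article works through), a system profile pairs a row describing which connections it applies to with a row saying what to do for them:

-- Which connections the profile applies to (here, filtering on the collection)
INSERT INTO SYSIBM.DSN_PROFILE_TABLE (PROFILEID, COLLID, PROFILE_ENABLED)
VALUES (1, 'NULLID', 'Y');

-- What to do for those connections: set the special register on their behalf
INSERT INTO SYSIBM.DSN_PROFILE_ATTRIBUTES (PROFILEID, KEYWORDS, ATTRIBUTE1)
VALUES (1, 'SPECIAL_REGISTER',
        'SET CURRENT APPLICATION COMPATIBILITY = ''V12R1M500''');

-- The rows take effect once profiles are (re)started with -START PROFILE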


Read more…

Introduction

This, the first of two articles on how to manage the Application Compatibility level for DRDA applications, provides an introduction to the subject and considers two of the ways of doing this. In the second article, Gareth Copplestone-Jones concentrates on perhaps the most promising method and discusses its drawbacks.

A very brief history of Application Compatibility

With the release of DB2 11 for z/OS, IBM introduced Application Compatibility, which is intended to make migration from one DB2 release to another less burdensome by separating system migration from application migration, and by allowing you to migrate applications individually once system migration has completed. Application migration is managed using two controls: the APPLCOMPAT BIND option, with a default option provided by the APPLCOMPAT system parameter; and the CURRENT APPLICATION COMPATIBILITY special register.
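For reference, the two controls look like this (a hedged illustration using placeholder collection and package names, not an example from Gareth's articles): APPLCOMPAT is fixed at bind time, while a dynamic SQL application can change the special register at run time.

-- Bind time: pin the package to a given compatibility level
-- (the trailing ... stands for the rest of your usual bind options)
BIND PACKAGE(MYCOLL) MEMBER(MYPROG) APPLCOMPAT(V12R1M500) ...

-- Run time: dynamic SQL honours the special register, whose initial value
-- comes from the package's APPLCOMPAT setting
SET CURRENT APPLICATION COMPATIBILITY = 'V12R1M500';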

The original announcement was that DB2 11 would support the SQL DML syntax and behaviour of both DB2 10 and DB2 11, and that DB2 12 would support that of all three. Then along came DB2 12 with Continuous Delivery and Function Levels.

Application Compatibility was extended in DB2 12 in two ways: to support function levels as well as release levels; and to support SQL DDL and DCL as well as DML. It still supports an Application Compatibility setting of V10R1.

One of the big practical issues with Application Compatibility has always been how to manage dynamic SQL packages, and in particular how to manage the NULLID packages used by DRDA clients connecting via DB2 Connect or the IBM data server clients and drivers. That’s what this article is about. Continue reading


 

Read more…

Node.js Application and DB2 REST services

DB2 for z/OS delivered native REST services support at the end of 2016. I wrote two white papers on how to create a DB2 REST service and how to consume that service from a mobile device. I have since started getting enquiries about how to consume a DB2 REST service from a Node.js application. In the following blog, I share my experience of implementing a Node.js application that invokes a DB2 REST service.

https://www.ibm.com/developerworks/community/blogs/e429a8a2-b27f-48f3-aa73-ca13d5b69759/entry/Node_js_Application_and_DB2_REST_services?lang=en

Read more…

IDUG is pleased to offer these complimentary workshops free of charge, so you can squeeze the most educational value out of your conference.

Sunday, Nov 13th

Certification Preparation Courses:

Pre-Certification Workshop: IBM DB2 11 DBA for z/OS & DB2 10.1 Fundamentals (Exam 610 & Exam 312)  
Pre-Certification Workshop: DB2 10.1 DBA for LUW (Exam 611) and DB2 10.5 DBA for LUW Upgrade (Exam 311)

Thursday, Nov 17th

Read more…

We have decided to extend the Early Bird Registration up to and including 10th October. This means you will be able to take advantage of the low rate for a little bit longer.

Register by October 10th and save an additional €225 using the EARLYEMEA discount code.

http://www.idug.org/p/cm/ld/fid=926

 

Read more…
Unable to attend the IDUG DB2 Tech Conference in Lisbon, Portugal this year? You can still experience featured sessions and Db2 panels taking place at the conference and ask questions live through uStream. Log in to idug.org to join the live stream and engage remotely with IBM strategists and developers, consultants and independent DB2 users!

When: October 4, 2017
Time: Begins 11:00 AM WET (GMT +1)
Where: Live stream from anywhere!
Cost: Complimentary for IDUG members (Not a member? Join today!)
Session: IBM BLU for Spark: an Event Store for the Next Generation of Applications
Presenter: Namik Hrle, IBM Fellow
Abstract: This presentation provides a deep dive into the next generation of IBM data store for handling real-time event applications, from IoT to new event-sourcing applications. The store is built on the open-source Spark platform and object storage; it can ingest millions of transactions per second and provide high-speed analytics on transactional data in real time. It is perfect for event-sourcing applications that need the velocity and volume of data this platform can handle, and for structured data lake applications such as the Internet of Things.

Join now: http://ibm.biz/BdjMLk
Read more…

The IDUG Mentor Program gives IDUG members the opportunity to pass on the valuable skills they have learned over the years to fellow DB2 professionals.

If you wish to motivate a brand new IDUG attendee and apply for a 60% Mentor discount coupon, you must fall into one of the following categories:

   - Loyal IDUG attendees (attended 3 major IDUG conferences in the past)
   - IBM Champions (https://www.ibm.com/developerworks/champion/ )
   - Regional User Groups (find a local User Group at http://www.idug.org/page/user-groups-home )

Visit http://www.idug.org/p/cm/ld/fid=862 to learn more!

Read more…

My colleague Param (param.bng@in.ibm.com) and I (pallavipr@in.ibm.com) are exploring various aspects of Spark integration with DB2 and DB2 Connect drivers. We have decided to write a series of articles capturing our experimentation for the benefit of others as we did not find any article that focuses on different aspects of DB2 access via Spark.

Our first article in the series covered DB2 access via the Spark Scala shell. This second article focuses on accessing DB2 data from standalone Scala and Java programs in Eclipse, using the DB2 JDBC driver and the DataFrames API. Below are the detailed step-by-step instructions. Note that the same instructions apply to DB2 on all platforms (z/OS, LUW and IBM i) as well as Informix.

  1. Confirm that you have Java installed by running java -version from the Windows command line. JDK version 1.7 or 1.8 is recommended.

  2. Install Spark on your local machine by downloading it from https://spark.apache.org/downloads.html.

  3. We chose the pre-built binaries shown in Screenshot 1 (instead of the source-code download) to avoid building Spark during the early experimentation phase.

    Screenshot 1

  4. Unzip the installation file to a local directory (say C:/spark).

  5. Download Scala Eclipse IDE from http://scala-ide.org/download/sdk.html

  6. Unzip scala-SDK-4.1.0-vfinal-2.11-win32.win32.x86_64.zip into a folder (say c:\Eclipse_Scala)

  7. Find eclipse.exe in the Eclipse folder and run it. Make sure you have 64-bit Java installed by running java -version from the command prompt. An incompatibility between the 64-bit Eclipse package and 32-bit Java will give an error and Eclipse will not start.

  8. Choose a workspace for your Scala project as shown in Screenshot 2.

    Screenshot 2

  9. Create a new Scala project using File->New Scala Project.

  10. Add the Spark libraries (from the Spark directory unzipped in Step 4) to the newly created Scala project, as shown in Screenshot 3.

    Screenshot 3

  11. You may see an error about more than one Scala library, as shown in Screenshot 4, since Spark has its own copy of the Scala library.

Screenshot 4



  12. Remove the Scala library reference from the Java build path, as shown in Screenshot 5, to clear the error.

    Screenshot 5

  13. You may see another error: “The version of scala library found in the build path of DB2SparkAccess (2.10.4) is prior to the one provided by scala IDE (2.11.6). Setting a Scala Installation Choice to match”. Right-click Project->Properties->Scala Compiler and change the project setting to 2.10, as shown in Screenshot 6.

    Screenshot 6

  14. After clicking OK, the project is rebuilt and you will only see a warning about different Scala versions, which you can ignore.

  15. Now right-click the DB2SparkAccess project and choose New Scala App, as shown in Screenshot 7. Enter an application name and click Finish.


Screenshot 7

  16. Copy the following source code into the new Scala application you have created (the .scala file) and change the database credentials to your own.

import org.apache.spark.sql.SQLContext
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object DB2SparkScala extends App {

  // Local Spark context: one worker thread, 1 GB of executor memory
  val conf = new SparkConf()
    .setMaster("local[1]")
    .setAppName("GetEmployee")
    .set("spark.executor.memory", "1g")

  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  // Load the DB2 table as a DataFrame through the JDBC data source
  val employeeDF = sqlContext.load("jdbc", Map(
    "url" -> "jdbc:db2://localhost:50000/sample:currentSchema=pallavipr;user=pallavipr;password=XXXX;",
    "driver" -> "com.ibm.db2.jcc.DB2Driver",
    "dbtable" -> "pallavipr.employee"))

  employeeDF.show()
}

  17. Right-click the application and select Run As->Scala Application, as shown in Screenshot 8.

    Screenshot 8

  18. You may see the following exception: Exception in thread "main" java.lang.ClassNotFoundException: com.ibm.db2.jcc.DB2Driver. To get rid of it, select Project->Properties and configure the Java Build Path to include the IBM DB2 JDBC driver (db2jcc.jar or db2jcc4.jar), as shown in Screenshot 9. The JDBC driver can be downloaded from http://www-01.ibm.com/support/docview.wss?uid=swg21385217

    Screenshot 9

  19. Now click your Scala application and select Run As->Scala Application again; you should see the employee data retrieved from the DB2 table, as shown in Screenshot 10.

    Screenshot 10

  20. To perform similar access via a standalone Java program, click Project->New->Other, as shown in Screenshot 11.

    Screenshot 11

  21. Select Java->Class and click Next, which takes you to Screenshot 12.

    Screenshot 12

  22. Enter a name for your Java class and click Finish, as shown in Screenshot 13.

    Screenshot 13

  23. Paste the following code into your newly created class (the .java file), with the database credentials changed to your own.

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class DB2SparkJava {

    public static void main(String[] args) {
        // Local Spark context: one worker thread, 1 GB of executor memory
        SparkConf conf = new SparkConf().setAppName("Simple Application");
        conf.setMaster("local[1]");
        conf.set("spark.executor.memory", "1g");

        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // JDBC connection details for the DB2 SAMPLE database
        Map<String, String> options = new HashMap<String, String>();
        options.put("url",
                "jdbc:db2://localhost:50000/sample:currentSchema=pallavipr;user=pallavipr;password=XXXX;");
        options.put("driver", "com.ibm.db2.jcc.DB2Driver");
        options.put("dbtable", "pallavipr.employee");

        // Load the DB2 table as a DataFrame through the JDBC data source
        DataFrame jdbcDF = sqlContext.load("jdbc", options);
        jdbcDF.show();
    }
}

  24. Right-click your newly created Java application and select Run As->Java Application. You should see results similar to those from the Scala application run earlier.

Read more…

Hi,

Greetings !!

The Kolkata India DB2 User Group (KIDUG) is a Regional User Group (RUG): an organized group of individuals at the local level who share an interest in IBM’s DB2 family of products or similar Information Management topics. The group started taking shape in 2013 and has been gaining momentum ever since, with professionals from different organizations showing interest. Details about the group can be found at

www.idug.org/rug/kidug.

An event was organized last year in Kolkata which was quite a success: it was attended by around 120 people, and a number of topics covering both the DB2 for z/OS and DB2 for LUW tracks were presented. The event was attended by professionals from various renowned organizations such as IBM and Cognizant, as well as by students from technical institutes such as Techno India, and an entry fee was collected from each of the delegates to cover the expenses of the event. This being the first such initiative in Kolkata, it generated a lot of interest and positive feedback. Given that success, a similar event is planned for June 14th, 2014.

The idea is to spread the message to a wider base of professionals from IT organizations in and around Kolkata, such as TCS, Accenture, Wipro, Capgemini and HCL, so that better networking opportunities and idea exchanges can happen. The message could also be spread to non-IT companies that either use DB2 or are potential users of it. We would also like to bring in distinguished speakers from different organizations to share their DB2 experiences.

The professionals expected to benefit from attending these sessions include:

1) DB2 programmers
2) DB2 DBAs
3) Data Architects
4) People working on migration/replatforming projects
5) IT Project Managers


Thanks and Regards,

Kolkata India DB2 User Group

Read more…