Hello All,
I have an ETL stored procedure that performs the following steps and takes the following input parameters:

Input Parameters:

IN SOURCE_SCHEMA VARCHAR(128)
, IN SOURCE_TABLE VARCHAR(128)
, IN TARGET_SCHEMA VARCHAR(128)
, IN TARGET_TABLE VARCHAR(128)
, IN EXCEPTION_SCHEMA VARCHAR(128)
, IN EXCEPTION_TABLE VARCHAR(128)
Steps:

1. Checks for delta rows between the source table and the target table (where a certain flag column = 'N' in the source table).
2. Deletes any existing rows from the exception table (so that only the latest exceptions are stored).
3. Updates the audit tables.
4. Based on certain conditions, sets this delta flag column to 'P', 'U', or 'S' in the source table (where the delta flag column = 'N').
5. Loads the rows where the delta flag column = 'P' into the target table.
6. Writes all of these steps (along with any exceptions) to a log file and then emails that log file to the respective recipients.
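For illustration only, step 4 might reduce to a single searched UPDATE. The actual conditions behind 'P'/'U'/'S' are not shown in this post, so the predicates below are pure placeholders:

```sql
-- Hypothetical sketch of step 4; <existing-key> and <valid-row> stand in for
-- the real business conditions, which are not given in the question.
UPDATE SOURCE_SCHEMA.SOURCE_TABLE
   SET PROC_RAW_FG = CASE
                       WHEN <existing-key> THEN 'U'   -- row already exists in target
                       WHEN <valid-row>    THEN 'P'   -- clean row, ready to load
                       ELSE 'S'                       -- suspend for review
                     END
 WHERE PROC_RAW_FG = 'N';
```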

There are a couple of other steps between steps 5 and 6, but they are not related to the step I need help with. Currently, the load statement is built dynamically:

SET STATEMENT = '';
SET STATEMENT = 'LOAD FROM (SELECT ' || SELECT_COLS || ', CURRENT_TIMESTAMP AS REC_CRTN_TS, ''' || ETL_JOB_ID || ''' AS JOB_ID' ||
                ' FROM ' || SOURCE_SCHEMA || '.' || SOURCE_TABLE ||
                ' WHERE PROC_RAW_FG = ''P'' ' || ORDER_BY_CLAUSE ||
                ' FOR FETCH ONLY) OF CURSOR MESSAGES ON SERVER' ||
                ' INSERT INTO ' || TARGET_SCHEMA || '.' || TARGET_TABLE ||
                ' FOR EXCEPTION ' || EXCEPTION_SCHEMA || '.' || EXCEPTION_TABLE;
CALL DBMS_OUTPUT.PUT_LINE(STATEMENT); -- for logging purposes into a file
CALL UTL_FILE.PUT_LINE(OUTFILE,UTL_TCP.CRLF||'LOAD STATEMENT :'||UTL_TCP.CRLF||STATEMENT); -- for logging purposes into a file
SELECT_COLS is a variable that holds the comma-separated source fields, with logic to trim CHAR/VARCHAR fields in the select clause before they are applied to the target table. It looks something like this (the full column list is huge, so only a few columns are shown): TRIM(TSO_SPLIT_NO) AS TSO_SPLIT_NO,QTY_SPLIT,TRIM(HIGH_PRTY_INDC) AS HIGH_PRTY_INDC,TRIM(QTE_MATCH_CNTL_CD) AS QTE_MATCH_CNTL_CD

Also, this column list depends on which source and target tables are passed as inputs; the following logic extracts the column list:

SET STATEMENT = '';
SET STATEMENT = 'SET (?,?,?,?,?) = (SELECT SELECT_COLUMNS, ORDER_BY_CLAUSE, PRIMARY_KEY_COLUMNS, SUSP_REC_WHERE_CLAUSE, SOURCE_COLS' ||
                ' FROM schemaName.tableName' ||
                ' WHERE SOURCE_SCHEMA = ''' || SOURCE_SCHEMA || '''' ||
                ' AND SOURCE_TABLE = ''' || SOURCE_TABLE || '''' ||
                ' AND TARGET_SCHEMA = ''' || TARGET_SCHEMA || '''' ||
                ' AND TARGET_TABLE = ''' || TARGET_TABLE || ''')';
PREPARE C_STATEMENT FROM STATEMENT;
EXECUTE C_STATEMENT INTO SELECT_COLS, ORDER_BY_CLAUSE, PRIMARY_COLS, SUSP_REC_WHERE_CLAUSE, SOURCE_COLS;
ETL_JOB_ID is a variable that holds a dynamically generated value, based on the following:

SET MAIN_STEP_CD = THIS_PROC||'.'||TARGET_TABLE;
SET ETL_JOB_ID = MAIN_STEP_CD ||'_'|| TO_CHAR(BEGIN_TS,'YYYYMMDDHH24MISS');

Now, since we are using the DB2 LOAD utility, this step often causes blocking: LOAD locks the entire table, not just the rows it is working with. Because of that, I was instructed to use a cursor to insert records into the target table, without hard-coding anything, of course, as that would make this stored procedure unusable.

I do have something in mind, but I am having difficulty constructing the actual cursor. More specifically, how do I fetch dynamically determined column values and then dynamically insert them into the target table? Also, I couldn't think of a way to catch exceptions (like the FOR EXCEPTION clause in the LOAD statement). I was thinking it is going to look something like this:

DECLARE EOFR INT DEFAULT 0;
DECLARE CR_SELECT_COLS ANCHOR DATA TYPE TO schemaName.tableName.SELECT_COLUMNS;
DECLARE CR_REC_CRTN_TS TIMESTAMP;
DECLARE CR_ETL_JOB_ID ANCHOR DATA TYPE TO schemaName.tableName.ETL_JOB_ID;
DECLARE STMT VARCHAR(32672);  -- 200 would truncate a large column list
DECLARE STMT2 VARCHAR(32672);
DECLARE C1 CURSOR FOR S1;     -- dynamic cursor: declared over statement name S1, prepared below
DECLARE CONTINUE HANDLER FOR NOT FOUND
  SET EOFR = 1;

SET STMT = 'SELECT ' || SELECT_COLS || ', CURRENT_TIMESTAMP AS REC_CRTN_TS, ''' || ETL_JOB_ID || ''' AS JOB_ID' ||
           ' FROM ' || SOURCE_SCHEMA || '.' || SOURCE_TABLE ||
           ' WHERE PROC_RAW_FG = ''P'' ' || ORDER_BY_CLAUSE ||
           ' FOR FETCH ONLY';
PREPARE S1 FROM STMT;

SET STMT2 = 'INSERT INTO ' || TARGET_SCHEMA || '.' || TARGET_TABLE || ' VALUES (?,?,?)';
PREPARE S2 FROM STMT2;

OPEN C1;
FETCH C1 INTO CR_SELECT_COLS, CR_REC_CRTN_TS, CR_ETL_JOB_ID;
WHILE EOFR = 0 DO
  EXECUTE S2 USING CR_SELECT_COLS, CR_REC_CRTN_TS, CR_ETL_JOB_ID;
  FETCH C1 INTO CR_SELECT_COLS, CR_REC_CRTN_TS, CR_ETL_JOB_ID;
END WHILE;
CLOSE C1;

But I think this is not going to work, as CR_SELECT_COLS will not be able to hold multiple column values. That said, based on my DB2 knowledge (which is limited at the moment, since the DB2 LUW platform is not my expertise yet), this is the only way I could think of to implement a cursor. If you think there is a better way, please feel free to share. Thanks in advance!
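On the FOR EXCEPTION part of the question: one way to approximate that behaviour in a row-at-a-time loop is a CONTINUE handler for SQLEXCEPTION that redirects the failing row into the exception table. Everything below (statement name S3, the variable names, the exception-table layout) is an assumption, not from the original procedure:

```sql
-- Sketch only: assumes the exception table mirrors the target's columns.
DECLARE ERR INT DEFAULT 0;
DECLARE CONTINUE HANDLER FOR SQLEXCEPTION SET ERR = 1;

-- S3 prepared earlier from:
-- 'INSERT INTO ' || EXCEPTION_SCHEMA || '.' || EXCEPTION_TABLE || ' VALUES (?,?,?)'

-- inside the fetch loop:
SET ERR = 0;
EXECUTE S2 USING CR_SELECT_COLS, CR_REC_CRTN_TS, CR_ETL_JOB_ID;
IF ERR = 1 THEN
  EXECUTE S3 USING CR_SELECT_COLS, CR_REC_CRTN_TS, CR_ETL_JOB_ID;  -- capture the bad row
END IF;
```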

Replies to This Discussion

Gaurav,

Does LUW have a concurrent LOAD that actually does an INSERT under the covers? That would be an easy solution if this were on z/OS.

Can you explain what types of exceptions are expected? Will an entire run fail, or will individual rows have problems? What is the current technique that determines the delta between source and target? That could be harder to do than the load of the selected rows.

What is the maximum number of rows to be inserted?

It is going to be easier to construct an INSERT FROM SELECT to insert multiple rows at a time: either all of them, a batch of n rows using FETCH FIRST n ROWS ONLY in a loop, or even the case n = 1. INSERT VALUES after a FETCH from a cursor is probably too hard to code.
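A minimal sketch of that INSERT FROM SELECT idea, reusing the variables the question already builds (SELECT_COLS, ETL_JOB_ID, and so on); the wiring and the V_ROWS variable are assumptions, not tested code:

```sql
-- One dynamic, set-based insert in place of the LOAD: it takes ordinary row
-- locks, so readers of the target table are not blocked the way LOAD blocks them.
SET STMT = 'INSERT INTO ' || TARGET_SCHEMA || '.' || TARGET_TABLE ||
           ' SELECT ' || SELECT_COLS ||
           ', CURRENT_TIMESTAMP, ''' || ETL_JOB_ID || '''' ||
           ' FROM ' || SOURCE_SCHEMA || '.' || SOURCE_TABLE ||
           ' WHERE PROC_RAW_FG = ''P''';
EXECUTE IMMEDIATE STMT;
GET DIAGNOSTICS V_ROWS = ROW_COUNT;  -- V_ROWS: assumed INTEGER variable, for the log file
```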

Is there something to be updated in the source table to indicate that the delta 'P' rows have already been copied? If not, it is tricky to SELECT only the rows that have not already been copied. Perhaps INSERT from SELECT cols FROM source EXCEPT SELECT cols FROM target FETCH FIRST n ROWS ONLY.
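That EXCEPT variant might look like the following (SOURCE_COLS is the variable the question already fetches from the metadata table; the batch size of 1000 is illustrative):

```sql
SET STMT = 'INSERT INTO ' || TARGET_SCHEMA || '.' || TARGET_TABLE ||
           ' SELECT * FROM (' ||
           '   SELECT ' || SOURCE_COLS || ' FROM ' || SOURCE_SCHEMA || '.' || SOURCE_TABLE ||
           '   EXCEPT' ||
           '   SELECT ' || SOURCE_COLS || ' FROM ' || TARGET_SCHEMA || '.' || TARGET_TABLE ||
           ' ) AS DELTA FETCH FIRST 1000 ROWS ONLY';
EXECUTE IMMEDIATE STMT;  -- repeat in a loop until ROW_COUNT comes back 0
```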

Your question probably has too many complex parts to it. Simplifying the question might help.
