Data Migrator: a Python Tkinter GUI to transfer data from Mainframe to Postgres

This article is about Data Migrator, a Python Tkinter GUI application that I've developed to transfer data from a mainframe DB2 table to a PostgreSQL table.

Below are a few of the important Python modules used to develop Data Migrator.

  • Tkinter - For the GUI display, laid out with the grid geometry manager.
  • Pandas - To hold the fetched data in a DataFrame.
  • NumPy - To perform mathematical operations on the Pandas DataFrame.
  • ibm_db_dbi - To work with the z/OS DB2 database.
  • psycopg2 - To work with the PostgreSQL database (see the connection sketch after this list).
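
To show how these modules fit together, here is a minimal end-to-end sketch. The hostnames, ports and credentials are placeholders for illustration, the table names are taken from the sample log below, and this is not the actual Data Migrator code.

import ibm_db_dbi   # z/OS DB2 (source)
import psycopg2     # PostgreSQL (target)
import numpy as np
import pandas as pd

# Connect to z/OS DB2 (placeholder DSN values).
db2_conn = ibm_db_dbi.connect(
    "DATABASE=DSN1;HOSTNAME=zos.example.com;PORT=5021;PROTOCOL=TCPIP;"
    "UID=VENKAT;PWD=secret", "", "")

# Fetch the source rows into a pandas DataFrame.
df = pd.read_sql("SELECT * FROM VENKAT.DB2_TABLE", db2_conn)
db2_conn.close()

# Convert NumPy scalar types to native Python types so psycopg2 can adapt them.
rows = [tuple(v.item() if isinstance(v, np.generic) else v for v in rec)
        for rec in df.itertuples(index=False, name=None)]

# Connect to PostgreSQL (placeholder values), then truncate and reload the target.
pg_conn = psycopg2.connect(host="pg.example.com", port=5432,
                           dbname="mydb", user="venkat", password="secret")
with pg_conn, pg_conn.cursor() as cur:
    cur.execute("TRUNCATE TABLE VENKAT.POSTGRES_TABLE")
    placeholders = ",".join(["%s"] * len(df.columns))
    cur.executemany("INSERT INTO VENKAT.POSTGRES_TABLE VALUES (" + placeholders + ")",
                    rows)
pg_conn.close()

The "with pg_conn" block commits the transaction when the inserts complete successfully and rolls it back on error.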

Features:

  1. The Source section of the GUI lets the user choose a z/OS DB2 table by selecting the LPAR and SSID and entering the table name and mainframe credentials.
  2. The Target section of the GUI lets the user choose a PostgreSQL table by entering the host, port, database, table name and Postgres credentials.
  3. When an LPAR is selected, the corresponding subsystem IDs are automatically shown as radio buttons to choose from (see the Tkinter sketch after this list).
  4. User choices, selections and inputs are validated automatically; once everything is valid, the "Proceed" button is enabled so the user can click it and proceed with the data transfer.
  5. The execution log indicates each phase of the work done in DB2: connection setup, version prompt, fetch state, total rows fetched and disconnection.
  6. The execution log indicates each phase of the work done in PostgreSQL: connection setup, version prompt, truncate state, insert state, total rows inserted and disconnection.
  7. The execution log shows the elapsed time in HH.MM.SS format, so the user has a clear view of how long each execution phase takes.
  8. The execution log shows the data transfer speed (RPS), i.e. the rate at which rows are fetched from DB2 or inserted into Postgres per second.
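
To illustrate features 3 and 4, here is a minimal Tkinter sketch of the LPAR/SSID interaction: selecting an LPAR rebuilds the SSID radio buttons, and the Proceed button is enabled only once both are chosen. The LPAR names and SSIDs are made-up placeholders and the validation rule is simplified; this is not the actual Data Migrator code.

import tkinter as tk

# Hypothetical LPAR -> subsystem ID mapping; real values are site specific.
LPAR_SSIDS = {"LPARA": ["DBA1", "DBA2"], "LPARB": ["DBB1", "DBB2"]}

root = tk.Tk()
root.title("Data Migrator - source section sketch")

lpar_var = tk.StringVar()
ssid_var = tk.StringVar()

ssid_frame = tk.Frame(root)
ssid_frame.grid(row=1, column=0, sticky="w")

proceed_btn = tk.Button(root, text="Proceed", state=tk.DISABLED)
proceed_btn.grid(row=2, column=0, pady=5, sticky="w")

def validate():
    # Enable "Proceed" only once both an LPAR and an SSID are chosen.
    ok = bool(lpar_var.get()) and bool(ssid_var.get())
    proceed_btn.config(state=tk.NORMAL if ok else tk.DISABLED)

def on_lpar_selected(lpar):
    # Rebuild the SSID radio buttons for the chosen LPAR.
    for widget in ssid_frame.winfo_children():
        widget.destroy()
    ssid_var.set("")
    for col, ssid in enumerate(LPAR_SSIDS[lpar]):
        tk.Radiobutton(ssid_frame, text=ssid, value=ssid,
                       variable=ssid_var, command=validate).grid(row=0, column=col)
    validate()

tk.OptionMenu(root, lpar_var, *LPAR_SSIDS,
              command=on_lpar_selected).grid(row=0, column=0, sticky="w")

root.mainloop()

Presumably the real validation also checks the table name and credential fields before enabling Proceed.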

Below is the GUI display of Data Migrator.py

[Screenshot: Data Migrator GUI]

Sample execution log:

"C:\Program Files (x86)\Python37-32\python.exe" "C:/Users/venkat/Desktop/Data Migrator.py

00.00.20 Connecting to z/OS DB2 database...
00.00.03 The connection was successful.
00.00.00 z/OS DB2 database version : V12R1M500
00.00.01 SQL executed , Fetching rows from table VENKAT.DB2_TABLE ...
00.00.14 Total no: of rows received : 20000 (Speed~1428rps)
00.00.01 Connection closed.

00.00.00 Connecting to the PostgreSQL database...
00.00.02 The connection was successful.
00.00.00 PostgreSQL database version : PostgreSQL 12.6 8.3.1 20191121 (Red Hat 8.3.1-5), 64-bit
00.00.00 Truncating table VENKAT.POSTGRES_TABLE ...
00.00.00 Truncate completed.
00.00.00 Inserting rows into table VENKAT.POSTGRES_TABLE ...
00.00.11 Transaction completed.
00.00.00 Total no: of inserts VENKAT.POSTGRES_TABLE : 20000 (Speed~1818rps)
00.00.01 Total no: of rows in VENKAT.POSTGRES_TABLE : 20000
00.00.00 Connection closed.


Process finished with exit code 0        
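
For reference, the HH.MM.SS prefix and the RPS figure shown in the log can be computed along these lines (a rough sketch, not the exact code from Data Migrator):

import time

start = time.time()
# ... fetch rows from DB2 or insert them into PostgreSQL here ...
rows = 20000                           # row count from the sample run above
elapsed = time.time() - start

# Elapsed time in HH.MM.SS, as shown at the start of each log line
# (valid for phases shorter than 24 hours).
stamp = time.strftime("%H.%M.%S", time.gmtime(elapsed))

# Rows-per-second rate, e.g. "Speed~1428rps".
rps = int(rows / elapsed) if elapsed > 0 else rows
print(f"{stamp} Total no: of rows received : {rows} (Speed~{rps}rps)")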

~ the end ~

Thanks,
Kajuluri Venkata Ashok,
z/OS Mainframe DB2 DBA.        
Peter Bishop

Solutions Architect at ISI Pty Ltd

3 years ago

Neat. Can I copy a whole Db2 to PostGres in one operation without knowing any tablenames? That would be very neat. Unfortunately I have no Db2 nor PostGres to try that with, maybe you can and post the results? Also, is the source available? Can't see a link but maybe I missed it... it would be interesting to try when (if) I have a Db2/Postgres setup to play with. Thanks for sharing.

Vinay Setlur

Lead DBA - MongoDB and Ansible/Chef/AzDO automations at MetLife

3 years ago

Excellent Venkat! Another flavor could be unloading the RDBMS source in CSV format and then importing it into a NoSQL platform like MongoDB. That way you’d create a “Swiss army knife” that can be used for NoSQL or SQL platforms.

Dasari Chandranna

Senior Salesforce Engineer | Expert in LWC, Apex & CRM Integrations | Passionate about delivering impactful, scalable solutions |

3 years ago

Great work
