Sunday 4 June 2017

AWS - Export and Import Using the Data Pump Utility


Introduction:
AWS Import/Export is a service that accelerates transferring large amounts of data into and out of AWS using physical storage appliances, bypassing the Internet. It transfers data directly onto and off of storage devices you own using Amazon's high-speed internal network. In this article, however, we will move Oracle data between on-premises and AWS databases using the Data Pump utility (expdp/impdp).

Refer to http://docs.aws.amazon.com/AWSImportExport/latest/DG/whatisdisk.html

Scenario 1: Take an export dump on the AWS database server for some of the schemas.

Use the following parfile template to take the export on the AWS database server:

directory=<directory name>
dumpfile=<dumpfile name>
logfile=<logfile name>
schemas=<schema name>
compression=all


For this example we are using the AWSNSMP database to take the schema export and keep the dumpfile on the database server.

We need to use the DATA_PUMP_DIR directory to hold the export dumpfile and logfiles.
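DATA_PUMP_DIR is pre-created on every RDS Oracle instance. If you ever need an additional directory, RDS exposes the rdsadmin.rdsadmin_util.create_directory procedure; a minimal sketch (the directory name MY_DUMP_DIR is just an illustration):

SQL> exec rdsadmin.rdsadmin_util.create_directory(p_directory_name => 'MY_DUMP_DIR');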

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
--------- --------------------
AWSNSMP   READ WRITE

SQL> select * from dba_directories;

OWNER      DIRECTORY_NAME           DIRECTORY_PATH                 ORIGIN_CON_ID
---------- ------------------------ ------------------------------ -------------
SYS        BDUMP                    /rdsdbdata/log/trace                       0
SYS        ADUMP                    /rdsdbdata/log/audit                       0
SYS        OPATCH_LOG_DIR           /rdsdbbin/oracle/Opatch                   0
SYS        OPATCH_SCRIPT_DIR        /rdsdbbin/oracle/Opatch                   0
SYS        DATA_PUMP_DIR            /rdsdbdata/datapump                        0
SYS        OPATCH_INST_DIR          /rdsdbbin/oracle/OPatch                    0

6 rows selected.


1. Create a parfile containing all the parameters.

vi AWSNSMP_WNL.par

directory=DATA_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,STG_MTS_V,
STG_MTS,
STG_ENTRP_ADM2,
STG_ENTRP_ADM2_V,
STG_ENTRP_ADM2_RO,
STG_ENTRP_ADM2_APP,
STG_ENTRP_ADM1_APP,
STG_ENTRP_ADM1,
STG_ENTRP_ADM1_V,
STG_ENTRP_ADM1_RO,
STG_ENTRP_V,
STG_ENTRP_RO,
STG_ENTRP,
STG_ENTRP_APP
compression=all
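Optionally, before taking the actual export you can gauge the dump size with an estimate-only run; a sketch (ESTIMATE_ONLY does not allow a DUMPFILE parameter, and the schema list is abbreviated here):

$ expdp system@AWSNSMP directory=DATA_PUMP_DIR schemas=STG_MTS logfile=estimate_AWSNSMP.log estimate_only=yes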
 


2. Start the export as follows.

$ expdp parfile=AWSNSMP_WNL.par

Export: Release 12.1.0.2.0 - Production on Wed Apr 26 02:28:27 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system@AWSNSMP
Password:
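
While the job runs, you can also check its status from the database side using the standard DBA_DATAPUMP_JOBS view; a quick sketch:

SQL> select owner_name, job_name, operation, job_mode, state from dba_datapump_jobs;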
 


3. Monitor the export log from the database.

Find out the file name by querying the RDSADMIN.RDS_FILE_UTIL.LISTDIR function from the database.

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                 TYPE         FILESIZE MTIME
---------------------------------------- ---------- ---------- ---------
bkp_AWSNSMP.dpdmp                         file         3051520  18-APR-17
bkp_AWSNSMP.log                           file         12118    18-APR-17
datapump/                                directory     4096     18-APR-17

Use the following function to view the export logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/bkp_AWSNSMP.log'));

TEXT
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
;;;
Export: Release 12.1.0.2.0 - Production on Tue Apr 18 10:16:29 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  SYSTEM/********@AWSNSMP directory=DATA_PUMP_DIR dumpfile=bkp_AWSNSMP.dpdmp logfile=bkp_AWSNSMP.log schemas=STG_MTS_APP,STG_MTS_V,STG_
MTS,STG_ENTRP_ADM2,STG_ENTRP_ADM2_V,STG_ENTRP_ADM2_RO,STG_ENTRP_ADM2_APP,STG_ENTRP_ADM1_APP,STG_ENTRP_ADM1,STG_ENTRP_ADM1_V,STG_ENTRP_ADM1_RO,STG_ENTRP_V,STG_ENTRP_RO,STG_ENTRP,STG_ENTRP_APP compression=all encryption_password=********

Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 28.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT


. . exported "STG_ENTRP_ADM2"."RECORD_SEQ_POSITION"           0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."REVINFO"                       0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SCHEDULED_FORM"                0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE"                     0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE_AUD"                 0 KB       0 rows
. . exported "STG_MTS"."MTS_DEPLOYMENT"                      0 KB       0 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
  /rdsdbdata/datapump/bkp_AWSNSMP.dpdmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at Tue Apr 18 14:16:51 2017 elapsed 0 00:00:16
 


Scenario 2: Perform the export on a local server from the AWS database.

Connect to any local database server.

1.    Create a directory to hold the dumpfile and logfile.

SQL> select * from dba_directories where directory_name like '%AWS%';

OWNER      DIRECTORY_NAME       DIRECTORY_PATH                         ORIGIN_CON_ID
---------- -------------------- -------------------------------------- -------------
SYS        AWS_PUMP_DIR         /mounts/nsgpr01_exports/exp/AWS                     0
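
If the directory does not already exist on the local server, create it first; the path below is the one from our environment:

SQL> CREATE OR REPLACE DIRECTORY AWS_PUMP_DIR AS '/mounts/nsgpr01_exports/exp/AWS';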


2.    Create a database link.


SQL> CREATE DATABASE LINK AWSNSMP_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY <password> USING 'AWSNSMP';
Database link created.
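
Before starting the export, it is worth confirming that the link actually reaches the AWS database; a quick sanity check:

SQL> select name, open_mode from v$database@AWSNSMP_SYSTEM_LINK;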


3.    Create a parfile, including the network link parameter and the encryption option.

vi AWSNSMP_NL.par

directory=AWS_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,
STG_MTS_V,
STG_MTS,
STG_ENTRP_ADM2,
STG_ENTRP_ADM2_V,
STG_ENTRP_ADM2_RO,
STG_ENTRP_ADM2_APP,
STG_ENTRP_ADM1_APP,
STG_ENTRP_ADM1,
STG_ENTRP_ADM1_V,
STG_ENTRP_ADM1_RO,
STG_ENTRP_V,
STG_ENTRP_RO,
STG_ENTRP,
STG_ENTRP_APP
compression=all
NETWORK_LINK=AWSNSMP_SYSTEM_LINK
encryption_password=*******


4.    Initiate the export.

$ expdp parfile=AWSNSMP_NL.par

Export: Release 12.1.0.2.0 - Production on Wed Apr 26 03:10:15 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system
Password:


5.    Monitor the export log.

$ tail -f bkp_AWSNSMP.log


;;;
Export: Release 12.1.0.2.0 - Production on Tue Apr 18 10:12:00 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01":  /******** AS SYSDBA parfile=AWSNSMP_NL.par
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 17.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

. . exported "STG_ENTRP_ADM2"."REVINFO"                       0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SCHEDULED_FORM"                0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE"                     0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE_AUD"                 0 KB       0 rows
. . exported "STG_MTS"."MTS_DEPLOYMENT"                      0 KB       0 rows
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /mounts/nsgpr01_exports/exp/AWS/bkp_AWSNSMP.dpdmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at Tue Apr 18 10:14:21 2017 elapsed 0 00:02:18


Scenario 3: Perform a data load into the AWS database using import.

1.    Create a database link on the local server to communicate with the AWS database.

SQL> CREATE DATABASE LINK AWSNSTN_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY ***** USING 'AWSNSTN';
Database link created.

 
2.    Transfer the dumpfile to the AWS database server.

Refer to the link below for transferring files to AWS:
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html

Use the DBMS_FILE_TRANSFER.PUT_FILE procedure as follows to transfer the file to the AWS database server.
 

set timing on
set time on
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE (
    source_directory_object      => 'AWS_PUMP_DIR',
    source_file_name             => 'expdp_NSMPAC_schema.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'expdp_NSMPAC_schema.dmp',
    destination_database         => 'AWSNSTN_SYSTEM_LINK');
END;
/


source_directory_object      --> local database directory where your dumpfile is located
source_file_name             --> dumpfile name on the local server
destination_directory_object --> AWS database directory where the dumpfile should be placed
destination_file_name        --> name to give the dumpfile on the AWS database server
destination_database         --> database link to the AWS database
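
The transfer also works in the opposite direction: DBMS_FILE_TRANSFER.GET_FILE pulls a file from the remote database to the local server with mirror-image arguments; a sketch reusing the names above:

BEGIN
  DBMS_FILE_TRANSFER.GET_FILE (
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'expdp_NSMPAC_schema.dmp',
    source_database              => 'AWSNSTN_SYSTEM_LINK',
    destination_directory_object => 'AWS_PUMP_DIR',
    destination_file_name        => 'expdp_NSMPAC_schema.dmp');
END;
/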

3.    Monitor the file transfer from the database.

Issue the following query to list the files:

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

$ sqlplus

SQL*Plus: Release 12.1.0.2.0 Production on Wed May 3 01:02:25 2017

Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Enter user-name: SYSTEM@AWSNSTN
Enter password:


Last Successful login time: Tue May 02 2017 01:13:16 -04:00

Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
--------- --------------------
AWSNSTN   READ WRITE

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                           TYPE         FILESIZE MTIME
-------------------------------------------------- ---------- ---------- ------------------
expdp_NSMPAC_schema.dmp                             file       2.0437E+10  18-APR-17
impdp_NSMPAC_schema.log                             file       164046      18-APR-17
datapump/                                          directory     4096      19-APR-17


4.    Create a parfile for the import with the PARALLEL parameter to speed up the import process. TRANSFORM=oid:n assigns new OIDs to imported object types instead of reusing the exported ones, avoiding OID clashes in the target database.

vi impdp_AWSNSTN_schemas.ctl

directory=DATA_PUMP_DIR
dumpfile=expdp_NSMPAC_schema.dmp
logfile=impdp_NSMPAC_schema.log
schemas=BRMGR01
transform=oid:n
parallel=16

  
5.    Start the import in the background with nohup.

nohup impdp SYSTEM/*****@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl &
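
If you later want to check on the job interactively, you can reattach to it with Data Pump's ATTACH option (job name as reported in the import logfile); at the Import> prompt, STATUS shows progress and CONTINUE_CLIENT resumes logging:

$ impdp SYSTEM/*****@AWSNSTN attach=SYS_IMPORT_SCHEMA_01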

6.    Monitor the import log from the database.

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                           TYPE         FILESIZE MTIME
-------------------------------------------------- ---------- ---------- ------------------
expdp_NSMPAC_schema.dmp                             file       2.0437E+10   18-APR-17
impdp_NSMPAC_schema.log                             file       164046       18-APR-17
datapump/                                          directory     4096       19-APR-17



From the above results you now have the file names.
Use rdsadmin.rds_file_util.read_text_file to read the import logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/impdp_NSMPAC_schema.log'));

TEXT
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
;;;
Import: Release 12.1.0.2.0 - Production on Tue Apr 18 13:07:42 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01":  SYSTEM/********@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl
Processing object type SCHEMA_EXPORT/USER

Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/AUDIT_OBJ
Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" completed at Tue Apr 18 19:46:36 2017 elapsed 0 02:38:50

2462 rows selected.
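
After a large import it is good practice to validate the target schema, since imported PL/SQL and views often arrive invalid; a minimal sketch using the standard UTL_RECOMP package (BRMGR01 is the schema we imported):

SQL> select count(*) from dba_objects where owner = 'BRMGR01' and status = 'INVALID';
SQL> exec utl_recomp.recomp_serial('BRMGR01');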


Conclusion:
In this scenario-based article we learned how to perform exports from AWS databases and imports into AWS databases, moving data from on-premises to AWS and vice versa.

BY
Name: Omer
Designation: Senior Database Engineer
Organization: Netsoftmate IT Solutions.
Email: info@netsoftmate.com
