
Friday, May 29, 2020

Importing Data into Oracle on Amazon RDS using Oracle Data Pump and a Database Link

Import a schema into AWS Oracle RDS
==============================================


Environment Details:
-------------------------------

Schema Name  : TEST

Source (Normal)
-----------------

Database Name : india
Database Version : 19c
Platform : Oracle Linux


Target (RDS)
--------------

Database Name : orcl
Database Version : 19c
Platform : AWS Oracle RDS




Step 1: Perform the schema export on the source database:

expdp '/ as sysdba' directory=DATA_PUMP_DIR dumpfile=test_schema.dmp LOGFILE=test_schema.log schemas=test


Example:


[oracle@anurag ~]$ expdp  directory=DATA_PUMP_DIR dumpfile=test_schema.dmp LOGFILE=test_schema.log schemas=test

Export: Release 19.0.0.0.0 - Production on Thu May 28 17:21:14 2020
Version 19.3.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba
Password:

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Starting "SYS"."SYS_EXPORT_SCHEMA_01":  /******** AS SYSDBA directory=DATA_PUMP_DIR dumpfile=test_schema.dmp LOGFILE=test_schema.log schemas=test
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "TEST"."TEST"                               5.539 KB       2 rows
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /u01/oracle/admin/india/dpdump/test_schema.dmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at Thu May 28 17:21:53 2020 elapsed 0 00:00:29

[oracle@anurag ~]$


We can also perform the export using the DBMS_DATAPUMP API.

Log in as SYSDBA and run:


DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'sample.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file); 
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''TEST'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/


Example:

SQL> DECLARE
  2  hdnl NUMBER;
  3  BEGIN
  4  hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);
  5  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'sample.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  6  DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'exp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  7  DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''TEST'')');
  8  DBMS_DATAPUMP.START_JOB(hdnl);
  9  END;
 10  /

PL/SQL procedure successfully completed.

SQL>



Step 2: Add the RDS database TNS entry to the tnsnames.ora file, and validate it with tnsping.


ORCL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = aws_rds_db_orcl_endpoint_name)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = orcl)
        (UR = A)
    )
  )
  


Step 3: Create a database link using the AWS RDS database endpoint (public or private, as your requirement dictates):


create public database link  to_rds19c  connect to "username" identified by "password"
USING '(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = aws_rds_db_orcl_endpoint_name)(PORT = 1521)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = orcl)))';

SQL> create public database link to_rds19c connect to "username" identified by "password"
  2  USING '(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = aws_rds_db_orcl_endpoint_name)(PORT = 1521)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = orcl)))';

Database link created.

Validate that the newly created database link works:

SQL>
SQL>
SQL> select * from dual@to_rds19c;

D
-
X

SQL>
SQL>

Step 4: Transfer the dump file to the AWS RDS database (orcl) over the database link, using the API below:

BEGIN
DBMS_FILE_TRANSFER.PUT_FILE(
source_directory_object       => 'DATA_PUMP_DIR',
source_file_name              => 'sample.dmp',
destination_directory_object  => 'DATA_PUMP_DIR',
destination_file_name         => 'sample_copied.dmp', 
destination_database          => 'to_rds19c' 
);
END;
/




SQL>
SQL> BEGIN
  2  DBMS_FILE_TRANSFER.PUT_FILE(
  3  source_directory_object       => 'DATA_PUMP_DIR',
  4  source_file_name              => 'sample.dmp',
  5  destination_directory_object  => 'DATA_PUMP_DIR',
  6  destination_file_name         => 'sample_copied.dmp',
  7  destination_database          => 'to_rds19c'
  8  );
  9  END;
 10  /

PL/SQL procedure successfully completed.

SQL>



Step 5: On the AWS RDS database, validate the transferred dump file using the command below:


select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;


Step 6: Create the schema in the target AWS RDS database (orcl) with the required privileges, then perform the import using the DBMS_DATAPUMP API:


DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'sample_copied.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'sample_imp.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''TEST'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/



Step 7: Monitor the import log using the command below:

SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('DATA_PUMP_DIR','sample_imp.log'));





* Note: the listener port (1521) must be opened for access in the AWS security group attached to the RDS instance.


https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/rds-ug.pdf

Tuesday, March 17, 2020

Transfer a dump file from on-premises to an AWS RDS instance using DBMS_FILE_TRANSFER


Transfer the dump file from on-premises to an AWS RDS instance:
===============================================================


Prerequisites:
============


  • Create a public database link in the source database pointing to the AWS RDS instance
  • Verify that the database link works


Command:
============


BEGIN
DBMS_FILE_TRANSFER.PUT_FILE(
source_directory_object =>'DATA_PUMP_DIR',
source_file_name =>'test1_full_db.dmp',
destination_directory_object =>'DATA_PUMP_DIR',
destination_file_name => 'test1_full_db.dmp',
destination_database =>'AWS_RDS'
);
END;
/




SQL>
SQL> BEGIN
  2  DBMS_FILE_TRANSFER.PUT_FILE(
  3  source_directory_object =>'DATA_PUMP_DIR',
  4  source_file_name =>'test1_full_db.dmp',
  5  destination_directory_object =>'DATA_PUMP_DIR',
  6  destination_file_name => 'test1_full_db.dmp',
  7  destination_database =>'AWS_RDS'
  8  );
  9  END;
 10  /



Saturday, October 12, 2019

Extract Index DDL from database export dump




PAR file to extract index DDL from a dump using the traditional import (imp) utility
===================================================================

userid='/ as sysdba'
statistics=none
buffer=9999999
log=/oracle/export/IND/dump/imp_index_full_10122019.log
constraints=n
grants=n
rows=n
FROMUSER=HR
indexfile=/oracle/export/IND/dump/index3.sql
Tables=(EMPLOYEES)
file=(/oracle/export/IND//exp_full_IND.dmp)


To execute the command in the background
===========================================

nohup imp parfile=file_name.par &

Friday, May 22, 2015

FULL DATABASE EXPORT BACKUP USING DATAPUMP (EXPDP)


Change the paths as per your requirement.



bash-3.2$
bash-3.2$ cat full_export.sh

#!/bin/sh

ORACLE_SID=TESTRE; export ORACLE_SID
ORACLE_HOME=/opt/oracle/product/10.2.0/db_1; export ORACLE_HOME
PATH=$ORACLE_HOME/bin:$PATH; export PATH
expdp system/manager dumpfile=full_backup14012015.dmp logfile=full_backuplog.log directory=datapump_dir full=y
bash-3.2$
bash-3.2$


Save it as

full_export.sh

make it executable with chmod +x full_export.sh, and run it with ./full_export.sh


You can also run in background

nohup ./full_export.sh &

press enter

bash-3.2$ tail -f nohup.out
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT



SCHEMA IMPORT FROM FULL DATAPUMP BACKUP.


Change path as per your requirement



bash-3.2$
bash-3.2$ cat import_schema.sh
#!/bin/sh

ORACLE_SID=TESTRE; export ORACLE_SID
ORACLE_HOME=/opt/oracle/product/10.2.0/db_1; export ORACLE_HOME
PATH=$ORACLE_HOME/bin:$PATH; export PATH
impdp  system/manager dumpfile=sprt_schema_backup14012015.dmp logfile=sprt_importlog.log directory=datapump_dir schemas=SPRTSG4
bash-3.2$
bash-3.2$


Save it as

import_schema.sh

make it executable with chmod +x import_schema.sh, and run it with ./import_schema.sh


You can also run in background

nohup ./import_schema.sh &

press enter

bash-3.2$ tail -f nohup.out
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT

Sunday, April 27, 2014

SCRIPT TO DATAPUMP/NORMAL EXPORT/IMPORT STATUS MONITORING

Import job status
-----------------


SELECT
SUBSTR(sql_text, INSTR(sql_text,'INTO "'),50) table_name
, rows_processed
, ROUND( (sysdate-TO_DATE(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60,1) minutes
, TRUNC(rows_processed/((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60)) rows_per_minute
FROM sys.v_$sqlarea
WHERE
sql_text like 'INSERT %INTO "%'
AND command_type = 2
AND open_versions > 0;




Export job status
-----------------

select
   round(sofar/totalwork*100,2)  percent_completed, 
   v$session_longops.* 
from 
   v$session_longops 
where
   sofar <> totalwork 
order by
   target, sid; 

Thursday, March 27, 2014

ORACLE EXAM DUMP PAPER FOR CERTIFICATION

Check the following link to access Oracle exam dumps (SQL, OCA, OCP) for Oracle 10g.

Copy and paste it into your browser:


https://drive.google.com/folderview?id=0ByQqSfeaD8_tN3dPSno0SlVoN3M&usp=drive_web