1. Import using NETWORK_LINK
2. QUERY CLAUSE in EXPDP
3. SQLFILE parameter in impdp
4. TABLE_EXISTS_ACTION option in IMPDP
5. EXCLUDE/INCLUDE option
6. DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
7. LOGTIME=ALL Parameter – Oracle 12c new feature
8. Views_as_tables Parameter In Datapump Of Oracle 12c
9. TRANSFORM=DISABLE_ARCHIVE_LOGGING – Oracle 12c new feature
10. ENCRYPTION in export Import
11. CONTENT parameter in EXPDP/IMPDP:
12. CLUSTER PARAMETER IN RAC:
13. EXPDP to an ASM DISKGROUP:
14. EXPDP to multiple directories:
1. Import using NETWORK_LINK:
Objective – Export table dbatest.EMP_TAB from SOURCEDB to TARGETDB using a network link.
An import over a network link does not create any dumpfile; it pulls the data directly from the source database through the database link.
Add a TNS entry for SOURCEDB to the tnsnames.ora file of TARGETDB [on TARGETDB]:
SOURCEDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.20.204.3)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = SOURCEDB)
    )
  )
Create a DB_LINK on TARGETDB pointing to SOURCEDB.
SQL> connect dbatest
Enter password:
Connected.

SQL> create database link SOURCE connect to system identified by oracle using 'SOURCEDB';

Database link created.
Now run impdp on TARGETDB using NETWORK_LINK=<db_link_name>:
impdp directory=DUNNING logfile=test.log tables=dbatest.EMP_TAB network_link=SOURCE
Starting "DBATEST"."SYS_IMPORT_TABLE_01": dbatest/******** directory=DUNNING logfile=test.log tables=dbatest.EMP_TAB network_link=SOURCE Estimate in progress using BLOCKS method... Processing object type TABLE_EXPORT/TABLE/TABLE_DATA Total estimation using BLOCKS method: 23 MB Processing object type TABLE_EXPORT/TABLE/TABLE . . imported "DBATEST"."EMP_TAB" 175340 rows Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER ;;; Import> exit Job "DBATEST"."SYS_IMPORT_TABLE_01" successfully completed at Mon Jan 23 18:12:05 2017 elapsed 0 00:07:51
2. QUERY CLAUSE in EXPDP
Requirement – Export the rows of table emp_tab WHERE created > sysdate - 40. A filter can be added on any column, depending upon the requirement.
SQL> select count(*) from "DBACLASS"."EMP_TAB" WHERE created > sysdate -40;
COUNT(*)
———-
1384
Create a parfile with query clause:
cat expdp_query.par

dumpfile=test.dmp logfile=test1.log directory=TEST tables=dbaclass.EMP_TAB QUERY=dbaclass.EMP_TAB:"WHERE created > sysdate -40"
Now run the expdp command with the parfile. We can see that 1384 rows are exported.
expdp parfile=expdp_query.par

Export: Release 12.1.0.2.0 - Production on Mon Jan 23 14:52:07 2017
Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA parfile=expdp_query.par
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 29 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
. . exported "DBACLASS"."EMP_TAB"    199.4 KB    1384 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
  /export/home/oracle/test.dmp
Job "SYS"."SYS_EXPORT_TABLE_01" successfully completed at Mon Jan 23 14:53:02 2017 elapsed 0 00:00:23
3. SQLFILE parameter in impdp:
SQLFILE can be used only with impdp. It helps in generating the DDL from a dumpfile without actually performing the import.
Suppose we have a dump file of the table DBACLASS.DEP_TAB. If you need the DDL of the table, use the SQLFILE parameter with the impdp command as below.
Parfile:
dumpfile=test.dmp logfile=test1.log directory=TEST tables=DBACLASS.DEP_TAB sqlfile=emp_tab.sql
impdp parfile=impdp_sqlfile.par

Import: Release 12.1.0.2.0 - Production on Mon Jan 23 15:14:16 2017
Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

Master table "SYS"."SYS_SQL_FILE_TABLE_01" successfully loaded/unloaded
Starting "SYS"."SYS_SQL_FILE_TABLE_01":  /******** AS SYSDBA parfile=impdp_sqlfile.par
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "SYS"."SYS_SQL_FILE_TABLE_01" successfully completed at Mon Jan 23 15:14:22 2017 elapsed 0 00:00:01
Now check the generated sql_file:
cat emp_tab.sql

-- CONNECT SYS
ALTER SESSION SET EVENTS '10150 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10904 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '25475 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10407 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '10851 TRACE NAME CONTEXT FOREVER, LEVEL 1';
ALTER SESSION SET EVENTS '22830 TRACE NAME CONTEXT FOREVER, LEVEL 192 ';
-- new object type path: SCHEMA_EXPORT/TABLE/TABLE
CREATE TABLE "DBACLASS"."DEP_TAB"
   ("OWNER" VARCHAR2(128 BYTE),
    "OBJECT_NAME" VARCHAR2(128 BYTE),
    "SUBOBJECT_NAME" VARCHAR2(128 BYTE),
    "OBJECT_ID" NUMBER,
    "DATA_OBJECT_ID" NUMBER,
    "OBJECT_TYPE" VARCHAR2(23 BYTE),
    "CREATED" DATE,
    "LAST_DDL_TIME" DATE,
    "TIMESTAMP" VARCHAR2(19 BYTE),
    "STATUS" VARCHAR2(7 BYTE),
    "TEMPORARY" VARCHAR2(1 BYTE),
    "GENERATED" VARCHAR2(1 BYTE),
    "SECONDARY" VARCHAR2(1 BYTE),
    "NAMESPACE" NUMBER,
    "EDITION_NAME" VARCHAR2(128 BYTE),
    "SHARING" VARCHAR2(13 BYTE),
    "EDITIONABLE" VARCHAR2(1 BYTE),
    "ORACLE_MAINTAINED" VARCHAR2(1 BYTE)
   ) SEGMENT CREATION IMMEDIATE
  PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
  BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
  TABLESPACE "USERS" ;
-- new object type path: SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
-- new object type path: SCHEMA_EXPORT/STATISTICS/MARKER
4. TABLE_EXISTS_ACTION option in IMPDP:
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLE_EXISTS_ACTION=APPEND:
With this option, while importing the table, if the table already exists in the database, the data will be appended on top of the existing data in the table.
select count(*) from "DBATEST"."EMP_TAB"; COUNT(*) ---------- 175340 impdp dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=APPEND Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=APPEND Processing object type TABLE_EXPORT/TABLE/TABLE Table "DBATEST"."EMP_TAB" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append Processing object type TABLE_EXPORT/TABLE/TABLE_DATA . . imported "DBATEST"."EMP_TAB" 19.16 MB 175340 rows Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at Tue Jan 24 08:57:09 2017 elapsed 0 00:01:18 SQL[SYS@TCRMDB01]SQL>>]select count(*) from "DBATEST"."EMP_TAB"; COUNT(*) ---------- 350680
TABLE_EXISTS_ACTION=TRUNCATE:
While importing the table, if the table exists in the database, it will be truncated and the data will then be loaded.
impdp dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=TRUNCATE

Starting "SYS"."SYS_IMPORT_FULL_01":  /******** AS SYSDBA dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=TRUNCATE
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DBATEST"."EMP_TAB" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at Tue Jan 24 09:02:23 2017 elapsed 0 00:01:11
TABLE_EXISTS_ACTION=REPLACE:
While importing, if the table exists in the database, it will be dropped and recreated from the dump.
impdp dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=REPLACE

Starting "SYS"."SYS_IMPORT_FULL_01":  /******** AS SYSDBA dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN table_exists_action=REPLACE
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at Tue Jan 24 09:20:30 2017 elapsed 0 00:01:37
TABLE_EXISTS_ACTION=SKIP:
This is the default option with impdp, i.e. if the table already exists, that table will be skipped.
Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA dumpfile=emp_tab.dmp logfile=emp_tab.log directory=VEN Processing object type TABLE_EXPORT/TABLE/TABLE ORA-39151: Table "DBATEST"."EMP_TAB" exists. All dependent metadata and data will be skipped due to table_exists_action of skip Processing object type TABLE_EXPORT/TABLE/TABLE_DATA Job "SYS"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Tue Jan 24 09:31:51 2017 elapsed 0 00:01:01
5. EXCLUDE/INCLUDE option:
These two options can be used in both expdp and impdp to exclude or include particular objects or object types.
Here we will show a few scenarios:
Export the schema DBACLASS, excluding the tables EMP_TAB and DEPT:
dumpfile=test.dmp logfile=test1.log directory=TEST exclude=TABLE:"IN ('EMP_TAB','DEPT')" schemas=DBACLASS
Exclude a few schemas while importing:
dumpfile=test.dmp logfile=test1.log directory=TEST EXCLUDE=SCHEMA:"IN ('WMSYS', 'OUTLN')"
Export/import only TABLE and INDEX object types:
dumpfile=FULL.dmp logfile=full.log directory=DBATEST INCLUDE=TABLE,INDEX
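INCLUDE also accepts the same name filter syntax as EXCLUDE. As a minimal sketch (the dump file, log file and directory names below are illustrative, not from the examples above), a parfile that exports only two specific tables from the DBACLASS schema could look like:

dumpfile=inc.dmp logfile=inc.log directory=TEST schemas=DBACLASS INCLUDE=TABLE:"IN ('EMP_TAB','DEPT')"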
6. DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS is a very useful parameter, mostly used with table_exists_action=APPEND.
In general, if you are appending data (table_exists_action=APPEND) to an existing table with a unique constraint, and the new rows violate the unique constraint,
then the import job will fail (even if only one row violates the constraint).
The DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS parameter helps in skipping the duplicate rows and importing the rest. It is similar to the IGNORE=Y option of classic exp/imp.
Here we tried to append data with table_exists_action=APPEND, and it failed due to a unique key violation.
impdp dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND
With the Partitioning, OLAP, Data Mining and Real Application Testing options

Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_FULL_01":  /******** AS SYSDBA dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND
Processing object type TABLE_EXPORT/TABLE/PROCACT_INSTANCE
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DBATEST"."GNSRT00" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "DBATEST"."GNSRT00" failed to load/unload and is being skipped due to error:
ORA-00001: unique constraint (DBATEST.RRN_PK_GNSRT00) violated
Job "SYS"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Tue Jan 24 10:31:28 2017 elapsed 0 00:00:11
Now let's try with the SKIP_CONSTRAINT_ERRORS option:
impdp dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND data_options=SKIP_CONSTRAINT_ERRORS
Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND data_options=SKIP_CONSTRAINT_ERRORS Processing object type TABLE_EXPORT/TABLE/PROCACT_INSTANCE Processing object type TABLE_EXPORT/TABLE/TABLE Table "DBATEST"."GNSRT00" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append Processing object type TABLE_EXPORT/TABLE/TABLE_DATA . . imported "DBATEST"."GNSRT00" 9.078 KB 20 out of 22 rows ----- >>> 2 row(s) were rejected with the following error: ORA-00001: unique constraint (DBATEST.RRN_PK_GNSRT00) violated Rejected rows with the primary keys are: Rejected row #1: column RRN: 3 Rejected row #2: column RRN: 4 Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at Tue Jan 24 10:33:07 2017 elapsed 0 00:00:07
We can see that it skipped the duplicate rows and imported the rest. The best part is that it prints the rejected rows in the log.
7. LOGTIME=ALL Parameter – Oracle 12c new feature
LOGTIME
Specifies that messages displayed during export operations be timestamped.
Valid keyword values are: ALL, [NONE], LOGFILE and STATUS.
[oracle@localhost ~]$ expdp dumpfile=logdump.dmp logfile=viewlog.log directory=DUMP schemas=TEST logtime=ALL

Export: Release 12.1.0.2.0 - Production on Sat Aug 22 07:19:41 2015
Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: sys/oracle@ORCL as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

22-AUG-15 07:19:51.752: Starting "SYS"."SYS_EXPORT_SCHEMA_02":  sys/********@ORCL AS SYSDBA dumpfile=logdump.dmp logfile=viewlog.log directory=DUMP schemas=TEST logtime=ALL
22-AUG-15 07:19:52.687: Estimate in progress using BLOCKS method...
22-AUG-15 07:19:55.163: Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
22-AUG-15 07:19:55.244: Total estimation using BLOCKS method: 39 MB
22-AUG-15 07:19:55.776: Processing object type SCHEMA_EXPORT/USER
22-AUG-15 07:19:55.937: Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
22-AUG-15 07:19:55.971: Processing object type SCHEMA_EXPORT/ROLE_GRANT
22-AUG-15 07:19:56.014: Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
22-AUG-15 07:19:56.687: Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
22-AUG-15 07:20:05.095: Processing object type SCHEMA_EXPORT/TABLE/TABLE
22-AUG-15 07:20:08.467: Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
22-AUG-15 07:20:08.719: Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
22-AUG-15 07:20:20.538: . . exported "TEST"."TEST"     10.66 MB   93925 rows
22-AUG-15 07:20:20.763: . . exported "TEST"."TEST2"    10.66 MB   93937 rows
22-AUG-15 07:20:21.072: . . exported "TEST"."TEST3"    10.66 MB   93938 rows
22-AUG-15 07:20:21.926: Master table "SYS"."SYS_EXPORT_SCHEMA_02" successfully loaded/unloaded
22-AUG-15 07:20:21.928: ******************************************************************************
22-AUG-15 07:20:21.929: Dump file set for SYS.SYS_EXPORT_SCHEMA_02 is:
22-AUG-15 07:20:21.932:   /home/oracle/DUMP/logdump.dmp
22-AUG-15 07:20:21.957: Job "SYS"."SYS_EXPORT_SCHEMA_02" successfully completed at Sat Aug 22 07:20:21 2015 elapsed 0 00:00:32
[oracle@localhost ~]$
8. Views_as_tables Parameter In Datapump Of Oracle 12c
The VIEWS_AS_TABLES parameter has been introduced in Data Pump in 12c. With this, we can export a view in the source database as a table and import it as a table into the target database.
Complete article – Click here
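As a rough sketch of the syntax (the schema, view, dump file and directory names below are illustrative, not from the article above):

expdp dumpfile=view_tab.dmp logfile=view_tab.log directory=TEST views_as_tables=dbaclass.emp_view

On import, the object arrives as an ordinary table carrying the view's name, so it can be queried without the underlying base tables.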
9. TRANSFORM=DISABLE_ARCHIVE_LOGGING – Oracle 12c new feature
A new feature has been added in Data Pump in Oracle 12c. We can import data with the NOLOGGING option, i.e. without generating redo. We sometimes face issues while importing big tables, as the import generates a lot of archive logs.
TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y – This parameter can be used to make the impdp run in NOLOGGING mode.
Complete article –Click here
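A minimal sketch of the usage (the dump file, log file and directory names are illustrative):

impdp dumpfile=test.dmp logfile=imp_nolog.log directory=TEST transform=disable_archive_logging:Y

Note that if the database runs in FORCE LOGGING mode, redo is still generated and this transform has no effect.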
10. ENCRYPTION in export Import
ENCRYPTION
Encrypt part or all of a dump file.
Valid keyword values are: ALL, DATA_ONLY, ENCRYPTED_COLUMNS_ONLY, METADATA_ONLY and NONE.
ENCRYPTION_ALGORITHM
Specify how encryption should be done.
Valid keyword values are: [AES128], AES192 and AES256.
ENCRYPTION_MODE
Method of generating encryption key.
Valid keyword values are: DUAL, PASSWORD and [TRANSPARENT].
ENCRYPTION_PASSWORD
Password key for creating encrypted data within a dump file.
ENCRYPTION_PWD_PROMPT (introduced in Oracle 12c)
Specifies whether to prompt for the encryption password [NO].
Terminal echo will be suppressed while standard input is read.
ENCRYPTION_PASSWORD:
To encrypt a dumpfile, provide a password to the ENCRYPTION_PASSWORD parameter in the parfile.
dumpfile=emp_enc1.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PASSWORD=oracle ENCRYPTION_ALGORITHM=AES256
Export: Release 12.1.0.2.0 - Production on Tue Jan 24 12:25:15 2017

Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA dumpfile=emp_enc1.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PASSWORD=******** ENCRYPTION_ALGORITHM=AES256
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
. . exported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
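To import a dump encrypted this way, the same password must be supplied again on the impdp side, either with ENCRYPTION_PASSWORD or with ENCRYPTION_PWD_PROMPT (shown next). A minimal sketch, reusing the dump file, directory and password from the export above (the log file name is illustrative):

impdp dumpfile=emp_enc1.dmp logfile=imp_enc.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PASSWORD=oracle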
ENCRYPTION_PWD_PROMPT
With this parameter, there is no need to write the password in the parfile or on the expdp command line; we can provide the password when prompted for input.
expdp dumpfile=emp_enc.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PWD_PROMPT=YES
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

Encryption Password:        ------- > Here provide the password

FLASHBACK automatically enabled to preserve database integrity.
Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA dumpfile=emp_enc.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PWD_PROMPT=YES
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 23 MB
. . exported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
While importing this dump, we have to pass the same password.
dumpfile=emp_enc.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PWD_PROMPT=YES
Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

Encryption Password:        ----- >>> PASS THE PASSWORD HERE --- >

Master table "SYS"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_TABLE_01":  /******** AS SYSDBA dumpfile=emp_enc.dmp logfile=emp_tab.log directory=VEN tables=dbatest.EMP_TAB ENCRYPTION_PWD_PROMPT=YES
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "SYS"."SYS_IMPORT_TABLE_01" successfully completed at Tue Jan 24 12:23:38 2017 elapsed 0 00:01:52
If you try to import without the encryption password, it will throw the error ORA-39174: Encryption password must be supplied.
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

ORA-39002: invalid operation
ORA-39174: Encryption password must be supplied.
11. CONTENT parameter in EXPDP/IMPDP:
CONTENT
Specifies data to unload.
Valid keyword values are: [ALL], DATA_ONLY and METADATA_ONLY.
CONTENT=ALL is the default value
CONTENT=METADATA_ONLY:
It will export only the skeleton/DDL of the object/schema. It won't export any of the data/rows.
dumpfile=emp_enc1.dmp logfile=emp_enc.log directory=VEN tables=DBATEST.EMP_TAB CONTENT=METADATA_ONLY
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

FLASHBACK automatically enabled to preserve database integrity.
Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA dumpfile=emp_enc1.dmp logfile=emp_enc.log directory=VEN tables=DBATEST.EMP_TAB CONTENT=METADATA_ONLY
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
CONTENT=DATA_ONLY:
It will export only the rows/data of the tables, excluding the DDL.
dumpfile=emp_enc2.dmp logfile=emp_enc.log directory=VEN tables="DBATEST"."EMP_TAB" CONTENT=DATA_ONLY
FLASHBACK automatically enabled to preserve database integrity.
Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA dumpfile=emp_enc2.dmp logfile=emp_enc.log directory=VEN tables=DBATEST.EMP_TAB CONTENT=DATA_ONLY
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 23 MB
. . exported "DBATEST"."EMP_TAB"    19.16 MB  175340 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
Before importing the dump into another database, make sure that the table exists in that database, as the export was taken with DATA_ONLY,
which doesn't contain the DDL. Otherwise the import will fail with the error below.
Here we tried to load this dump into a database where the table doesn't exist.
SQL> select table_name from dba_tables where table_name='EMP_TAB';

no rows selected
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA dumpfile=emp_enc2.dmp logfile=emp_enc.log directory=VEN Processing object type TABLE_EXPORT/TABLE/TABLE_DATA ORA-39034: Table TABLE_DATA:"DBATEST"."EMP_TAB" does not exist. ---- >>>>>>>> ERROR Job "SYS"."SYS_IMPORT_FULL_01" completed with 1 error(s) at Tue Jan 24 14:45:01 2017 elapsed 0 00:00:54
So it failed.
NOTE – With a DATA_ONLY dump, never use TABLE_EXISTS_ACTION=REPLACE: the dump contains no DDL, so once the existing table is dropped there is nothing to recreate it from.
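For a DATA_ONLY dump, a safer pattern is to pre-create the table (or confirm it already exists) and then append or truncate. A sketch, reusing the dump file and directory from the example above (the log file name is illustrative):

impdp dumpfile=emp_enc2.dmp logfile=emp_data_imp.log directory=VEN table_exists_action=APPEND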
12. CLUSTER PARAMETER IN RAC:
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC [YES].
In a RAC database, if you are taking an export with the PARALLEL option and the Data Pump directory is not shared between the nodes, the export might fail with the error below,
because the remote node is unable to write its dump files to that directory.
ORA-31617: unable to open dump file "/dumparea/TEST/asset_75.dmp" for write
ORA-19505: failed to identify file "/dumparea/TEST/asset_75.dmp"
ORA-27037: unable to obtain file status
SVR4 Error: 2: No such file or directory
To avoid this, use the CLUSTER=N parameter, with which the export will run worker processes only on the local node.
parfile:
dumpfile=asset_%U.dmp logfile=asset.log directory=VEN parallel=32 cluster=N
13. EXPDP to an ASM DISKGROUP:
We can also use ASM diskgroups to store the expdp dumpfile.
Create a directory pointing to the ASM diskgroup (for dumpfiles):
SQL> create directory SOURCE_DUMP as '+NEWTST/TESTDB2/TEMPFILE';

Directory created.

SQL> grant read,write on directory SOURCE_DUMP to public;

Grant succeeded.
Create a directory pointing to a normal filesystem (for logfiles):
SQL> create directory EXPLOG as '/export/home/oracle';

Directory created.
Export syntax:
expdp dumpfile=dumpfile.dmp logfile=LOGFILE_DIR:logfile.log directory=DUMP_DIR tables=….
expdp dumpfile=test.dmp logfile=EXPLOG:test.log directory=SOURCE_DUMP tables=dbatest.EMPTAB exclude=statistics
expdp dumpfile=test.dmp logfile=EXPLOG:test.log directory=SOURCE_DUMP tables=dbatest.EMPTAB exclude=statistics

Export: Release 12.1.0.2.0 - Production on Wed Feb 8 10:07:54 2017
Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP, Advanced Analytics
and Real Application Testing options

Starting "SYS"."SYS_EXPORT_TABLE_01":  /******** AS SYSDBA dumpfile=test.dmp logfile=EXPLOG:test.log directory=SOURCE_DUMP tables=dbatest.EMPTAB exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 11 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "DBATEST"."EMPTAB"    8.691 MB   81753 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
  +NEWTST/TESTDB2/tempfile/test.dmp
Job "SYS"."SYS_EXPORT_TABLE_01" successfully completed at Wed Feb 8 10:08:13 2017 elapsed 0 00:00:15
Note: If you don't mention the logfile directory, the export will fail with the error below.
expdp dumpfile=test.dmp logfile=test.log directory=SOURCE_DUMP tables=dbatest.EMPTAB exclude=statistics

Export: Release 12.1.0.2.0 - Production on Wed Feb 8 10:06:12 2017
Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP, Advanced Analytics
and Real Application Testing options

ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
14. EXPDP TO MULTIPLE DIRECTORIES:
If you wish to take an expdp backup of a big table/schema but don't have sufficient space in a single mount point to hold the dump, you can write the expdp dump to multiple directories.
Refer link – https://dbaclass.com/article/how-to-take-expdp-of-a-table-to-multiple-directories-in-oracle/
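As a rough sketch of the idea (the directory objects DUMP1/DUMP2 and the file names below are illustrative), the DUMPFILE parameter accepts a directory prefix per file, so combined with FILESIZE the dump pieces can be spread across mount points:

expdp dumpfile=DUMP1:full_%U.dmp,DUMP2:full_%U.dmp filesize=10G logfile=DUMP1:full.log schemas=DBACLASS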