1. Import using NETWORK_LINK
2. QUERY CLAUSE in EXPDP
3. SQLFILE parameter in impdp
4. TABLE_EXISTS_ACTION option in IMPDP
5. EXCLUDE/INCLUDE option
6. DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
7. LOGTIME=ALL Parameter – Oracle 12c new feature
8. Views_as_tables Parameter In Datapump Of Oracle 12c
9. TRANSFORM=DISABLE_ARCHIVE_LOGGING – Oracle 12c new feature
10. ENCRYPTION in export Import
11. CONTENT parameter in EXPDP/IMPDP:
12. CLUSTER PARAMETER IN RAC:
13. EXPDP to an ASM DISKGROUP
14. EXPDP to multiple directories

 

1. Import using NETWORK_LINK

Objective – Export the table dbatest.EMP_TAB from SOURCEDB to TARGETDB using a network link.
An import over a network link does not create any dump file; impdp pulls the data directly from the source database through the db link.

Add a TNS entry for SOURCEDB to the tnsnames.ora file on TARGETDB.

Create a DB_LINK on TARGETDB pointing to SOURCEDB.
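The db link can be created as below (the link name SOURCE matches the impdp command further down; the user and the password placeholder are illustrative):

SQL> create database link SOURCE connect to dbatest identified by <password> using 'SOURCEDB';
SQL> select sysdate from dual@SOURCE;   -- quick check that the link works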

Now run impdp on TARGETDB with NETWORK_LINK=<db_link_name>:

impdp directory=DUNNING logfile=test.log tables=dbatest.EMP_TAB network_link=SOURCE

2. QUERY CLAUSE in EXPDP

Requirement – Export a dump of the table EMP_TAB restricted to rows WHERE created > sysdate - 40. The filter can be placed on any column, depending on the requirement.

SQL> select count(*) from "DBACLASS"."EMP_TAB" WHERE created > sysdate -40;

COUNT(*)
———-
1384

Create a parfile with query clause:
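A parfile for this might look as below (dump and log file names are illustrative; the directory follows the earlier example):

dumpfile=emp_tab.dmp
logfile=emp_tab.log
directory=DUNNING
tables=DBACLASS.EMP_TAB
query="WHERE created > sysdate - 40"

expdp parfile=emp_tab.par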

Now run expdp with the parfile. As expected, the 1384 rows matching the filter are exported.

3. SQLFILE parameter in impdp:

This parameter can be used only with impdp. It generates the DDL from a dump file without importing anything.
Suppose we have a dump file of the table DBACLASS.DEP_TAB. To get the DDL of the table, use SQLFILE with the impdp command as below.

Parfile:
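A sample parfile could be (file names are illustrative):

dumpfile=dep_tab.dmp
directory=EXPDIR
sqlfile=dep_tab_ddl.sql
tables=DBACLASS.DEP_TAB

Running impdp with this parfile imports nothing; it only writes the DDL statements to dep_tab_ddl.sql in the same directory.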

Now check the generated SQL file:

4. TABLE_EXISTS_ACTION option in IMPDP:

TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.

TABLE_EXISTS_ACTION=APPEND:

With this option, while importing the table, if the table already exists in the database, the data is appended on top of the existing data in the table.

TABLE_EXISTS_ACTION=TRUNCATE:

While importing the table, if the table exists in the database, it will truncate the table and then load the data from the dump.

TABLE_EXISTS_ACTION=REPLACE:

While importing, if the table exists in the database, it will drop it and recreate it from the dump.

TABLE_EXISTS_ACTION=SKIP:

This is the default option with impdp, i.e. if the table already exists, it will skip that table.

5. EXCLUDE/INCLUDE option:

These two options can be used in both expdp and impdp to exclude or include particular objects or object types.
Here we will show a few scenarios:

Export the schema DBACLASS, excluding the tables EMP_TAB and DEPT:

Exclude a few schemas during import:

Export/import only the TABLE and INDEX object types:
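The three scenarios above can be sketched with parfile entries like these (dump file, schema, and table names are illustrative):

# 1. Export schema DBACLASS, excluding the tables EMP_TAB and DEPT:
schemas=DBACLASS
exclude=TABLE:"IN ('EMP_TAB','DEPT')"

# 2. Exclude a few schemas during import:
exclude=SCHEMA:"IN ('SCOTT','HR')"

# 3. Export only the TABLE and INDEX object types:
schemas=DBACLASS
include=TABLE
include=INDEX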

6. DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS

DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS is a very useful parameter, mostly used with TABLE_EXISTS_ACTION=APPEND.

In general, if you are appending data (TABLE_EXISTS_ACTION=APPEND) to an existing table with a unique constraint and any of the new rows violate that constraint,
the import job fails, even if only one row violates it.

The DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS parameter skips the offending rows and imports the rest. It is similar to the IGNORE=Y option of classic exp/imp.

Here we tried to append data with TABLE_EXISTS_ACTION=APPEND, and it failed due to a unique key violation.

impdp dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND

Now lets try with SKIP_CONSTRAINT_ERRORS option:

impdp dumpfile=unique.dmp directory=EXPDIR table_exists_action=APPEND data_options=SKIP_CONSTRAINT_ERRORS

We can see that it skipped the duplicate rows and imported the rest. Best of all, the rejected rows are printed in the log.

7. LOGTIME=ALL Parameter – Oracle 12c new feature

LOGTIME
Specifies that messages displayed during export operations be timestamped.
Valid keyword values are: ALL, [NONE], LOGFILE and STATUS.
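For example, a sketch of an export with timestamped messages (dump file and directory names are illustrative):

expdp dumpfile=dbaclass.dmp logfile=dbaclass.log directory=EXPDIR schemas=DBACLASS logtime=all

Every message on screen and in the logfile is then prefixed with a timestamp, which makes it easy to spot the slow phases of a job.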

8. Views_as_tables Parameter In Datapump Of Oracle 12c

The VIEWS_AS_TABLES parameter was introduced in the 12c version of Data Pump. With it, we can export a view in the source database and import it as a table in the target database.
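A minimal sketch (the view name is illustrative):

expdp dumpfile=emp_view.dmp logfile=emp_view.log directory=EXPDIR views_as_tables=DBACLASS.EMP_VIEW

On import, the object is created as an ordinary table containing the rows the view returned at export time.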


9. TRANSFORM=DISABLE_ARCHIVE_LOGGING – Oracle 12c new feature

A new feature was added in the 12c version of Data Pump: we can import data with the NOLOGGING option, i.e. without generating archive logs. Importing big tables is sometimes a problem precisely because it generates a lot of archive logs.

TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y – This parameter can be used to make the impdp nologging.
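A sketch of such an import (dump file and directory names are illustrative):

impdp dumpfile=bigtab.dmp logfile=bigtab_imp.log directory=EXPDIR transform=disable_archive_logging:y

Note that the clause is ignored if the database runs in FORCE LOGGING mode.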


10. ENCRYPTION in export Import

ENCRYPTION
Encrypt part or all of a dump file.
Valid keyword values are: ALL, DATA_ONLY, ENCRYPTED_COLUMNS_ONLY, METADATA_ONLY and NONE.

ENCRYPTION_ALGORITHM
Specify how encryption should be done.
Valid keyword values are: [AES128], AES192 and AES256.

ENCRYPTION_MODE
Method of generating encryption key.
Valid keyword values are: DUAL, PASSWORD and [TRANSPARENT].

ENCRYPTION_PASSWORD
Password key for creating encrypted data within a dump file.

ENCRYPTION_PWD_PROMPT (introduced in Oracle 12c)
Specifies whether to prompt for the encryption password [NO].
Terminal echo will be suppressed while standard input is read.

 

ENCRYPTION_PASSWORD:

To encrypt a dump file, supply a password via ENCRYPTION_PASSWORD in the parfile.
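A sample parfile could be (the password shown is a placeholder):

dumpfile=enc_test.dmp
logfile=enc_test.log
directory=EXPDIR
tables=DBACLASS.EMP_TAB
encryption=all
encryption_password=MySecretPwd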

ENCRYPTION_PWD_PROMPT:

With this parameter, there is no need to put the password in the parfile or on the expdp command line; expdp prompts for it interactively.
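For example (a sketch; file and table names are illustrative):

expdp dumpfile=enc_test.dmp logfile=enc_test.log directory=EXPDIR tables=DBACLASS.EMP_TAB encryption_pwd_prompt=yes

expdp then prompts for the encryption password and suppresses the terminal echo while it is typed.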

While importing this dump, the same password has to be supplied.

If you try to import without the encryption password, it will throw: ORA-39174: Encryption password must be supplied.

11. CONTENT parameter in EXPDP/IMPDP:

CONTENT
Specifies data to unload.
Valid keyword values are: [ALL], DATA_ONLY and METADATA_ONLY.

CONTENT=ALL is the default value.

CONTENT=METADATA_ONLY:

It exports only the skeleton/DDL of the object/schema; it won't export any of the data/rows.

CONTENT=DATA_ONLY:

It exports only the rows/data of the tables, excluding the DDL.
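Both variants can be sketched as below (dump file and directory names are illustrative):

expdp dumpfile=ddl_only.dmp directory=EXPDIR schemas=DBACLASS content=metadata_only
expdp dumpfile=rows_only.dmp directory=EXPDIR schemas=DBACLASS content=data_only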

Before importing this dump into another database, make sure the table already exists in that database: the export was taken with DATA_ONLY, so the dump contains no DDL. Otherwise the import will fail with an error as below.

Here we tried to load this dump into a database where the table doesn't exist.

So it failed.

NOTE – NEVER USE TABLE_EXISTS_ACTION=REPLACE with a DATA_ONLY dump: the dump has no DDL, so once the existing table is dropped it cannot be recreated.

 

12. CLUSTER PARAMETER IN RAC:

CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC [YES].

In a RAC database, if you take an export with the parallel option and the Data Pump directory is not shared between the nodes, the export may fail with the error below, because the remote node cannot write its dump pieces to that directory.

ORA-31617: unable to open dump file “/dumparea/TEST/asset_75.dmp” for write
ORA-19505: failed to identify file “/dumparea/TEST/asset_75.dmp”
ORA-27037: unable to obtain file status
SVR4 Error: 2: No such file or directory

To avoid this, use the CLUSTER=N parameter, which makes the export run its worker processes only on the local node.

parfile:
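A sample parfile could be (schema, directory, and file names are illustrative; %U numbers the dump pieces created by the parallel workers):

dumpfile=asset_%U.dmp
logfile=asset.log
directory=DUMPDIR
schemas=ASSET
parallel=4
cluster=N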

13. EXPDP to an ASM DISKGROUP:

We can also use an ASM disk group to store the expdp dump file.

Create a directory pointing to asm diskgroup( for dumpfiles)

Create a directory pointing to normal filesystem ( for logfiles)
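The two directories can be created as below (the disk group and filesystem paths are illustrative; the directory names match the expdp command further down):

SQL> create directory SOURCE_DUMP as '+DATA/TESTDB/DUMP';
SQL> create directory EXPLOG as '/export/home/oracle/logs';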

export syntax:

expdp dumpfile=dumpfile.dmp logfile=LOGFILE_DIR:logfile.log directory=DUMP_DIR tables=….

expdp dumpfile=test.dmp logfile=EXPLOG:test.log directory=SOURCE_DUMP tables=dbatest.EMPTAB exclude=statistics

Note: If you don't point the logfile to a filesystem directory, the export will fail with the error below, because the logfile cannot be written to ASM.

14. EXPDP TO MULTIPLE DIRECTORIES:

Suppose you want to take an expdp backup of a big table/schema, but no single mount point has enough free space to hold the dump. In this case we can write the expdp dump to multiple directories.
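A sketch using the %U substitution variable to spread the pieces across two directories (directory names, table name, and file size are illustrative):

expdp dumpfile=DIR1:bigtab_%U.dmp,DIR2:bigtab_%U.dmp logfile=DIR1:bigtab.log tables=DBACLASS.BIGTAB parallel=2 filesize=10G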

Refer: https://dbaclass.com/article/how-to-take-expdp-of-a-table-to-multiple-directories-in-oracle/