Oracle Data Pump (Expdp, Impdp) in Oracle Database 10g, 11g, 12c, 18c
Oracle Data Pump is a newer, faster and more flexible alternative to the "exp" and "imp" utilities
used in previous Oracle versions. In addition to basic import and export functionality, Data Pump
provides a PL/SQL API and support for external tables.
This article was originally written against Oracle 10g, but the information is still relevant up to and
including the latest versions of Oracle. New features are broken out into separate articles, but the
help section at the bottom is up to date with the latest versions.
Getting Started
Table Exports/Imports
Schema Exports/Imports
Database Exports/Imports
INCLUDE and EXCLUDE
CONTENT and QUERY
Network Exports/Imports (NETWORK_LINK)
Flashback Exports
Miscellaneous Information
Data Pump API
External Tables (Unloading/Loading Data Using External Tables)
Secure External Password Store
Help
  expdp
  impdp
Related articles.
Data Pump Enhancements in Oracle Database 11g Release 1 (expdp and impdp)
Data Pump Enhancements in Oracle Database 12c Release 1 (expdp and impdp)
Data Pump Enhancements in Oracle Database 12c Release 2 (expdp and impdp)
SQL Developer 3.1 Data Pump Wizards (expdp, impdp)
Transportable Tablespaces
Oracle Cloud : Autonomous Data Warehouse (ADW) - Import Data from an Object Store (impdp)
Getting Started
For the examples to work we must first unlock the SCOTT account and create a directory object it
can access. The directory object is only a pointer to a physical directory; creating it does not
actually create the physical directory on the file system of the database server.
CONN / AS SYSDBA
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;
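The directory object itself, and the grants SCOTT needs on it, can be created as follows. The TEST_DIR name matches the directory object used in the API example later in this article, but the file system path is an assumption, so adjust it for your server.

```sql
-- The path must already exist on the database server's file system.
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
```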
Table Exports/Imports
The TABLES parameter is used to specify the tables that are to be exported. The following is an
example of the table export and import syntax.
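A representative pair of commands, assuming the TEST_DIR directory object and a "db10g" connect identifier (both placeholders for your own environment), might look like this:

```text
expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log
```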
Schema Exports/Imports
The OWNER parameter of exp has been replaced by the SCHEMAS parameter which is used to specify
the schemas to be exported. The following is an example of the schema export and import syntax.
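A representative pair of commands, again assuming the TEST_DIR directory object and a "db10g" connect identifier, might be:

```text
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log
```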
Database Exports/Imports
The FULL parameter indicates that a complete database export is required. The following is an
example of the full database export and import syntax.
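A representative pair of commands might be the following. The password and connect identifier are placeholders; the DB10G.dmp file name matches the dump file shown in the status output later in this article.

```text
expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log

impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log
```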
INCLUDE and EXCLUDE
The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects.
When the INCLUDE parameter is used, only those objects specified by it will be included in the
export/import. When the EXCLUDE parameter is used, all objects except those specified by it will
be included. The two parameters are mutually exclusive. The basic syntax for both parameters is
the same.
INCLUDE=object_type[:name_clause] [, ...]
EXCLUDE=object_type[:name_clause] [, ...]
The following code shows how they can be used as command line parameters.
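For example, something like the following. The table names and dump file names are illustrative; the exact quoting shown works in a parameter file or a shell that passes the quotes through unchanged.

```text
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

expdp scott/tiger@db10g schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
```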
If the parameter is used from the command line, depending on your OS, the special characters in
the clause may need to be escaped, as follows. Because of this, it is easier to use a parameter
file.
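On a Linux/UNIX shell, for instance, the INCLUDE clause above might need to be escaped along these lines (the exact escaping varies by shell; the table names are illustrative):

```text
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:\"IN \(\'EMP\', \'DEPT\'\)\" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
```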
A single import/export can include multiple references to the parameters, so to export tables,
views and some packages we could use either of the following approaches.
INCLUDE=TABLE,VIEW,PACKAGE:"LIKE '%API'"
or
INCLUDE=TABLE
INCLUDE=VIEW
INCLUDE=PACKAGE:"LIKE '%API'"
EXCLUDE=SCHEMA:"LIKE 'SYS%'"
EXCLUDE=SCHEMA:"IN ('OUTLN','SYSTEM','SYSMAN','FLOWS_FILES','APEX_030200','APEX_PUBLIC_USER','ANONYMOUS')"
The valid object type paths that can be included or excluded can be displayed using
the DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, and TABLE_EXPORT_OBJECTS views.
CONTENT and QUERY
The CONTENT parameter allows you to alter the contents of the export. A value of METADATA_ONLY
exports the object definitions without the table data, while DATA_ONLY exports only the table
data. The QUERY parameter allows you to alter the rows exported from one or more tables. The
following example does a full database export, but doesn't include the data for the EMP and DEPT
tables.
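A sketch of such an export, using a predicate that matches no rows so the tables' structure is exported without their data. The parameter file form avoids shell escaping; the deptno predicate is illustrative (no SCOTT department has deptno = 0).

```text
# expdp_query.par (file name is an assumption)
FULL=Y
DIRECTORY=TEST_DIR
DUMPFILE=DB10G.dmp
LOGFILE=expdpDB10G.log
QUERY=SCOTT.EMP:"WHERE deptno = 0"
QUERY=SCOTT.DEPT:"WHERE deptno = 0"
```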
The way you handle quotes on the command line will vary depending on what you are trying to
achieve. Here are some examples that work for single tables and multiple tables directly from the
command line.
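For example, something like the following. The single-quote wrapping shown assumes a Linux/UNIX shell; Windows requires different escaping.

```text
# Single table.
expdp scott/tiger@db10g tables=EMP directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log query='EMP:"WHERE deptno = 10"'

# Multiple tables.
expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log query='EMP:"WHERE deptno = 10",DEPT:"WHERE deptno = 10"'
```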
Network Exports/Imports (NETWORK_LINK)
The NETWORK_LINK parameter identifies a database link to be used as the source for a network
export/import. The following database link will be used to demonstrate its use.
CONN test/test
CREATE DATABASE LINK remote_scott CONNECT TO scott IDENTIFIED BY tiger USING 'DEV';
In the case of exports, the NETWORK_LINK parameter identifies the database link pointing to the
source server. The objects are exported from the source server in the normal manner, but written
to a directory object on the local server, rather than one on the source server. Both the local and
remote users require the EXP_FULL_DATABASE role granted to them.
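A network export using the REMOTE_SCOTT link created above might look like this (the dump file and log file names are assumptions):

```text
expdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log
```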
For imports, the NETWORK_LINK parameter also identifies the database link pointing to the source
server. The difference here is the objects are imported directly from the source into the local
server without being written to a dump file. Although there is no need for a DUMPFILE parameter, a
directory object is still required for the logs associated with the operation. Both the local and
remote users require the IMP_FULL_DATABASE role granted to them.
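A network import pulling SCOTT.EMP across the link into the local TEST user might look like this. The REMAP_SCHEMA usage is an assumption added for illustration; note there is no DUMPFILE parameter.

```text
impdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR logfile=impdpEMP.log remap_schema=SCOTT:TEST
```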
Flashback Exports
The exp utility used the CONSISTENT=Y parameter to indicate the export should be consistent to a
point in time. By default the expdp utility exports are only consistent on a per table basis. If you
want all tables in the export to be consistent to the same point in time, you need to use
the FLASHBACK_SCN or FLASHBACK_TIME parameter.
The FLASHBACK_TIME parameter value is converted to the approximate SCN for the specified time.
# In parameter file.
flashback_time="to_timestamp('09-05-2011 09:00:00', 'DD-MM-YYYY HH24:MI:SS')"
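The same value can be passed on the command line, but the special characters must then be escaped (Linux/UNIX-style escaping shown; the rest of the command is a representative schema export):

```text
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log flashback_time=\"TO_TIMESTAMP\(\'09-05-2011 09:00:00\',\'DD-MM-YYYY HH24:MI:SS\'\)\"
```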
Not surprisingly, you can make exports consistent to an earlier point in time by specifying an
earlier time or SCN, provided you have enough UNDO space to keep a read consistent view of the
data during the export operation.
If you prefer to use the SCN, you can retrieve the current SCN using one of the following queries.
SELECT current_scn FROM v$database;
SELECT DBMS_FLASHBACK.get_system_change_number FROM dual;
SELECT TIMESTAMP_TO_SCN(SYSTIMESTAMP) FROM dual;
The following queries may prove useful for converting between timestamps and SCNs.
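For example (SCN_TO_TIMESTAMP and TIMESTAMP_TO_SCN only work for values within the database's retention window; the one-hour offset is illustrative):

```sql
-- Timestamp corresponding to the current SCN.
SELECT SCN_TO_TIMESTAMP(current_scn) FROM v$database;

-- SCN corresponding to a recent point in time.
SELECT TIMESTAMP_TO_SCN(SYSTIMESTAMP - INTERVAL '1' HOUR) FROM dual;
```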
In 11.2, the introduction of legacy mode means that you can use the CONSISTENT=Y parameter with
the expdp utility if you wish.
Miscellaneous Information
Unlike the original exp and imp utilities, all data pump ".dmp" and ".log" files are created on the
Oracle server, not the client machine.
All data pump actions are performed by multiple jobs (server processes, not DBMS_JOB jobs).
These jobs are controlled by a master control process, which uses Advanced Queuing. At runtime
an advanced queue table, named after the job name, is created and used by the master control
process. The table is dropped on completion of the data pump job. The job and the advanced
queue can be named using the JOB_NAME parameter. Cancelling the client process does not stop
the associated data pump job. Issuing "ctrl+c" on the client during a job stops the client output and
presents a command prompt. Typing "status" at this prompt allows you to monitor the current job.
Export> status
Job: SYS_EXPORT_FULL_01
Operation: EXPORT
Mode: FULL
State: EXECUTING
Bytes Processed: 0
Current Parallelism: 1
Job Error Count: 0
Dump File: D:\TEMP\DB10G.DMP
bytes written: 4,096
Worker 1 Status:
State: EXECUTING
Object Schema: SYSMAN
Object Name: MGMT_CONTAINER_CRED_ARRAY
Object Type: DATABASE_EXPORT/SCHEMA/TYPE/TYPE_SPEC
Completed Objects: 261
Total Objects: 261
Data pump performance can be improved by using the PARALLEL parameter. This should be used
in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dumpfiles to be
created or read. The same wildcard can be used during the import to allow you to reference
multiple files.
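For example, a parallel schema export writing up to four dump files might be (the file names are assumptions):

```text
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log
```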
Data Pump API
Along with the expdp and impdp utilities, Oracle provides a PL/SQL API (DBMS_DATAPUMP) that
can be used to perform the same operations. The following block performs a schema export of the
SCOTT schema.
DECLARE
  l_dp_handle NUMBER;
BEGIN
  -- Open a schema export job.
  l_dp_handle := DBMS_DATAPUMP.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'SCOTT_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the export to the SCOTT schema.
  DBMS_DATAPUMP.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''SCOTT''');

  DBMS_DATAPUMP.start_job(l_dp_handle);
  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
Once the job has started, the status can be checked.
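For example, by re-attaching an expdp client to the job, or by querying the data pump views. The SCOTT_EXPORT job name is an assumption; substitute whatever JOB_NAME, or system-generated name, your job has.

```text
expdp scott/tiger@db10g attach=SCOTT_EXPORT

SELECT * FROM dba_datapump_jobs;
```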
External Tables (Unloading/Loading Data Using External Tables)
Data can be unloaded into an external table's dump file using the ORACLE_DATAPUMP access
driver in a "CREATE TABLE ... ORGANIZATION EXTERNAL ... AS SELECT" statement. The syntax to
create an external table pointing to an existing file is similar, but without the "AS" clause. In this
case we will do it in the same schema, but it could be in a different schema in the same instance,
or in an entirely different instance.
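A sketch of the DDL, pointing an external table at an existing dump file in TEST_DIR. The table and file names are assumptions; the column list mirrors the standard SCOTT.EMP table.

```sql
CREATE TABLE emp_xt (
  empno    NUMBER(4),
  ename    VARCHAR2(10),
  job      VARCHAR2(9),
  mgr      NUMBER(4),
  hiredate DATE,
  sal      NUMBER(7,2),
  comm     NUMBER(7,2),
  deptno   NUMBER(2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY test_dir
  LOCATION ('emp_xt.dmp')
);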
Help
The HELP=Y option displays the available parameters.
expdp
expdp help=y
The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

   Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:
Command Description
------------------------------------------------------------------------------
FILESIZE Default filesize (bytes) for subsequent ADD_FILE commands.
REUSE_DUMPFILES Overwrite destination dump file if it exists (N).
Oracle 11g Release 2 (11.2) altered the format of the help output as well as adding the following
parameters.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC
resources.
SOURCE_EDITION
Edition to be used for extracting metadata.
Oracle 12c Release 1 (12.1) added the following parameters.
ABORT_STEP
Stop the job after it is initialized or at the indicated object.
Valid values are -1 or N where N is zero or greater.
N corresponds to the object's process order number in the master table.
ACCESS_METHOD
Instructs Export to use a particular method to unload data.
Valid keyword values are: [AUTOMATIC], DIRECT_PATH and EXTERNAL_TABLE.
COMPRESSION_ALGORITHM
Specify the compression algorithm that should be used.
Valid keyword values are: [BASIC], LOW, MEDIUM and HIGH.
ENCRYPTION_PWD_PROMPT
Specifies whether to prompt for the encryption password.
Terminal echo will be suppressed while standard input is read.
KEEP_MASTER
Retain the master table after an export job that completes successfully [NO].
MASTER_ONLY
Import just the master table and then stop the job [NO].
METRICS
Report additional job information to the export log file [NO].
VIEWS_AS_TABLES
Identifies one or more views to be exported as tables.
For example, VIEWS_AS_TABLES=HR.EMP_DETAILS_VIEW.
STOP_WORKER
Stops a hung or stuck worker.
TRACE
Set trace/debug flags for the current job.
impdp
impdp help=y
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

   Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
Oracle 11g Release 2 (11.2) altered the format of the help output as well as adding the following
parameters.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC
resources.
SOURCE_EDITION
Edition to be used for extracting metadata.
TARGET_EDITION
Edition to be used for loading metadata.
ABORT_STEP
Stop the job after it is initialized or at the indicated object.
Valid values are -1 or N where N is zero or greater.
N corresponds to the object's process order number in the master table.
ACCESS_METHOD
Instructs Import to use a particular method to load data.
Valid keyword values are: [AUTOMATIC], CONVENTIONAL and DIRECT_PATH.
ENCRYPTION_PWD_PROMPT
Specifies whether to prompt for the encryption password.
Terminal echo will be suppressed while standard input is read.
KEEP_MASTER
Retain the master table after an import job that completes successfully [NO].
MASTER_ONLY
Import just the master table and then stop the job [NO].
METRICS
Report additional job information to the import log file [NO].
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
VIEWS_AS_TABLES
Identifies one or more views to be imported as tables.
For example, VIEWS_AS_TABLES=HR.EMP_DETAILS_VIEW.
Note that in network import mode, a table name may be appended
to the view name.
REMAP_DIRECTORY
Remap directories when you move databases between platforms.
STOP_WORKER
Stops a hung or stuck worker.
TRACE
Set trace/debug flags for the current job.