Oracle Data Pump - Part II

By James Koopmann

Since we are all familiar with Oracle's original export utility (exp), and since, in my opinion, Data Pump will soon replace exp, I thought it would be good to get familiar with this utility through some relatively simple Data Pump exports (expdp) that mirror the way we have used exp in the past; in particular, the FULL export.

As a word of caution, dump files created by Data Pump export (expdp) are not compatible with exp. So as you go forth and play with this utility, please at least name the exports something that signifies the dump file was created by expdp, so that you won't get confused. Also, since this utility is not backward compatible, if you have any databases prior to 10g and you are using exp, you may want to hold off on implementing the new expdp utility, as you will not be able to import into any pre-10g databases.

Where are my dmp files?

Unlike Oracle's original export utility, Data Pump runs only on the server side. You may initiate the export from a client, but the job(s) themselves run inside an Oracle server. No dump files (expdat.dmp) or log files will be created on your local machine if you initiate a Data Pump export (expdp) there. But, as we will see through a couple of examples, if you have an Oracle server on your local machine, you can get Oracle to produce dump files there.

Oracle creates dump and log files through DIRECTORY objects, so before you can use Data Pump you must create a DIRECTORY object. There are a few "default" mechanisms by which Oracle determines an appropriate DIRECTORY to use, mostly through environment variables and a default directory name that Oracle will look for. But as we all should know, we should not leave this to chance; instead, explicitly create and use a directory object name of your own choice. Relying on a default object, a DIRECTORY in this case, opens you up to security breaches, and this practice should be avoided. So here I have logged into my S1 database and will create a DIRECTORY named datapump.

SQL-S1> CREATE DIRECTORY datapump AS 'C:\user\datafile\datapump';

Then, as you use Data Pump, you can reference this DIRECTORY as an export parameter indicating where you would like the dump and log files to end up. It is good to note here that, as dump and log files are created, log files will overwrite existing log files of the same name, but a dump file with the same name as an existing one will only raise an error condition and cause the Data Pump job to fail. This was not the case with Oracle's original export utility (exp), where subsequent exports would overwrite all files. With Data Pump this is a nice safeguard, but it can also create problems for those of us who did nightly exports to the same location and file names. Now we have to think about cleanup routines: a small price to pay for a safeguard that could save your life one day when the cleanup routine fails.
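One simple workaround, sketched here with a hypothetical date-stamped file name (the date shown is illustrative), is to embed the run date in the DUMPFILE and LOGFILE names so that nightly exports never collide; older files can then be aged out by a separate cleanup script on your own schedule.

E:> expdp jkoop/pwd FULL=y DIRECTORY=datapump DUMPFILE=expdata_20060101.dmp LOGFILE=expdata_20060101.log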

Just like the original exp utility, Data Pump requires some authorization to allow users to export. Here I am granting EXP_FULL_DATABASE to a user JKOOP on database S1, which will allow the user to perform a full database export; without it, the JKOOP user could only export his own schema. I also need to grant READ and WRITE privileges on the recently created DIRECTORY, again on database S1.
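The role grant described above looks like this:

SQL-S1> GRANT EXP_FULL_DATABASE TO jkoop;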

SQL-S1> GRANT READ, WRITE ON DIRECTORY datapump TO jkoop;

Now for the Examples

We are all familiar with the FULL database export. Data Pump easily performs this with the following command line. Notice there are just a few name changes, and instead of specifying the directory path within the file locations, you supply the additional DIRECTORY parameter. This command line assumes you are on the database server and that environment variables are properly set for a direct connection.

E:> expdp jkoop/pwd FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log

We have also used the exp utility to connect through a TNS entry to perform an export on a remote database. Data Pump can easily do this too, by adding a connection identifier to the user/password parameter, exactly the same way it was done in exp.

E:> expdp jkoop/pwd@S1 FULL=y DIRECTORY=datapump DUMPFILE=byTNS.dmp LOGFILE=byTNS.log

Now for a few export tricks. These next two examples assume an additional database named S2 and rely on a connection, through a database link, to the target database that we want to export. So the first thing to do is create a database link.
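The link itself is not printed in the article; a minimal sketch, created on S2 and pointing back at S1 (the connection credentials shown are illustrative, reusing the jkoop account from earlier), might look like this:

SQL-S2> CREATE DATABASE LINK S1 CONNECT TO jkoop IDENTIFIED BY pwd USING 'S1';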


The key item to remember with Data Pump, and with where files will end up, is that wherever your Data Pump job runs, it requires a DIRECTORY in which to place dump and log files. Since we will be connecting to the S2 database, a DIRECTORY for placing these files in will be required there. Here I create a new DIRECTORY named mydump on database S2.

SQL-S2> CREATE DIRECTORY mydump AS 'D:\mydump';

Now for the command line options. Here we are running on the server where database S2 resides and will be producing a full dump of database S1 through the NETWORK_LINK, but placing the dump and log files on the server where database S2 resides. This was great news for me, as when I first read the documentation I thought all dumps would have to reside on the server where the source database resided. Now I can almost produce an environment where a single database is a 'gateway' for my database exports if needed.

E:> expdp jkoop/pwd FULL=y DIRECTORY=mydump NETWORK_LINK=S1 DUMPFILE=byDB.dmp LOGFILE=byDB.log

OK, suppose we do produce that gateway for exports. Do we need to execute all commands from that server? No! With Data Pump we need only connect to the S2 database through a TNS entry and then supply the appropriate NETWORK_LINK to the database we want to export.
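Following the pattern of the earlier examples, that command (a sketch, since the article does not print it; the dump and log file names are illustrative) would look something like this, run from any client with a TNS entry for S2. The files again land in the mydump DIRECTORY on the S2 server.

E:> expdp jkoop/pwd@S2 FULL=y DIRECTORY=mydump NETWORK_LINK=S1 DUMPFILE=byBoth.dmp LOGFILE=byBoth.log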


Introduction to Monitoring Data Pump

A simple way to gain insight into the status of a Data Pump job is to look at a few views maintained within the Oracle instance in which the Data Pump job is running. These views are DBA_DATAPUMP_JOBS, DBA_DATAPUMP_SESSIONS, and V$SESSION_LONGOPS. These views are critical in the monitoring of your export jobs so that, as we will see in a later article, you can attach to a Data Pump job and modify its execution.

DBA_DATAPUMP_JOBS shows the active Data Pump jobs, their state, degree of parallelism, and the number of attached sessions.

SQL> select * from dba_datapump_jobs

OWNER_NAME JOB_NAME               OPERATION  JOB_MODE   STATE         DEGREE    ATTACHED_SESSIONS
---------- ---------------------- ---------- ---------- ------------- --------- -----------------
JKOOP      SYS_EXPORT_FULL_01     EXPORT     FULL       EXECUTING     1          1
JKOOP      SYS_EXPORT_SCHEMA_01   EXPORT     SCHEMA     EXECUTING     1          1

DBA_DATAPUMP_SESSIONS gives the SADDR (session address) that assists in determining why a Data Pump session may be having problems. Join it to the V$SESSION view for further information.

SQL> select * from dba_datapump_sessions

OWNER_NAME JOB_NAME                       SADDR
---------- ------------------------------ --------
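That join to V$SESSION can be sketched as follows (the column list selected here is illustrative; pick whatever session details you need):

SQL> select s.sid, s.serial#, s.status
       from v$session s, dba_datapump_sessions d
      where s.saddr = d.saddr;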

V$SESSION_LONGOPS helps determine how well a Data Pump export is progressing; it basically gives you a progress indicator through the MESSAGE column.

SQL> select username,opname,target_desc,sofar,totalwork,message from V$SESSION_LONGOPS

USERNAME OPNAME               TARGET_DESC SOFAR TOTALWORK MESSAGE
-------- -------------------- ---------- ----- ---------- ------------------------------------------------
JKOOP    SYS_EXPORT_FULL_01   EXPORT       132        132 SYS_EXPORT_FULL_01:EXPORT:132 out of 132 MB done
JKOOP    SYS_EXPORT_FULL_01   EXPORT        90        132 SYS_EXPORT_FULL_01:EXPORT:90 out of 132 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        17         17 SYS_EXPORT_SCHEMA_01:EXPORT:17 out of 17 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        19         19 SYS_EXPORT_SCHEMA_01:EXPORT:19 out of 19 MB done

The original export utility (exp) may or may not be going away soon. The documentation clearly states that Data Pump handles data types that exp does not, and that we should begin our migration to this new utility, except in those instances where we must move data between 10g and pre-10g databases. This article stepped through the process of performing FULL exports, as these are typical in Oracle environments. If you are doing schema or table exports, the change is simple, and we will visit those in subsequent parts of this series.


However, you still have a problem when you are looking to very quickly extract data from Oracle into a flat, portable format rather than into binary. Data Pump extracts data into binary in a similar way to export. This leads to a slower import cycle (it takes longer to import binary data than it would if the data were in ASCII). For instance, FastReader from WisdomForce allows you to quickly export data from large tables into CSV format, or even into a pipe, which makes the data movement process much more convenient and robust. Oracle still has a lot to accomplish here.



Extremely efficient write-up; it pointed out how I can use Data Pump to maximise data exports. However, the number of Oracle 7 and 9i databases I have will really stop me from being able to use it for a year or so.

Thanks for all the advice in the article though.


Informative article, and we set upon testing this out. Has anyone found this to be slower than normal import/export? Am I missing something?


When I wrote the syntax for importing data using Data Pump,
impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log fromuser=test touser=test1,
I got error message LRM-00101, and I don't understand why. Please help.

Hey Koop! Thanks as always for the help. I found this little nugget and thought it might be helpful for folks trying to kill a DP job using the Data Pump API:

SET serveroutput on
SET lines 100
DECLARE
   l_handle NUMBER;
BEGIN
   -- Format: DBMS_DATAPUMP.ATTACH('[job_name]','[owner_name]');
   l_handle := DBMS_DATAPUMP.ATTACH('SYS_EXPORT_FULL_01','JKOOP');
   DBMS_DATAPUMP.STOP_JOB (l_handle, 1, 0);
END;
/


You created a directory with SQL-S2> CREATE DIRECTORY mydump AS 'D:\mydump';
Should it be SQL-S1> CREATE DIRECTORY mydump AS 'D:\mydump';, because we are storing our dump files and log files on server S1?

Please explain.
Thanks in advance