expdp via network_link fails when specifying parallel

From: Jesper Wolf Jespersen <jesper_at_remove.oz8ace.dk.remove.this>
Date: Tue, 03 Aug 2010 14:20:51 +0200
Message-ID: <4c5809cd$0$56795$edfadb0f_at_dtext02.news.tele.dk>



Hello All.

I am using Oracle 10.2.0.3 enterprise edition on Windows 2003 R2 X64 edition.

When I perform Data Pump exports via a database link with parallel > 1, my worker threads die.

The error message is: KUP-04038: internal error: kupax-meta1

It seems like it's always worker thread 2 that dies, and even though the other threads seem to continue, I do not trust the export that is being produced.

When using only a single worker thread there is no problem.
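To see which workers are still alive I can attach to the running job and ask for a status, roughly like this (SYS_EXPORT_SCHEMA_01 is only the default-style job name; the real name is in the log header or in DBA_DATAPUMP_JOBS):

expdp system/manager@dbs ATTACH=SYS_EXPORT_SCHEMA_01

Export> STATUS
Export> CONTINUE_CLIENT

That shows the state of each worker while the job runs, but it does not tell me whether the resulting dump is actually usable.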

The command line is like this:



expdp system/manager@dbs NETWORK_LINK=P34T_SYSTEM schemas=(panda,ecom) directory=DATA_PUMP_DIR dumpfile=exp%u.dmp logfile=PandaExport.log parallel=%EXP_PARALLEL% parfile=exp.par

The parfile tries to recreate the CONSISTENT=Y behaviour of the old export utility; its content is this:



FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

I haven't been able to find any information on this behaviour on the net.

I have used the same command line but without the network_link parameter to good effect, and parallel processing really speeds up the export.

But since I need to move data between servers, I would have to copy all the files from one server to the next after the export, which is hard to automate efficiently, since the driving site is a third node in the network.
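As far as I can tell, the copy step could also be done from inside the database with DBMS_FILE_TRANSFER, pushing each dump file over a database link to node 2. A rough sketch (TARGET_DB_LINK and the file name are just examples; DATA_PUMP_DIR must exist as a directory object on both databases):

BEGIN
  -- push one dump file from this server's DATA_PUMP_DIR
  -- to DATA_PUMP_DIR on the database at the other end of TARGET_DB_LINK
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'exp01.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'exp01.dmp',
    destination_database         => 'TARGET_DB_LINK');
END;
/

I have not tested whether that is any easier to drive from the third node than a plain file copy.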

In one setup we tried using dual-hosted disks, where one node can write and the other can read. That is no problem for the SAN, but Windows does not expect an NTFS file system to change on its own, so we have to unmount and remount the drive on node 2 after the export on node 1, before I can start the import on node 2.
That is also a task which is hard to automate.

Have any of you had similar experiences, and what did you do?

Greetings from Denmark
Jesper Wolf Jespersen
