Oracle FAQ | Your Portal to the Oracle Knowledge Grid
Home -> Community -> Usenet -> c.d.o.server -> Help with Import / Export
Hi All,
I need some help with this import problem. We exported our entire database using the following script:
DIR1=$EXPORT_PATH/$ORACLE_SID/$ORACLE_SID.p1.Z
DIR2=$EXPORT_PATH/$ORACLE_SID/$ORACLE_SID.p2.Z
MAXFILESIZE=2000m
mkfifo exportpipeaa
mkfifo exportpipeab
mkfifo wrkpipe
# start reader from first pipe to create export-file1
dd if=exportpipeaa of=$DIR1 &
# start reader from second pipe to create export-file2
dd if=exportpipeab of=$DIR2 &
# start reader from wrkpipe to compress the export stream and split
# into three pieces of size = MAXFILESIZE (max)
dd if=wrkpipe | compress | split -b $MAXFILESIZE - exportpipe &
# start the real export now into the workpipe (wrkpipe)
exp $SYSTEMPASSWD buffer=1000000 file=wrkpipe direct=y \
  log=$BACKUP_PATH/$ORACLE_SID/exp.log full=y
# clean up the stuff
rm exportpipeaa exportpipeab wrkpipe
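[Editor's note: the trick in the script above is that split names its output files exportpipeaa, exportpipeab, ..., and those names have been pre-created as FIFOs, so each piece streams into a waiting dd reader instead of landing as a regular file. A minimal, self-contained sketch of the same mechanism with dummy data (all names here are placeholders, not from the original script):]

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

seq 1 10000 > original          # dummy payload standing in for the export stream

mkfifo demopipeaa demopipeab    # placeholder FIFOs matching split's aa/ab suffixes

# readers: drain each FIFO into an ordinary file, like the dd readers above
dd if=demopipeaa of=piece1 2>/dev/null &
dd if=demopipeab of=piece2 2>/dev/null &

# writer: split names its outputs demopipeaa, demopipeab, ... so each
# piece is written straight into the matching waiting reader
split -b 30k original demopipe
wait

# concatenating the pieces in order reproduces the original byte stream
cat piece1 piece2 > rejoined
cmp original rejoined && echo "pieces reassemble OK"
```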
This produces about 1.5 GB of compressed export data. I need to import one table from this export, but I cannot uncompress it first because the uncompressed file is too big for any one filesystem we have. So I tried this import script:
DIR1=/tmp/DR_CASHT.p1.Z
mkfifo wrkpipe
umask 000
imp $SYSTEMPASSWD buffer=4096000 commit=y ignore=y \
  file=wrkpipe tables=audit_trail log=/tmp/imp.log &
cat $DIR1 | uncompress | dd of=wrkpipe
What happens is I get a segmentation fault and a core dump.
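[Editor's note: whether or not it explains the segfault, the export script above split the compressed stream into several .Z pieces, while the import script feeds only the first piece ($DIR1) to uncompress; imp needs to see the whole stream. A minimal sketch with dummy data, using gzip as a stand-in for compress (both emit a plain byte stream), showing that concatenating the pieces in order restores a single decompressable stream:]

```shell
set -e
cd "$(mktemp -d)"
seq 1 5000 > payload                  # stand-in for the export stream

# compress and split the stream into pieces, as the export script does
# (gzip stands in for compress(1) here; dummy piece names)
gzip -c payload | split -b 8k - payload.Z.

# the split pieces are just consecutive bytes: concatenated in order,
# they form a valid compressed stream again, decompressable in one pass
cat payload.Z.* | gzip -dc > restored
cmp payload restored && echo "roundtrip OK"
```

[In the real import this would mean feeding every piece, in order, into the pipe: cat of all p?.Z files rather than only p1.Z.]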
I really need the data in this one table. Can anyone offer suggestions, or perhaps a Perl script that can walk through the export file?
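[Editor's note: short of a full import, a dump can at least be scanned while streaming, without ever landing the uncompressed file on disk. Export dumps generally carry readable DDL markers between the binary row data, but the exact marker text below is an assumption, and the sketch fakes the dump with a tiny stand-in stream (gzip again standing in for compress):]

```shell
set -e
cd "$(mktemp -d)"

# Fake stand-in for a compressed, split export: real exp dumps contain
# readable text such as TABLE "AUDIT_TRAIL" (marker format is an assumption)
printf 'binary...TABLE "AUDIT_TRAIL"...row data...' | gzip -c | split -b 1k - dump.Z.

# stream all pieces in order, decompress on the fly, and scan for the
# table marker -- nothing uncompressed is ever written to disk
cat dump.Z.* | gzip -dc | strings | grep 'TABLE "AUDIT_TRAIL"'
```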
Thanks,
Arthur

Received on Wed Aug 14 2002 - 15:22:47 CDT