Re: Exporting to Unix file bigger than 2G.

From: Thomas Kyte <tkyte_at_us.oracle.com>
Date: Mon, 08 Mar 1999 16:28:08 GMT
Message-ID: <36e5f9d0.11919909@192.86.155.100>


A copy of this was sent to "Pedro Tavares" <pedrotavares_at_cmvm.pt> (if that email address didn't require changing)

On Mon, 8 Mar 1999 15:28:25 -0000, you wrote:

>What should I do in order to export the database to a Unix file that exceeds
>2GB?
>Thanks
>
>Pedro Tavares
>C.M.V.M.
>Lisbon, Portugal
>

In addition to the other post with the "how", here is an example that does it through a named pipe, using split and gzip to end up with n 500meg compressed files (where n varies depending on the size of the export)....

It exports the whole database and then does an import with show=y to test the integrity of the export file as much as possible (it also shows how to re-import this munged up file)...
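(For reference, the whole trick is a named pipe: exp just thinks it is writing an ordinary dump file, while a background gzip/split reads from the pipe, so no single file on disk ever hits the 2gig mark. A stripped-down sketch of just that part -- the connect string and pipe name here are only placeholders:)

#!/bin/csh
setenv PIPE /tmp/exp_pipe.dmp

rm -f $PIPE
mknod $PIPE p

# compress and split the stream into 500meg pieces, in the background
gzip < $PIPE | split -b 500m - expdat.dmp.gz. &

# exp writes into the pipe as if it were a normal dump file
exp userid=scott/tiger full=y file=$PIPE

rm -f $PIPE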

#!/bin/csh -vx

# UID = userid/password for exp; FN = base name of the compressed, split
# dump pieces (day-of-year and year in the name); PIPE = the named pipe
setenv UID sys/o8isgr8
setenv FN exp.`date +%j_%Y`.dmp.gz
#setenv FN exp.`date +%j_%Y`.dmp.Z
setenv PIPE /tmp/exp_tmp.dmp

# pick gzip or compress (make the FN suffix above match)
#setenv COMPRESS compress
setenv COMPRESS gzip

echo $FN

cd /nfs/atc-netapp1/expbkup2
ls -l

# clean up the last run's logs and dump pieces, then (re)create the named pipe
rm expbkup.log export.test exp.*.dmp.{gz,Z}.* $PIPE
mknod $PIPE p

date > expbkup.log
# compress-and-split reads from the pipe in the background while
# exp writes the full export into the pipe
$COMPRESS < $PIPE | split -b 500m - $FN. &
exp userid=$UID buffer=20000000 file=$PIPE full=y >>& expbkup.log
date >> expbkup.log

date > export.test
# to test it: reassemble the pieces, uncompress them back into the pipe,
# and run a show-only import against the pipe
cat `echo $FN.* | sort` | zcat > $PIPE &
imp userid=sys/o8isgr8 file=$PIPE show=y full=y >>& export.test
date >> export.test

ls -l
rm -f $PIPE  
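
And when you need to do a real import (show=y above only reads the file), the same pipe works in the other direction -- something along these lines, assuming the same $FN/$UID/$PIPE settings from the script above (swap in whatever imp options you actually need, e.g. fromuser/touser; ignore=y here just skips "already exists" errors):

rm -f $PIPE
mknod $PIPE p

# glue the pieces back together and uncompress them into the pipe
cat `echo $FN.* | sort` | zcat > $PIPE &

# a "real" full import this time, instead of show=y
imp userid=$UID file=$PIPE full=y ignore=y >>& import.log

rm -f $PIPE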

Thomas Kyte
tkyte_at_us.oracle.com
Oracle Service Industries
Reston, VA USA

--
http://govt.us.oracle.com/ -- downloadable utilities  



Opinions are mine and do not necessarily reflect those of Oracle Corporation

