Re: Loading in a compressed dump file

From: Chris Hamilton <toneczar_at_erols.com>
Date: Thu, 18 Jun 1998 13:13:17 GMT
Message-ID: <6mb3rm$59k$1_at_goo.nwd.usace.army.mil>


reb_at_shell5.ba.best.com (TimeVest Inc.) wrote:

>So apparently a large database (6 gigs) was dumped in compressed
>format (.Z) to disk and is around 400 MBs. The file was created on
>Solaris 2.5.
>
>I copied the compressed file over to a 9 gig NTFS drive on an NT.
>Using the mks uncompress I tried to uncompress it. It created a 2 gig
>file. I tried again with gzip_nt.exe and it also created a 2 gig file.
>
>How am I supposed to load this database into the database if I can't
>even uncompress the dump file to its full 6 gigs?
>
>Any ideas?!

I think your best bet is to create a tnsnames.ora entry on the Solaris box, pointing to the database on NT. Then run the "imp" utility from Solaris, loading the data into the NT database over the network.
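Roughly like this - the alias, host name, SID, and password below are just placeholders, so adjust them for your setup. In tnsnames.ora on the Solaris side:

   NTDB.world =
     (DESCRIPTION =
       (ADDRESS = (PROTOCOL = TCP)(HOST = nt-server)(PORT = 1521))
       (CONNECT_DATA = (SID = ORCL))
     )

Then, from Solaris, run imp against the uncompressed dump file and connect to the NT database through that alias:

   imp system/manager@NTDB.world file=full.dmp full=y log=imp_full.log

That keeps the dump file on the Solaris side, where it was created, and avoids having to uncompress it on NT at all.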

Or maybe you can export the data into several smaller files. This wouldn't be an option if a single table is > 2 GB, though.
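For example, splitting the export by schema - the usernames here are made up, and this assumes the source database is still around to export from:

   exp system/manager file=scott.dmp owner=scott log=scott.log
   exp system/manager file=appuser.dmp owner=appuser log=appuser.log

Each dump file then only has to stay under the 2 GB mark on its own.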

I had a problem once with exp/imp on NT - exp73 unloaded the whole database, about 4 GB worth, with no errors in the log file. But when I went the next day to import the data back into a new database, I found that the file had actually been truncated at 2 GB; NT never notified exp73 that the file had been truncated.

Chris



Chris Hamilton, DBA
US Army Corps of Engineers
christopher.h.hamilton_at_usace.army.mil
http://www.serve.com/cowpb/chamilton.html
