Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 

Re: how to exp big table with 2GB file size limitation?

From: John Matzen
Date: Thu, 14 Sep 2000 17:37:56 -0500
Message-ID: <ss2kjnt4ct676@corp.supernews.com>

Great! Thanks! At least with this idea I'll be able to split the output stream :-)

John
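The stream-splitting John mentions can be sketched by piping gzip's output through split, so no single file exceeds the 2GB limit. This is only a sketch: the paths and chunk size are placeholders, and a dummy producer stands in for exp so it runs anywhere (in real use the producer would be something like `exp system/password full=y file=$PIPE`).

```shell
#!/bin/sh
# Split the compressed export stream into chunks below the 2GB limit.
# /tmp/orapipe2 and /tmp/exp.dmp.gz.* are placeholder paths.

PIPE=/tmp/orapipe2
rm -f "$PIPE"
mkfifo "$PIPE"                  # mknod "$PIPE" p on older systems

# gzip compresses the pipe; split cuts the stream into pieces named
# /tmp/exp.dmp.gz.aa, .ab, ...  (1m chunks here so the sketch runs fast;
# in real use something like -b 1024m keeps each piece under 2GB)
gzip < "$PIPE" | split -b 1m - /tmp/exp.dmp.gz. &

# In real use:  exp system/password full=y file="$PIPE"
# Dummy 4 MB producer (random data compresses poorly, forcing a split):
head -c 4194304 /dev/urandom > "$PIPE"

wait                            # wait for the gzip|split pipeline
rm -f "$PIPE"

# To restore, reassemble the pieces in order and decompress:
#   cat /tmp/exp.dmp.gz.* | gzip -dc > dumpfile.dmp
```

Because split names the pieces alphabetically (.aa, .ab, ...), a plain shell glob reassembles them in the right order.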

"Big Al" <db-guru_at_att.net> wrote in message news:39C14A04.4AA97687_at_att.net...
> True, the worst compression ratio I get is about 4:1 (on used data, not
> allocated), and usually better than that. The smallest file that has
> failed to compress to less than 2GB for me is 11GB, and that was mainly
> because it contained a lot of raw columns. When files get over 12GB I
> start splitting into multiple exports.
>
> Big Al
>
> Sybrand Bakker wrote:
> >
> > Basically:
> > create a pipe with mknod,
> > fork a process which calls gzip reading its input from the pipe (gzip >
> > [compressed file] < orapipe &),
> > then exp your data to the pipe.
> > Your data will be compressed on the fly.
> > The compression factor is probably big enough to survive 10G (and usually
> > that 10G isn't completely occupied anyway).
> >
> > Hth,
> >
> > Sybrand Bakker, Oracle DBA
> >
> > "John Matzen" <krum(at)uo(d0t)com> wrote in message
> > news:ss2868fdct671_at_corp.supernews.com...
> > > I'm running Oracle 8i on Solaris 2.7. How can I back up my 10GB of data
> > > with exp if I have a 2GB file size limitation?
> > >
> > > Thanks!
> > >
> > > John
> > >
Received on Thu Sep 14 2000 - 17:37:56 CDT
