Re: 2G Max on export file
Hi Gary,
The disadvantage of writing large export files is that reading them back takes
a long time. You probably have your database split across several tablespaces.
Why not write one export file per tablespace that contains only the rows
(indexes=n constraints=n), plus a single file that contains only the
constraints and indexes (rows=n)?
You can write these files in parallel and import them in parallel too. With a
little more tweaking you can also save some of the time spent reading the
constraints file (which mostly goes into building indexes).
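A minimal sketch of that split, assuming exp is on the PATH; the credentials
(system/manager) and table lists are made-up placeholders, and RUN=echo turns
it into a dry run that only prints the commands:

```shell
#!/bin/sh
# Dry-run sketch: one rows-only export per tablespace, started in parallel,
# plus a single export carrying only the constraints and indexes.
# All names below are hypothetical, not taken from the original post.
RUN=echo        # set RUN= (empty) to actually invoke exp
{
  # Rows only, one background job per tablespace:
  $RUN exp system/manager file=ts_users_rows.dmp tables=emp,dept \
       rows=y indexes=n constraints=n &
  $RUN exp system/manager file=ts_data_rows.dmp tables=orders,items \
       rows=y indexes=n constraints=n &
  wait          # both row exports run concurrently
  # Definitions only: constraints and indexes, no row data:
  $RUN exp system/manager file=ddl_only.dmp full=y rows=n
} > exp_cmds.log
cat exp_cmds.log
```

On import, the row files can likewise be loaded in parallel, with the
definitions file applied last.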
On Tue, 6 Apr 1999 19:46:06 +0200, Vitaliy Mogilevskiy wrote
(in message <370A485D.8F99FA20_at_earthlink.net>):
snip
> You could export to a tape device or, if you want to go to disk, you can use a
> pipe to export to 'compress' or 'split' (or compress and then split). For
> speed, I export to split, creating a series of 500m files. Here is an
snip
> this script exports the full database into a series of 500m files. It then
> 'reconstructs' the original file after the export is done using cat into the
> pipe and tests the integrity of the export using IMP show=y....
snip
> gcb45_at_dial.pipex.com wrote:
>
>> I know that there is a 2G ceiling on UNIX files. One of my database export
>> files is approaching the 2G limit and it is already compressed on the fly.
>> Does anyone have any suggestions as to what should be done to deal with
>> this problem?
>>
>> Any help would be appreciated. Thanks, Gary
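The pipe-to-split trick Vitaliy describes can be sketched with stock Unix
tools; here a dummy data stream stands in for exp's output, gzip stands in for
compress, and 1k chunks stand in for the 500m files of the original script:

```shell
#!/bin/sh
# Demonstrates the mechanics: write through a named pipe into a
# compressor-plus-split reader, then cat the chunks back to verify the stream.
set -e
mkfifo exp.pipe

# Reader side: compress the stream and cut it into fixed-size chunks.
( gzip -c < exp.pipe | split -b 1k - exp.dmp.gz. ) &

# Writer side -- the role exp plays -- pushes 10000 bytes into the pipe.
head -c 10000 /dev/urandom > exp.pipe
wait

# Reconstruct the original stream by cat-ing the chunks back through the
# decompressor; in the real script this stream would feed 'imp show=y'.
cat exp.dmp.gz.* | gzip -dc > reconstructed.dmp
wc -c < reconstructed.dmp
```

With the sizes scaled up, the same pattern keeps every piece on disk under the
2G limit while the export itself sees one continuous stream.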
--
The best way to accelerate a computer running Windows is at 9.8 m/s^2
Received on Tue Apr 06 1999 - 23:20:12 CDT