Subject: Importing huge dump file (c.d.o.server)
Hi all,
I need your suggestions on importing a large dump file.
The compressed dump file is around 9 GB (more than 80 GB after gunzip).
This dump file contains around 100 tables that need to be imported.
I don't have the luxury of importing the entire dump file at once, as that
takes a lot of time and resources.
So I have decided to import a few tables at a time. I created several
parfiles with the TABLES= parameter and plan to run the import whenever I
get an opportunity (normally at night...).
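For illustration, one of those parfiles might look like the fragment below. The dump path, schema names, and table names here are hypothetical placeholders, not taken from the original post:

```
# import_batch1.par -- hypothetical example parfile for imp
FILE=/u01/exp/full_dump.dmp
FROMUSER=SRC_SCHEMA
TOUSER=DEST_SCHEMA
TABLES=(ORDERS,ORDER_ITEMS,CUSTOMERS)
IGNORE=Y
LOG=import_batch1.log
```

Each batch would then be run as something like `imp system/*** parfile=import_batch1.par`.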
I'm following the steps below to import:
However, I am not satisfied with the performance. Could somebody please share your thoughts on this process, along with recommendations to improve the performance of the import?
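As a side note, generating the per-batch parfiles by hand gets tedious with ~100 tables, so it can be scripted. A minimal sketch, assuming a plain-text list of table names (one per line); the dump path and schema names are hypothetical placeholders:

```shell
#!/bin/sh
# split_parfiles: split a table list into imp parfiles of N tables each
split_parfiles() {
  list=$1        # file with one table name per line
  per_batch=$2   # number of tables per parfile
  # split the list into fixed-size chunks: batch_aa, batch_ab, ...
  split -l "$per_batch" "$list" batch_
  i=1
  for chunk in batch_*; do
    # join the chunk's table names with commas for TABLES=(...)
    tables=$(paste -s -d, "$chunk")
    {
      echo "FILE=/u01/exp/full_dump.dmp"
      echo "FROMUSER=SRC_SCHEMA"
      echo "TOUSER=DEST_SCHEMA"
      echo "TABLES=($tables)"
      echo "IGNORE=Y"
      echo "LOG=import_batch$i.log"
    } > "import_batch$i.par"
    rm "$chunk"
    i=$((i + 1))
  done
}

# demo: five tables, two per batch -> import_batch1..3.par
printf 'ORDERS\nORDER_ITEMS\nCUSTOMERS\nINVOICES\nPAYMENTS\n' > tables.txt
split_parfiles tables.txt 2
```

Each generated parfile can then be fed to `imp ... parfile=import_batchN.par` on a separate night.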
I also have some concerns about the COMMIT=Y/N parameter; I'm a bit
confused about what this commit setting does.
I don't mind increasing the UNDO tablespace if needed.
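For context on that trade-off: as I understand it, with COMMIT=N (the default) imp commits once per table, so undo must be able to hold the inserts for the largest table, but a failed table rolls back cleanly; with COMMIT=Y imp commits after each buffer-sized array insert, keeping undo usage small at the cost of extra commit overhead and the risk of leaving a table partially loaded on failure. Hypothetical parfile fragments:

```
# Option 1: one commit per table (default) -- size UNDO for the
# largest table; a failure rolls the whole table back
COMMIT=N
BUFFER=20971520

# Option 2: commit after every buffer-sized array insert -- minimal
# UNDO usage, but a failure leaves the table partially loaded
COMMIT=Y
BUFFER=20971520
```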
Oracle Version Details:
Oracle9i Enterprise Edition Release 9.2.0.6.0 - Production
With the Partitioning option
OS: Linux 2.6.5-7.282-bigsmp #1 SMP
Thanks,
krish
Received on Fri Sep 07 2007 - 00:12:18 CDT