Oracle FAQ Your Portal to the Oracle Knowledge Grid

RE: Export Dump of Large Table (Size = 200 GB) in 9.2 & Import into 10.2?

From: VIVEK_SHARMA <VIVEK_SHARMA_at_infosys.com>
Date: Tue, 20 Nov 2007 17:29:07 +0530
Message-ID: <69E1360E54B50C4A828A136C158E4742036754AB3A@BLRKECMBX02.ad.infosys.com>

Folks

I need to take an export dump of a large (200 GB) table in the minimum possible time.

NOTE - Table is range partitioned by date.

Some options:

  1. Should multiple export processes be fired concurrently on different date ranges, thereby exporting multiple partitions / date ranges in parallel? Any sample examples or links showing "exp with a WHERE condition for a partition / conditional exports"? If so, will this slow down the export process? The resulting dumps should also be usable by multiple concurrent imports.
  2. Alternatively, using a single export process, can the dump be broken into multiple smaller dump files instead of one single big dump file during exp? Any sample examples of splitting a large export dump file by size? Can the multiple dump files thus obtained be imported concurrently by multiple import processes, or not? Is there any limitation on the maximum size of an export dump file in 9.2?
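For what it's worth, both approaches can be sketched with 9.2 exp/imp syntax roughly as below. This is only a sketch: the credentials, file paths, partition names (p_2007q1, p_2007q2) and column name (sale_date) are placeholders I've assumed, not details from the post.

```shell
# Option 1: one exp per partition, fired concurrently.
# TABLES=table:partition exports a single partition (9.2 syntax).
exp scott/tiger TABLES=sales:p_2007q1 FILE=sales_q1.dmp LOG=sales_q1.log &
exp scott/tiger TABLES=sales:p_2007q2 FILE=sales_q2.dmp LOG=sales_q2.log &
wait

# Same idea with a WHERE condition instead of a partition name.
# Note: QUERY forces conventional-path export (it cannot be combined
# with DIRECT=Y), so it is typically slower; a parfile avoids the
# shell-escaping of quotes shown here.
exp scott/tiger TABLES=sales \
    QUERY=\"WHERE sale_date \< TO_DATE('2007-04-01','YYYY-MM-DD')\" \
    FILE=sales_q1.dmp LOG=sales_q1.log

# Option 2: a single exp split into fixed-size pieces with FILESIZE.
exp scott/tiger TABLES=sales DIRECT=Y \
    FILE=(sales_1.dmp,sales_2.dmp,sales_3.dmp) FILESIZE=2048M \
    LOG=sales.log

# The FILESIZE pieces form ONE logical dump that imp reads in sequence;
# the whole set must be given to a single imp, so the pieces cannot be
# imported concurrently.
imp scott/tiger TABLES=sales \
    FILE=(sales_1.dmp,sales_2.dmp,sales_3.dmp) FILESIZE=2048M \
    LOG=imp_sales.log
```

If I understand the tools correctly, only option 1 gives you concurrently importable files: each per-partition dump is an independent file, so multiple imp sessions (with IGNORE=Y, since the table will already exist after the first one) can run against the same table in parallel, whereas the FILESIZE pieces from option 2 are sequential fragments of a single dump.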

NOTE - Source database: version 9.2, single-byte character set, NOT production. Target database: version 10.2, multi-byte character set, NOT production (internal DB).

NOTE - Source & target databases are in different, non-networked geographies.

Cheers & Thanks

Vivek

--
http://www.freelists.org/webpage/oracle-l
Received on Tue Nov 20 2007 - 05:59:07 CST

