RE: Multiple imports simultaneous?

From: Storey, Robert (DCSO) <RStorey_at_dcso.nashville.org>
Date: Mon, 4 Jan 2016 18:10:05 +0000
Message-ID: <FE4C2B093843BB4B873D754E5E0BE4DB01998C42C6_at_DCSOSVMS02.dcso.org>



Wish I could make your day.

But since I am migrating from a 9i system, I can only use the native imp/exp.

From: Seth Miller [mailto:sethmiller.sm_at_gmail.com]
Sent: Monday, January 04, 2016 11:07 AM
To: Storey, Robert (DCSO)
Cc: Kenny Payton; Oracle L
Subject: Re: Multiple imports simultaneous?

Robert,

Please make my day and tell me you are using Data Pump, not imp.

Seth Miller

On Mon, Jan 4, 2016 at 7:21 AM, Storey, Robert (DCSO) <RStorey_at_dcso.nashville.org> wrote:

Yeah, I’m exploring the option of just not doing the import now, and then doing it later by loading the data into a temp table and breaking it into smaller chunks, like you suggested…

From: Kenny Payton [mailto:k3nnyp_at_gmail.com]
Sent: Wednesday, December 30, 2015 3:21 AM
To: Storey, Robert (DCSO)
Cc: Oracle L
Subject: Re: Multiple imports simultaneous?

I would concentrate on separating out and improving the speed of the single table with the 12-hour import time. I have run into similar issues with tables with out-of-line CLOB data. The way I resolved it was to break the single-table export and import into several smaller jobs that can be run concurrently.

If you have a column that lends itself to breaking the table fairly evenly into many chunks, use that column. If it's a number column you can use the MOD function. If you don't have a good key, you can use the DBMS_ROWID package and MOD on something like the block number. I was able to minimize complexity a bit by taking this one table and converting the move into inserts across a database link, but you should also be able to use the QUERY option on expdp. I'm assuming this table is not partitioned; otherwise using the partition key would probably work nicely as well.
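The DBMS_ROWID/MOD idea above can be sketched as a set of per-chunk export parameter files. This is a minimal sketch, not Kenny's actual script: the owner, table, file names, and chunk count of 4 are all assumptions, and since the source here is 9i it uses the legacy exp QUERY parameter rather than expdp.

```shell
# Hypothetical sketch: generate four parameter files, each selecting a
# disjoint bucket of OWNER.BIG_TABLE rows by MOD of the row's block
# number, so four exp jobs can later run concurrently. Names are assumed.
for i in 0 1 2 3; do
  cat > chunk_${i}.par <<EOF
tables=OWNER.BIG_TABLE
file=big_table_chunk_${i}.dmp
log=big_table_chunk_${i}.log
query="WHERE MOD(DBMS_ROWID.ROWID_BLOCK_NUMBER(ROWID), 4) = ${i}"
EOF
done
# Then run the exports concurrently against the source, e.g.:
#   exp userid=... parfile=chunk_0.par &   (likewise for 1..3, then wait)
```

Each bucket covers roughly a quarter of the table's blocks, so the chunks come out fairly even regardless of key distribution.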

In my experience the only way I could obtain dramatic import speed improvements was to break the single table into many smaller concurrently running jobs.

Kenny
On Dec 29, 2015 4:29 PM, "Storey, Robert (DCSO)" <RStorey_at_dcso.nashville.org> wrote:

Just curious about import.

I’m importing an older 9i system into an 11g system. I am precreating the tablespaces, etc.

Ultimately, the import only cares about 3 schemas: one that only has my PL/SQL, and the other two have the data. So, Schema A, B, C. Schema B has about 360 tables and about 190 million rows total. Schema C has about 45 tables and about 35 million rows, of which 27 million are in one table that has an XMLTYPE column. Importing just that one table takes about 12 hours. I’m working on methods to trim that time.

But is it possible to do multiple imports at once if they use different inbound schemas?

  1. Export the database to create my dump file.
  2. On the target server, make 3 copies of the dump file and run the imports concurrently:
    1. Do an import of Schema A, rows=y, indexes=n, constraints=n
    2. Do an import of Schema B, rows=y, indexes=n, constraints=n
    3. Do an import of Schema C, rows=y, indexes=n, constraints=n
  3. Do an import for each schema where rows=n, and indexes and constraints=y.
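The two-pass plan above could be sketched as a driver script. This is a hedged sketch only: it assumes the data pass uses rows=y and the second pass builds indexes and constraints, and the schema names, dump-file names, and userid are placeholders, not anything from the original thread.

```shell
# Hypothetical sketch: write a driver script that runs three data-only
# imports concurrently (pass 1), then builds indexes and constraints
# (pass 2). Schema/file names and credentials are assumptions.
cat > run_imports.sh <<'EOF'
#!/bin/sh
# Pass 1: load data only, three schemas in parallel against their copies.
for s in A B C; do
  imp userid=system/CHANGE_ME file=copy_${s}.dmp fromuser=${s} touser=${s} \
      rows=y indexes=n constraints=n log=imp_data_${s}.log &
done
wait
# Pass 2: no rows; build indexes and enable constraints per schema.
for s in A B C; do
  imp userid=system/CHANGE_ME file=copy_${s}.dmp fromuser=${s} touser=${s} \
      rows=n indexes=y constraints=y log=imp_ddl_${s}.log
done
EOF
chmod +x run_imports.sh
```

Deferring indexes and constraints to a second pass keeps the concurrent data loads from maintaining indexes row by row, which is usually where a big single-table import burns its time.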

Theoretically, these imports should not interfere with each other. I can set the database to NOARCHIVELOG and increase the redo logs so that log-switch waits are reduced.
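The NOARCHIVELOG-plus-extra-redo setup described above might look something like the following. This is a configuration sketch, not part of the original plan: the redo log paths, sizes, and group numbers are assumptions, and ARCHIVELOG mode would need to be restored (with a fresh backup) after the load.

```shell
# Hypothetical sketch: put the target in NOARCHIVELOG and add redo
# groups before the big import. Paths, sizes, group numbers assumed.
sqlplus -s "/ as sysdba" <<'EOF'
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE NOARCHIVELOG;
ALTER DATABASE OPEN;
ALTER DATABASE ADD LOGFILE GROUP 4 ('/u01/oradata/redo04.log') SIZE 512M;
ALTER DATABASE ADD LOGFILE GROUP 5 ('/u01/oradata/redo05.log') SIZE 512M;
EOF
```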

Thoughts?

--
http://www.freelists.org/webpage/oracle-l
Received on Mon Jan 04 2016 - 19:10:05 CET