Re: Multiple imports simultaneous?

From: Seth Miller <sethmiller.sm_at_gmail.com>
Date: Mon, 4 Jan 2016 12:14:52 -0600
Message-ID: <CAEueRAXZ0ydkkxKAj9=9QL=pr_XH_jyYqkKMuo0uhjvupYKPGg_at_mail.gmail.com>



Yes, I missed that the data was coming out of 9i.

Seth Miller

On Mon, Jan 4, 2016 at 12:10 PM, Storey, Robert (DCSO) <RStorey_at_dcso.nashville.org> wrote:

> Wish I could make your day.
>
>
>
> But since I am migrating from a 9i system, I can only use the native
> imp/exp.
>
>
>
> *From:* Seth Miller [mailto:sethmiller.sm_at_gmail.com]
> *Sent:* Monday, January 04, 2016 11:07 AM
> *To:* Storey, Robert (DCSO)
> *Cc:* Kenny Payton; Oracle L
> *Subject:* Re: Multiple imports simultaneous?
>
>
>
> Robert,
>
>
>
> Please make my day and tell me you are using Data Pump, not imp.
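>
> (With Data Pump the whole load could be a single parallel job; a rough
> sketch, with the directory object and file names invented:
>
>   impdp system schemas=schema_a,schema_b,schema_c parallel=8 \
>     directory=dump_dir dumpfile=full_%U.dmp logfile=impdp_full.log
>
> No hand-splitting of dump files needed.)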
>
>
>
> Seth Miller
>
>
>
> On Mon, Jan 4, 2016 at 7:21 AM, Storey, Robert (DCSO) <
> RStorey_at_dcso.nashville.org> wrote:
>
> Yeah, I’m exploring the option of skipping that import for now, then
> doing it later by loading the data into a temp table and breaking it
> into smaller chunks, like you suggested…
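>
> Roughly like this, where the staging table, target table, and key
> column are just placeholders:
>
>   -- load the dump into a bare staging table first, then move it over
>   -- in mod-based chunks, one insert per concurrent session
>   -- (conventional insert; a direct-path append would serialize the
>   -- sessions on the table lock)
>   insert into big_tab
>   select * from big_tab_stage where mod(id, 8) = 0;  -- session 1 of 8
>   commit;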
>
>
>
> *From:* Kenny Payton [mailto:k3nnyp_at_gmail.com]
> *Sent:* Wednesday, December 30, 2015 3:21 AM
> *To:* Storey, Robert (DCSO)
> *Cc:* Oracle L
> *Subject:* Re: Multiple imports simultaneous?
>
>
>
> I would concentrate on separating out and improving the speed of the
> single table with the 12-hour import time. I have run into similar
> issues with tables with out-of-line CLOB data. The way I resolved it was
> to break the single table export and import into several smaller jobs
> that can be run concurrently. If you have a column that lends itself
> easily to breaking the table fairly evenly into many chunks, use that
> column; if it's a number field, you can use the mod function. If you
> don't have a good key, you can use the dbms_rowid package and mod on
> something like the block id. I was able to minimize complexity a bit by
> taking this one table and converting the move into inserts across a
> database link, but you should also be able to use the query option on
> expdp. I'm assuming this table is not partitioned; otherwise, using the
> partition key would probably work nicely as well.
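>
> A rough sketch of the chunking, with invented schema/table/column names
> (classic exp shown here; expdp takes an equivalent QUERY parameter):
>
>   # chunk 0 of 4 on a numeric key
>   exp userid=system tables=app.big_tab file=big_tab_0.dmp \
>     query=\"where mod(id, 4) = 0\"
>
>   # no usable key: mod on the block id via dbms_rowid instead
>   exp userid=system tables=app.big_tab file=big_tab_0.dmp \
>     query=\"where mod(dbms_rowid.rowid_block_number(rowid), 4) = 0\"
>
>   -- the database-link variant is one chunked insert per session
>   -- (conventional insert; direct-path append would serialize the
>   -- sessions on the table lock)
>   insert into big_tab
>   select * from big_tab@source_link where mod(id, 4) = 0;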
>
> In my experience the only way I could obtain dramatic import speed
> improvements was to break the single table into many smaller concurrently
> running jobs.
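>
> In practice that just means kicking the chunk imports off in the
> background, e.g. (file and log names invented):
>
>   for k in 0 1 2 3; do
>     nohup imp userid=system file=big_tab_${k}.dmp full=y ignore=y \
>       commit=y log=imp_big_tab_${k}.log &
>   done
>   wait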
>
> Kenny
>
> On Dec 29, 2015 4:29 PM, "Storey, Robert (DCSO)" <
> RStorey_at_dcso.nashville.org> wrote:
>
> Just curious about import.
>
>
>
> I’m importing an older 9i system into an 11g system. I am precreating the
> tablespaces, etc.
>
>
>
> Ultimately, the import only cares about 3 schemas: one that only has my
> PL/SQL and two that have the data. So, Schema A, B, C. Schema B has
> about 360 tables and about 190 million rows total. Schema C has about 45
> tables and about 35 million rows, of which 27 million are in one table
> that has an XMLTYPE column. Importing just that one table takes about 12
> hours. I’m working on methods to trim that time.
>
>
>
> But is it possible to run multiple imports at once if they target
> different inbound schemas?
>
>
>
> 1) Export the database to create my dump file.
>
> 2) On the target server, make 3 copies of the dump file, then run one
> import per schema concurrently (sketched below):
>
> a. Do an import of Schema A, rows=y, indexes=n, constraints=n
>
> b. Do an import of Schema B, rows=y, indexes=n, constraints=n
>
> c. Do an import of Schema C, rows=y, indexes=n, constraints=n
>
> 3) Do an import for each schema where rows=n and indexes and
> constraints = y.
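>
> Something like this per schema (userids, fromuser/touser, and file
> names invented for the example):
>
>   # pass 1: data only -- one of these per schema, run concurrently
>   imp userid=system fromuser=schema_b touser=schema_b file=full.dmp \
>     rows=y indexes=n constraints=n commit=y log=imp_b_rows.log
>
>   # pass 2: indexes and constraints once the data is in
>   imp userid=system fromuser=schema_b touser=schema_b file=full.dmp \
>     rows=n indexes=y constraints=y log=imp_b_idx.log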
>
>
>
> Theoretically, these imports should not interfere with each other. I can
> put the database in noarchivelog mode and add or enlarge the redo logs so
> that log-related waits are reduced.
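>
> A sketch of that setup (log file path and size are placeholders):
>
>   shutdown immediate
>   startup mount
>   alter database noarchivelog;
>   alter database open;
>   alter database add logfile group 4
>     ('/u01/oradata/mydb/redo04.log') size 1g;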
>
>
>
> Thoughts?
>
>
>

--
http://www.freelists.org/webpage/oracle-l
Received on Mon Jan 04 2016 - 19:14:52 CET
