Re: Split a dmp file before import
From: Sashi <smalladi_at_gmail.com>
Date: Tue, 10 Nov 2009 12:56:48 -0800 (PST)
Message-ID: <622efa6c-56c5-4e1a-bd19-ce13aece4057_at_h34g2000yqm.googlegroups.com>
On Nov 10, 1:15 pm, joel garry <joel-ga..._at_home.com> wrote:
> On Nov 10, 4:22 am, Sashi <small..._at_gmail.com> wrote:
>
> > Hi all, I have a dmp file that contains about 9 million rows.
> > Is there a utility/technique that will split the file into two (or
> > more) so that I can run the import in two (or more)
> > stages?
>
> > Thanks,
> > Sashi
>
> Mark's answer pretty much says it all, but there may be more info
> available if you tell us what problem you are trying to solve. Disk
> space? Speed of imp? Redo generation? Trying to parallelize?
>
> jg
> --
> _at_home.com is bogus.
> Yay Stu! (Stu and I were buddies years ago):http://www.campinglife.com/output.cfm?ID=2209609
Thanks for your replies; that pretty much sums it up for me.
My problem is that my archive log destination keeps filling up, and
I'm running short of disk space. The undo tablespace also can't grow
enough to hold the whole transaction in one go.
I googled around and took the approach of committing regularly and
using a 50 MB buffer.
The dmp file is actually a single table, about 950 MB.
So on my import command I set commit=y and buffer=50000000.
This is 10.2.0.4.0 on Solaris 10.
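For the archives, a sketch of what that imp invocation looks like (the username, connect string, dump file name, and table name here are placeholders, not from the original post):

```shell
# Hypothetical example -- substitute your own credentials and file names.
# commit=y makes imp commit after each buffer-full of rows instead of
# once per table, so undo usage stays bounded by the buffer size.
# buffer=50000000 gives imp a ~50 MB array-insert buffer.
imp scott/tiger@orcl \
    file=big_table.dmp \
    tables=BIG_TABLE \
    commit=y \
    buffer=50000000 \
    log=big_table_imp.log
```

Note that commit=y trades undo pressure for a slower, non-atomic import: if it fails partway, the rows already committed stay in the table.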
Regards,
Sashi
Received on Tue Nov 10 2009 - 14:56:48 CST