Oracle FAQ | Your Portal to the Oracle Knowledge Grid
Re: reading the oracle files
"John" <john_miller_portland_at_hotmail.com> wrote in message
news:1121056995.132566.85800_at_z14g2000cwz.googlegroups.com...
> Fastreader might do what you are asking. Fastreader extracts data
> from Oracle data files into flat files (CSV, XML, or whatever) and
> creates input for loaders.
>
> The overall unload/load process with Fastreader is probably 10 times
> faster than the native Oracle one. We did 100 GB on a 4-CPU machine in
> 2 hours with Fastreader, compared to almost 24 hours with Oracle
> export/import.
>
> Besides speed, the data ends up in a portable, readable format rather
> than binary, which opens up plenty of integration / data transfer options.
>
> fastreader is on http://www.wisdomforce.com
>
>
> Jim Kennedy wrote:
> >
> > I have a program that reads the datafiles directly. It is used to find
> > corruption. (I got it from an Oracle employee.) But I doubt it would
> > read the information faster than Oracle does. Oracle does have a
> > service (which cost a minimum of $10,000 a couple of years ago) to
> > extract the data from the Oracle files. However, I think it does this
> > by running a copy of the Oracle engine and extracting the information
> > to sql loader files (1 per table).
> >
> > I do not think that you will read the data out of the database any
> > faster than Oracle does.
>
Actually it uses the Oracle engine to get the data and doesn't read the files
directly. It probably uses multiple threads, bind variables, and the array
interface to do so. (That is what one would do if one were trying to create a
high-speed extractor.)
Oracle import and export are serial processes and are not multi-threaded. It
is possible that Data Pump in 10g is multi-threaded.
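To make the "high-speed extractor" recipe above concrete, here is a minimal sketch: one worker thread per table, a bind variable in the WHERE clause, and the DB-API array interface (`cursor.arraysize` plus `fetchmany`) to pull rows in batches and write one flat CSV file per table, the same shape of input SQL*Loader would consume. This is not Fastreader's actual implementation; the table names, column layout, and `extract_table` helper are made up for illustration, and the standard-library `sqlite3` module stands in for an Oracle driver so the sketch runs anywhere (with cx_Oracle the cursor calls are the same DB-API ones).

```python
import csv
import io
import sqlite3
import threading

def extract_table(db_path, table, min_id, out, batch=1000):
    """Dump rows with id >= min_id from one table to CSV, a batch at a time."""
    conn = sqlite3.connect(db_path)  # each thread opens its own connection
    cur = conn.cursor()
    cur.arraysize = batch            # array interface: rows per fetchmany() call
    # Bind variable (?) instead of literal SQL, as a real extractor would use.
    cur.execute(f"SELECT id, val FROM {table} WHERE id >= ?", (min_id,))
    writer = csv.writer(out)
    while True:
        rows = cur.fetchmany()       # fetches up to cur.arraysize rows
        if not rows:
            break
        writer.writerows(rows)
    conn.close()

# Build a throwaway two-table database to extract from.
db = "extract_demo.db"
conn = sqlite3.connect(db)
conn.execute("DROP TABLE IF EXISTS t1")
conn.execute("DROP TABLE IF EXISTS t2")
conn.execute("CREATE TABLE t1 (id INTEGER, val TEXT)")
conn.execute("CREATE TABLE t2 (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO t1 VALUES (?, ?)", [(i, "a%d" % i) for i in range(5)])
conn.executemany("INSERT INTO t2 VALUES (?, ?)", [(i, "b%d" % i) for i in range(5)])
conn.commit()
conn.close()

# One thread and one output "flat file" (in-memory here) per table.
outputs = {t: io.StringIO() for t in ("t1", "t2")}
threads = [threading.Thread(target=extract_table, args=(db, t, 0, outputs[t]))
           for t in outputs]
for th in threads:
    th.start()
for th in threads:
    th.join()
```

With a serial export the tables would be dumped one after another; running the per-table workers concurrently is what lets this pattern overlap I/O and keep multiple CPUs busy.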
Jim
Received on Sun Jul 10 2005 - 23:59:03 CDT