
Re: reading the oracle files

From: Jim Kennedy <kennedy-downwithspammersfamily_at_attbi.net>
Date: Mon, 11 Jul 2005 18:45:37 -0700
Message-ID: <buidnVrWDKccvU7fRVn-pA@comcast.com>

"John" <john_miller_portland_at_hotmail.com> wrote in message news:1121100653.776306.260900_at_g43g2000cwa.googlegroups.com...
> As far as I know fastreader does not use the Oracle engine to retrieve
> data.
>
> This is why fastreader doesn't add overhead to the Oracle database while
> extracting data. Very helpful for a production environment. It is only
> I/O, since there is a lot of reading and writing.
>
>
> Jim Kennedy wrote:
> > "John" <john_miller_portland_at_hotmail.com> wrote in message
> > news:1121056995.132566.85800_at_z14g2000cwz.googlegroups.com...
> > > Fastreader might well do what you are asking. Fastreader extracts
> > > data from Oracle data files into flat (csv, xml ... whatever) files
> > > and creates input for loaders.
> > >
> > > The overall unload/load process with Fastreader is probably 10 times
> > > faster than the native Oracle one. We did 100GB on a 4-CPU machine in
> > > 2 hours with Fastreader, compared to almost 24 hours for Oracle
> > > export/import.
> > >
> > > Besides speed, the data is in a portable, readable format and not in
> > > binary, which opens up plenty of integration / data transfer options.
> > >
> > > fastreader is on http://www.wisdomforce.com
> > >
> > >
> > > Jim Kennedy wrote:
> > > >
> > > > >
> > > > I have a program that reads the datafiles directly. It is used to
> > > > find corruption. (I got it from an Oracle employee.) But I doubt it
> > > > would read the information faster than Oracle does. Oracle does
> > > > have a service (it cost a minimum of $10,000 a couple of years ago)
> > > > to extract the data from the Oracle files. However, I think it does
> > > > this by running a copy of the Oracle engine and extracting the
> > > > information to sql loader files (1 per table).
> > > >
> > > > I do not think that you will read the data out of the database any
> > > > faster than Oracle does.
> > >
> > Actually it uses the Oracle engine to get the data and doesn't read
> > the files directly. It probably uses multiple threads, bind variables,
> > and the array interface to do so (what one would do if one were trying
> > to create a high-speed extractor).
> >
> > The Oracle import and export is a serial process and is not
> > multi-threaded. It is possible the data pump in 10G is multi-threaded.
> >
> > Jim
>

They are accessing it via Oracle, not by reading a file directly. If they were accessing it directly they would say so. (They would also have to contend with changes in data structures, compiler byte alignment - which compiler does Oracle use, on which OS, and at which version? - and endian considerations.) It is much simpler and more reliable to just use the published APIs and write the code efficiently (and multi-thread it). You are unlikely to get much, if any, additional throughput by "avoiding" or bypassing the Oracle engine, never mind the myriad compatibility issues, the massive code maintenance with every Oracle patch, and the possibility of running afoul of the customer's license agreement!

The Oracle engine isn't overhead; how efficiently one uses it is a matter of programming. The exp utility from Oracle is not especially efficient, and there are ways to do better (e.g. look at what they did in the Oracle data pump).
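
To make that concrete, here is a rough sketch of what "use the published APIs efficiently" can look like: large array fetches and one worker thread per table, writing CSV that SQL*Loader or any other loader could consume. This is only an illustration, not how any particular product does it; it assumes the python-oracledb driver, and the connect string, credentials, table list, and fetch size below are all placeholders.

import csv
from concurrent.futures import ThreadPoolExecutor

import oracledb

DSN = "dbhost/ORCLPDB1"                               # placeholder connect string
CREDENTIALS = {"user": "scott", "password": "tiger"}  # placeholder credentials
TABLES = ["EMP", "DEPT"]                              # placeholder table list
ARRAY_SIZE = 10000                                    # rows fetched per round trip


def unload_table(table):
    """Unload one table to <table>.csv using large array fetches."""
    # Each worker gets its own connection; connections are not shared
    # between threads.
    with oracledb.connect(dsn=DSN, **CREDENTIALS) as conn:
        cur = conn.cursor()
        cur.arraysize = ARRAY_SIZE  # the "array interface": many rows per round trip
        # A full-table unload has no parameters; if the query were filtered,
        # bind variables (not literals) would keep it to a single parse.
        cur.execute("SELECT * FROM " + table)
        with open(table + ".csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(col[0] for col in cur.description)  # header row
            while True:
                rows = cur.fetchmany()  # pulls cur.arraysize rows at a time
                if not rows:
                    break
                writer.writerows(rows)


if __name__ == "__main__":
    # Unload several tables in parallel; the work is mostly network and
    # disk I/O, so a handful of threads overlap well.
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(unload_table, TABLES))

The array size is the main knob here: each fetchmany() pulls a batch of rows in one round trip, which is where most of the gain over row-at-a-time fetching comes from; the threads mostly just overlap network and disk waits across tables.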

Jim

Received on Mon Jul 11 2005 - 20:45:37 CDT
