Re: Suggestions on how to efficiently search a large table
On Nov 15, 8:35 am, chrism..._at_gmail.com wrote:
> On Nov 15, 7:42 am, bdbafh <bdb..._at_gmail.com> wrote:
> > On Nov 14, 8:55 pm, chrism..._at_gmail.com wrote:
>
> > > I have a billion+ row table that I need to do queries on.
>
> > > The table looks something like this:
>
> > > item
> > > id number
> > > process1_id number
> > > process2_id number
> > > process3_id char(35)
>
> > > I need to perform multiple processes on each item. When I process an
> > > item, I then set the corresponding column with the result of the
> > > process.
>
> > > I need to do queries like the following to determine which processes I
> > > haven't completed yet:
>
> > > select * from item where process1_id is null or process2_id is null
> > > or process3_id is null
>
> > > Any recommendations on how to index or query such a table to get good
> > > performance? The cardinality of all the process*_id columns is very
> > > high.
>
> > > Thanks.
>
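One common way to make this kind of "find the unprocessed rows" query index-friendly is a function-based index that is non-NULL only for rows that still need work (a sketch only, not something proposed in the thread; the index name and CASE expression here are illustrative). Oracle B-tree indexes do not store entries whose indexed expressions are all NULL, so on a table where most rows are fully processed such an index stays small:

create index item_unprocessed_ix on item (
  case when process1_id is null
         or process2_id is null
         or process3_id is null
       then 1
  end
);

-- the query must repeat the exact indexed expression for the index to be usable
select *
  from item
 where case when process1_id is null
              or process2_id is null
              or process3_id is null
            then 1
       end = 1;

As each process*_id column is filled in, the row's CASE expression becomes NULL and its entry drops out of the index, so the index only ever covers the remaining backlog.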
> > You've provided no information regarding:
> > - version of Oracle software
> 10g
> > - edition of Oracle software
> 10.2.0.3
> > - how many client processes will be hammering the table
> 1
> > - how the table is being loaded
>
> loaded from a comma delimited file exported from a SQL Server database
> > - hardware, OS, clustering, RAC, server color
>
> LS41, SE Linux, OCSF, RAC
> > - desired response time
>
> < 2 seconds
>
> Also, I should have added that I will be limiting the query with a
> rownum < 1000 in the WHERE clause.
If the query without the rownum filter would produce a 10,000,000 row result set, Oracle still has to retrieve those 10,000,000 rows before rownum can be assigned and evaluated. So I don't think limiting on rownum will have any appreciable impact on the access plan or query performance.

Received on Thu Nov 15 2007 - 11:14:25 CST
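One way to check that point on a given system is to compare the optimizer's plan with and without the rownum predicate; a sketch, assuming the item table described above and the dbms_xplan package that ships with 10.2:

explain plan for
  select *
    from item
   where (process1_id is null
       or process2_id is null
       or process3_id is null)
     and rownum < 1000;

select * from table(dbms_xplan.display);

-- run it again without "and rownum < 1000" and compare the two plans
-- to see whether the access path actually changes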