Home -> Community -> Usenet -> c.d.o.misc -> Re: Large Tables
If your table has more than 1-2 million rows,
you may find it extremely beneficial
to split it into multiple tables and then
create a partition view on them.
We had a table with ~ 40 million rows which was growing
by 1.2 M rows/month, until it became virtually unusable
(it took several hours just to recreate all indexes on it after
yet another data load).
After we split it into 44 tables by month of the year, and created
a partition view combining all these tables together, it FLIES!
We just have to load each month's data into a separate table
and add this table to the view. No indexes get dropped/recreated.
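A minimal sketch of the monthly-table setup described above (table and column names are invented; in Oracle 7.3 a partition view is a UNION ALL view over tables whose CHECK constraints let the optimizer skip branches that cannot match the query's predicate):

```sql
-- One physical table per month; the CHECK constraint on the date
-- column is what enables partition-view pruning in Oracle 7.3.
CREATE TABLE sales_199701 (
  sale_date  DATE CHECK (sale_date >= TO_DATE('19970101','YYYYMMDD')
                     AND sale_date <  TO_DATE('19970201','YYYYMMDD')),
  amount     NUMBER
);

CREATE TABLE sales_199702 (
  sale_date  DATE CHECK (sale_date >= TO_DATE('19970201','YYYYMMDD')
                     AND sale_date <  TO_DATE('19970301','YYYYMMDD')),
  amount     NUMBER
);

-- The partition view glues the monthly tables together.
CREATE OR REPLACE VIEW sales AS
  SELECT * FROM sales_199701
  UNION ALL
  SELECT * FROM sales_199702;

-- Loading a new month: create and load sales_199703, index it,
-- then re-create the view with one more UNION ALL branch.
-- The existing tables and their indexes are never touched.
```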
Cordially,
Dragon
nomail_at_nowhere.fr wrote in article <34228CB6.5BE5B369_at_nowhere.fr>...
> I did, and you had better use Parallel Query (full scans in parallel) and
> be very careful with joins, as the performance
> drops if you use standard index optimisation.
> Our database: 1 GB RAM, 60 GB data, 30 tables (1-10 million rows each),
> joins over 6-7 tables.
> Happy Tuning (avoid implicit nested loop).
>
> An example with the same query, a join on 6 tables:
> Standard indexed plan: 8 hours
> Parallel query with implicit nested loop: several weeks (3 million * 20 s)
> Query tuned with only hash/sort/join/full: 10 min
>
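The hint-driven rewrite described in the quoted post might look something like this (table and column names are invented for illustration; FULL, USE_HASH and PARALLEL are standard Oracle 7.x optimizer hints):

```sql
-- Illustrative only: force full table scans and a parallel hash
-- join, avoiding the implicit nested-loop plan the optimizer
-- might otherwise pick on an indexed join.
SELECT /*+ FULL(c) FULL(o) USE_HASH(o)
           PARALLEL(c, 8) PARALLEL(o, 8) */
       c.cust_name, SUM(o.amount)
FROM   customers c, orders o
WHERE  o.cust_id = c.cust_id
GROUP BY c.cust_name;
```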
> Richu Wu wrote:
>
> > I am using Oracle Applications SC 10.7 on an Oracle 7.3 database. The
> > server will be an IBM R50 RISC 6000 with 8 processors and 2 GB of RAM.
> >
> > One of the main tables that will be used in reporting could hold as
> > many as 8,000,000 records.
> >
> > Has anyone had any experience querying against tables of this
> > magnitude?
> > I don't have any way in the near future to test this. Any help would be
> > appreciated.
> >
> > Thank you,
> >
> > Richu Wu
Received on Mon Sep 22 1997 - 00:00:00 CDT