Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 

Home -> Community -> Usenet -> comp.databases.oracle -> data handling issue

data handling issue

From: curwen <josephcurwen_at_despammed.com>
Date: 25 Feb 2004 05:53:10 -0800
Message-ID: <cc68edc4.0402250553.18a9dabf@posting.google.com>

Hi all,
I'd like to have your opinion on the following performance/design issue:

a logging system is generating a lot of records every hour (about 500k/day), and I need to store them in a huge Oracle RAC system. A Java web application connects to the data and executes queries on it, mainly against the most recently entered set of records; queries on information stored more than 3 months ago are quite rare.

As a first attempt I store them in a single table indexed by a primary key, let's say the table 'LOGS'. The problem is that this table is going to grow very big.
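For concreteness, the current layout looks roughly like this (the column names and types here are just an illustration, not my real schema):

```sql
-- Hypothetical sketch of the current LOGS table; real columns differ.
CREATE TABLE logs (
  log_id    NUMBER         NOT NULL,
  log_date  DATE           DEFAULT SYSDATE NOT NULL,
  message   VARCHAR2(4000),
  CONSTRAINT logs_pk PRIMARY KEY (log_id)
);

-- Queries mostly filter on the recent entries, so an index on the
-- date column supports the common access path:
CREATE INDEX logs_date_idx ON logs (log_date);
```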

Have you got any 'best practice' ideas for handling this situation in a better way?

I thought about moving data periodically into a 'LOGS_ARCHIVE' table, for example using an Oracle job.
Is that necessary?
PS: I can't use Oracle's standard file archiving support; I need the data to stay in tables.
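What I had in mind for the job is something like this (a sketch only — the LOG_DATE column and the 3-month cutoff are my assumptions, and the data stays in tables as required):

```sql
-- Move rows older than 3 months into LOGS_ARCHIVE, then remove them
-- from LOGS. Assumes LOGS_ARCHIVE has the same column layout as LOGS.
CREATE OR REPLACE PROCEDURE archive_logs AS
BEGIN
  INSERT INTO logs_archive
    SELECT * FROM logs
     WHERE log_date < ADD_MONTHS(SYSDATE, -3);

  DELETE FROM logs
   WHERE log_date < ADD_MONTHS(SYSDATE, -3);

  COMMIT;
END;
/

-- Schedule the procedure to run nightly with DBMS_JOB.
VARIABLE jobno NUMBER
BEGIN
  DBMS_JOB.SUBMIT(:jobno, 'archive_logs;', SYSDATE, 'TRUNC(SYSDATE) + 1');
  COMMIT;
END;
/
```

Is this kind of INSERT/DELETE move reasonable at this volume, or is there a better pattern?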

thanks in advance
jc
Received on Wed Feb 25 2004 - 07:53:10 CST

