Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Home -> Community -> Usenet -> c.d.o.misc -> design advice
Hi all,
I'd like to have your opinion on the following performance/design
issue:
a logging system is generating a lot of records every hour (about
500k/day)
I need to store them in a huge Oracle RAC system
a Java web application connects to the data and executes queries on
it, mainly on the most recently entered records
queries on information stored more than 3 months ago are quite rare
as a first guess I stored them in a single table indexed by a primary
key, let's say the table 'LOGS'; the problem is that this table is
going to grow very big
have you got 'best practice' ideas to handle the situation in a better way?
I thought about periodically moving data into a 'LOGS_ARCHIVE' table,
for example using an Oracle job
is it necessary?
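to give an idea, the periodic move I have in mind would be something
like the sketch below (LOG_TS is just a placeholder for whatever
timestamp column the LOGS table has, and the schedule is arbitrary):

```sql
-- Sketch only: assumes LOGS and LOGS_ARCHIVE have identical columns
-- and that LOGS has a timestamp column LOG_TS (placeholder name).
CREATE OR REPLACE PROCEDURE archive_old_logs AS
BEGIN
  -- copy rows older than 3 months to the archive table
  INSERT INTO logs_archive
    SELECT * FROM logs
     WHERE log_ts < ADD_MONTHS(SYSDATE, -3);
  -- then remove them from the hot table
  DELETE FROM logs
   WHERE log_ts < ADD_MONTHS(SYSDATE, -3);
  COMMIT;
END;
/

-- submit it as a nightly job (here: every day at 02:00) via DBMS_JOB
DECLARE
  jobno BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(jobno,
                  'ARCHIVE_OLD_LOGS;',
                  TRUNC(SYSDATE) + 1 + 2/24,
                  'TRUNC(SYSDATE) + 1 + 2/24');
  COMMIT;
END;
/
```

but I'm not sure this two-table approach is the right one, hence the
question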
PS I can't use Oracle's standard file archiving support, I need the
data to stay in tables
thanks in advance
jc
Received on Fri Feb 27 2004 - 02:28:26 CST