
Re: Archiving Data

From: rama rao <rama76_at_streamyx.com>
Date: Thu, 13 Jul 2006 23:22:07 +0800
Message-ID: <44b66615$1_1@news.tm.net.my>

<millerkm_at_gmail.com> wrote in message
news:1152046487.012163.32090_at_p79g2000cwp.googlegroups.com...
> Hi,
>
> I am trying to come up with a solution to automatically archive data in
> our Oracle database.
>
> Our scenario is as follows:
>
> We have 35 million records coming in per month into our database. Every
> month we create a new table to store the records for that month.
>
> Since 35 million records is quite a few, we'd like to only store 1 or 2
> months in our database, and then automatically run a script or
> something that would copy out older tables into another database which
> we'd use to run queried reports on. The problem with this is you are
> going to have to copy 35 million records at the end of every month. Is
> this going to be a performance problem on the CPU while the copy is
> taking place?
>
> Another solution I had thought of was to create a new database every X
> months to store new incoming data. All old databases would be archived
> databases. The only problem with this is it seems difficult to create
> new databases via script, plus in 5 years you are going to have 20+
> databases kicking around.
>
> What is the best idea for achieving something like this? Are there any
> administrative tools like SQL's DTS that might do some of this for me?
>
> I don't know enough about database administration to come up with the
> best solution.
> I'm just the programmer implementing this archiving problem.
>
> We're currently using C# ADO.NET 1.1 and Oracle 10g.
>
> Thanks in advance for the input,
> Kevin
>

Hi Kevin,

Have you considered using export and import? I am not sure how long it would take to create a dump of 35 million records, but you could try to perform this activity during non-peak hours. You could also consider separating your current active table from the table you are going to dump, putting them on separate hard disks, which helps balance the I/O while you are accessing both tables.
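If you want to separate them at the storage level, a rough sketch would be something like the following (the table name ORDERS_200605, the tablespace name and the /u03 path are just placeholders for your own setup; I am assuming /u03 sits on its own disk):

    -- create a tablespace whose datafile lives on the second disk
    CREATE TABLESPACE archive_stage
        DATAFILE '/u03/oradata/PROD/archive_stage01.dbf' SIZE 4G;

    -- move last month's table onto that disk
    ALTER TABLE orders_200605 MOVE TABLESPACE archive_stage;

    -- a MOVE leaves the table's indexes UNUSABLE, so rebuild them
    ALTER INDEX orders_200605_pk REBUILD;

Keep in mind that moving a 35 million row table rewrites it completely and invalidates its indexes, so that is also something you would want to do off-peak.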

Export and import are very easy to use and don't require any additional configuration.
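As a rough sketch (the table name ORDERS_200605, the schema APPUSER and the TNS aliases PRODDB and ARCHDB are just placeholders; substitute your own names, passwords and paths):

    # export last month's table from the production database (run off-peak;
    # direct path export is faster for big tables)
    exp appuser/password@PRODDB tables=ORDERS_200605 direct=y \
        file=/u02/dumps/orders_200605.dmp log=/u02/dumps/orders_200605_exp.log

    # import the dump into the archive/reporting database
    imp appuser/password@ARCHDB fromuser=appuser touser=appuser \
        file=/u02/dumps/orders_200605.dmp log=/u02/dumps/orders_200605_imp.log

    # once the import has been verified, drop the table in production, e.g.
    #   SQL> DROP TABLE orders_200605 PURGE;

You could schedule a script like this with cron (or the Windows scheduler) at the start of each month. Since you are on 10g you could also look at Data Pump (expdp/impdp), which is generally faster for this kind of volume, although it does need a directory object created on the server first.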

rgds,
Rama.

Received on Thu Jul 13 2006 - 10:22:07 CDT
