Re: big table headache

From: N.N. <asko.makitalo_at_elomatic.com>
Date: Thu, 11 Jan 2001 09:44:08 GMT
Message-ID: <Iff76.179$pd7.4018_at_read2.inet.fi>


Hi!

[Quoted] Some ideas...

Divide the data somehow. Use a different tablespace for each year, and back up only the tablespaces that have been modified.
Make sure you are on 8.1.5 or later, Enterprise Edition.
Burn archives to DVD-RAM.
Keep a stock of DVDs.
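One way to sketch the tablespace-per-year idea is Oracle range partitioning (Enterprise Edition feature). The table layout below is illustrative only: the partition, column, and tablespace names are made up, and a real BLOB table would also need per-partition LOB storage clauses.

```sql
-- Hypothetical sketch: range-partition the archive table by archive date,
-- placing each year in its own tablespace (names are invented for illustration).
CREATE TABLE arch_struct_part (
  id          NUMBER,
  archived_on DATE,
  payload     BLOB
)
PARTITION BY RANGE (archived_on) (
  PARTITION p1999 VALUES LESS THAN (TO_DATE('2000-01-01', 'YYYY-MM-DD'))
    TABLESPACE ts_arch_1999,
  PARTITION p2000 VALUES LESS THAN (TO_DATE('2001-01-01', 'YYYY-MM-DD'))
    TABLESPACE ts_arch_2000,
  PARTITION pcur  VALUES LESS THAN (MAXVALUE)
    TABLESPACE ts_arch_cur
);

-- Once a year's partition stops changing, its tablespace can be made
-- read-only and backed up a single time (e.g. burned to DVD):
ALTER TABLESPACE ts_arch_1999 READ ONLY;
```

With this layout, the weekly backup only has to touch the current tablespace, and old partitions can be dropped or exchanged out without rebuilding the whole 500GB table.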

Asko
susana73_at_hotmail.com wrote in message <93jp96$g9g$1_at_nnrp1.deja.com>...
>Hi all,
>
>I have a database consisting of 30 tables, around 500GB in total. 29 of
>them add up to only 20GB. A single table, 'arch_struct', occupies
>almost 500GB by itself. It is a table with a BLOB column that stores
>old records from the table 'struct'. Records get archived from
>'struct' to 'arch_struct' every week. 'arch_struct' does get accessed
>(both reads and writes), but infrequently.
>
>This gives me a lot of trouble in terms of backup and maintenance. It
>takes a long time to back up, especially considering that I only have
>20GB of regular data. I have also found that Oracle does not reuse the
>deleted space efficiently. I want to defragment the table, but I don't
>have 500GB of free space to rebuild it into.
>
>Any suggestion is appreciated.
>
>Susan
>
>
Received on Thu Jan 11 2001 - 10:44:08 CET
