Re: Backup for large number of archived logs per hour

From: Jared Still <jkstill_at_gmail.com>
Date: Sat, 6 Mar 2010 18:46:34 -0800
Message-ID: <bf46381003061846o1687fd4ke9cf9499978ee5da_at_mail.gmail.com>



On Sat, Mar 6, 2010 at 1:25 PM, Martin Bach <development_at_the-playground.de> wrote:

> Hi Michael!
>
> I suggest you take on the advice given by the other replies to your
> thread - 200G database generating 20G/day seems a lot...
>
>

I'm not too sure that hard-and-fast rules, or even rules of thumb, can really be trusted for such things.

The size of the database can be a fair predictor of how much redo may be generated, but the volatility of the data must also be considered, as well as the number of processes making changes.
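The point that redo volume tracks the change rate rather than the database size can be shown with a back-of-the-envelope estimate. The helper and all figures below are hypothetical illustrations, not numbers from this thread:

```python
# Rough redo estimate: redo volume scales with rows modified per day
# times bytes per change times a redo overhead factor, not with total
# database size. All figures here are illustrative assumptions.

def estimate_daily_redo_gb(rows_changed_per_day: int,
                           avg_change_bytes: int,
                           redo_overhead: float = 2.0) -> float:
    """Rough daily redo in GB; the overhead factor stands in for redo
    record headers, undo changes, index maintenance, and so on."""
    return rows_changed_per_day * avg_change_bytes * redo_overhead / 1024**3

# A 200 GB database with a volatile working set:
# 50M row changes/day at ~200 bytes each, 2x overhead -> ~18.6 GB redo/day
volatile = estimate_daily_redo_gb(50_000_000, 200)

# The same 200 GB database, mostly static:
# 1M changes/day -> well under 1 GB redo/day
quiet = estimate_daily_redo_gb(1_000_000, 200)
```

Under these assumed figures, two databases of identical size differ in redo volume by a factor of fifty, which is why 20G/day from a 200G database is not, by itself, a red flag.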

Jared Still
Certifiable Oracle DBA and Part Time Perl Evangelist
Oracle Blog: http://jkstill.blogspot.com
Home Page: http://jaredstill.com

--
http://www.freelists.org/webpage/oracle-l
Received on Sat Mar 06 2010 - 20:46:34 CST