RE: Backup for large number of archived logs per hour
Date: Sat, 6 Mar 2010 14:49:07 -0800
Thank you everyone for your input.
From: Martin Bach [mailto:development_at_the-playground.de]
Sent: Sat 3/6/2010 1:25 PM
To: Michael Dinh
Subject: Re: Backup for large number of archived logs per hour
I suggest you take on board the advice given in the other replies to your thread - a 200G database generating 20G of archived logs per hour seems a lot...
On 04/03/10 20:31, Michael Dinh wrote:
> As an example, if we have a 200G DW generating 20G archived log per
> hour, then what would be a more efficient way for backup?
> It does not make sense to back up archived logs when the entire DW is less
> than the amount of archived logs generated.
In your case, incrementally updated (RMAN) backups could be just what the doctor ordered - check section 4.4.3 of the 10g Backup and Recovery Basics guide.
In a nutshell, you take a level 0 image copy of your database, then take incremental backups and merge them into the original level 0 copy. This is a really cool strategy that can cut your recovery time considerably, since far fewer archived logs need to be applied. Of course it's something to test thoroughly before putting into production...
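The strategy described above is usually scripted as a single RMAN run block, along the lines of the pattern documented by Oracle (the tag name 'incr_update' here is an arbitrary example, not anything from the original thread):

```
RUN {
  -- Roll the most recent level 1 incremental forward into the
  -- existing level 0 image copy (no-op on the first run, when
  -- no incremental exists yet).
  RECOVER COPY OF DATABASE WITH TAG 'incr_update';

  -- Take a new level 1 incremental; on the very first run this
  -- creates the level 0 image copy instead.
  BACKUP INCREMENTAL LEVEL 1
    FOR RECOVER OF COPY WITH TAG 'incr_update'
    DATABASE;
}
```

Run daily, this keeps an on-disk image copy of the database that is never more than roughly 24 hours behind, so a restore only needs to apply the last incremental plus the archived logs generated since.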
--
Martin Bach
OCM 10g
http://martincarstenbach.wordpress.com
http://www.linkedin.com/in/martincarstenbach
--
http://www.freelists.org/webpage/oracle-l
Received on Sat Mar 06 2010 - 16:49:07 CST