RE: Backup for large number of archived logs per hour

From: Michael Dinh <mdinh_at_XIFIN.Com>
Date: Sat, 6 Mar 2010 14:49:07 -0800

Thank you everyone for your input.  

From: Martin Bach
Sent: Sat 3/6/2010 1:25 PM
To: Michael Dinh
Subject: Re: Backup for large number of archived logs per hour

Hi Michael!

I suggest you take the advice given in the other replies to your thread - a 200G database generating 20G of archived logs per hour seems like a lot...

On 04/03/10 20:31, Michael Dinh wrote:
> As an example, if we have a 200G DW generating 20G of archived logs per
> hour, then what would be a more efficient way to back it up?
> It does not make sense to back up archived logs when the entire DW is
> smaller than the amount of archived logs generated.

In your case, incrementally updated (RMAN) backups could be just what the doctor ordered - see section 4.4.3 of the Oracle 10g Backup and Recovery Basics guide.

In a nutshell, you take a level 0 image copy of your database, then take level 1 incrementals and merge them into your original level 0 copy. This is a really cool strategy that can cut your recovery time considerably, since far fewer archived logs need to be applied. Of course it's something to test thoroughly before putting into production...
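A minimal sketch of what such a daily RMAN job might look like (the tag name 'incr_update' is an arbitrary example; destinations, retention, and archived log handling would need to be adapted to your environment):

    RUN {
      # Roll the existing level 0 image copy forward by applying
      # the most recent level 1 incremental backup (a no-op on
      # the first run, when no incremental exists yet).
      RECOVER COPY OF DATABASE WITH TAG 'incr_update';

      # Take a new level 1 incremental, destined to be merged
      # into the image copy on the next run.
      BACKUP INCREMENTAL LEVEL 1
        FOR RECOVER OF COPY WITH TAG 'incr_update'
        DATABASE;
    }

On the very first execution, RMAN creates the level 0 image copy automatically because none exists for that tag. At recovery time, the rolled-forward copy only needs the archived logs generated since the last merge, which is the point of the strategy for a system producing logs this quickly.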



Martin Bach
OCM 10g

Received on Sat Mar 06 2010 - 16:49:07 CST
