Re: Oracle multi Tb refresh for UAT from Prod

From: Tim Gorman <tim.evdbt_at_gmail.com>
Date: Fri, 16 Nov 2018 09:14:17 -0800
Message-ID: <5ec8d142-535a-d395-5cb4-62898cacc9bd_at_gmail.com>





Sanjay,

Here are two shell-scripts that I use when refreshing Delphix "virtual databases" for Oracle based on two common requirements...

  1. Preserve non-prod database and application account passwords across refresh
  2. Preserve non-prod database link definitions across refresh

Some background:  when you initially provision a database clone, you must invariably change account passwords (so that production passwords aren't exposed in non-production) and change the definition of database links (so that production databases aren't corrupted by non-prod activities).  For a variety of reasons, this might be a manual process, although many folks have it automated.
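
Just to make that concrete, the kinds of changes involved usually look something like the following (the account, database link, and TNS names here are invented purely for illustration):

alter user app_owner identified by "SomeNonProdPassword";

drop public database link billing_prod;
create public database link billing_prod
  connect to app_owner identified by "SomeNonProdPassword"
  using 'BILLING_UAT';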

Regardless, when the database clone is refreshed later, it might be necessary to repeat these (and other) changes, and things can get messy. For example, after the initial cloning, DBAs might set account passwords to non-prod defaults, but developers and/or testers might change these non-prod default values for many reasons.  So what is really needed is not to re-execute the same procedures performed after the initial cloning, but simply to preserve what existed prior to the refresh operation and automatically re-apply those settings after the refresh is complete.

So, the attached shell-script "ora_vdb_prerefresh.sh" is intended to be called from a Delphix "pre-refresh" hook. "Hooks" are programmatic callouts, similar to database triggers or "user exits".  This script saves off existing database account passwords by generating a SQL*Plus script, and then it saves off database link definitions using DataPump export.
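
To give a rough idea of the approach (this is just a sketch, not the attached script itself; the ORACLE_SID, save directory, file names, and the REFRESH_SAVE_DIR directory object are all made up for illustration), the pre-refresh side might look something like this:

#!/bin/sh
# ora_vdb_prerefresh.sh -- minimal sketch only; the SID, paths, file names,
# and the REFRESH_SAVE_DIR directory object are illustrative assumptions,
# not taken from the attached script.

export ORACLE_SID=uatvdb
export ORAENV_ASK=NO
. oraenv > /dev/null

SAVEDIR=/u01/app/oracle/admin/${ORACLE_SID}/refresh_save
mkdir -p ${SAVEDIR}

# 1) Generate a SQL*Plus script that will re-apply the current (non-prod)
#    password hashes after the refresh, using "identified by values".
sqlplus -S / as sysdba <<EOF
set pagesize 0 linesize 300 feedback off trimspool on
spool ${SAVEDIR}/reapply_passwords.sql
select 'alter user "'||name||'" identified by values '''||spare4||''';'
  from sys.user\$
 where type# = 1
   and spare4 is not null;
spool off
exit
EOF

# 2) Save the current database link definitions with DataPump export
#    (a dedicated account is preferable to SYSDBA for DataPump).
expdp \"/ as sysdba\" directory=REFRESH_SAVE_DIR dumpfile=dblinks.dmp \
      logfile=dblinks_exp.log full=y include=DB_LINK reuse_dumpfiles=y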

Then, the attached shell-script "ora_vdb_postrefresh.sh" is intended to be called from a Delphix "post-refresh" hook.  This script checks to see if a SQL*Plus script was generated and, if so, executes it to re-apply all account passwords.  Then, if a DataPump export file exists, the script calls DataPump import to re-apply the database link definitions.
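
Again as a sketch only, with the same made-up names as above rather than anything from the attached script, the post-refresh side might look like this:

#!/bin/sh
# ora_vdb_postrefresh.sh -- minimal sketch only; same illustrative
# assumptions (SID, paths, file names, REFRESH_SAVE_DIR) as above,
# not taken from the attached script.

export ORACLE_SID=uatvdb
export ORAENV_ASK=NO
. oraenv > /dev/null

SAVEDIR=/u01/app/oracle/admin/${ORACLE_SID}/refresh_save

# 1) If the generated SQL*Plus script exists, run it to re-apply the
#    saved (non-prod) account password hashes.
if [ -f "${SAVEDIR}/reapply_passwords.sql" ]
then
    echo "@${SAVEDIR}/reapply_passwords.sql" | sqlplus -S / as sysdba
fi

# 2) If the DataPump export file exists, import the saved database link
#    definitions (links re-created from production by the refresh may
#    need to be dropped first, since the import will not overwrite them).
if [ -f "${SAVEDIR}/dblinks.dmp" ]
then
    impdp \"/ as sysdba\" directory=REFRESH_SAVE_DIR dumpfile=dblinks.dmp \
          logfile=dblinks_imp.log full=y
fi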

You mentioned "/dropping prod user and adding Test user and application schema back/", so that may or may not be covered by the existing logic.  Certainly, adding an entire schema back can be performed with another set of calls to DataPump?
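
For example, something along these lines (purely hypothetical schema, directory, and file names) before and after the refresh:

# Before the refresh, export the non-prod schema:
expdp \"/ as sysdba\" directory=REFRESH_SAVE_DIR dumpfile=testuser.dmp \
      logfile=testuser_exp.log schemas=TESTUSER reuse_dumpfiles=y

# After the refresh, import it back into the refreshed VDB:
impdp \"/ as sysdba\" directory=REFRESH_SAVE_DIR dumpfile=testuser.dmp \
      logfile=testuser_imp.log schemas=TESTUSER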

Please note that the attached scripts have ".txt" file-extensions to avoid freaking out email filters, and of course these extensions are intended to be removed on saving.

Disclaimers:  Please realize that these scripts are merely starting points or templates, not complete solutions.  They work fine in my lab environment and at a couple of my customers, but they may not do exactly what you want, or may not work for you at all.  If you use them, you'll need to take ownership of them and adapt them to your environment; there is no warranty, so use at your own risk.

Hope this helps...

-Tim

On 11/16/18 08:21, Sanjay Mishra (Redacted sender smishra_97 for DMARC) wrote:
> Can someone share the process used in your experience/organization
> where you have several multi TB database and need to frequently
> refresh UAT for performance testing ? I am looking not much from
> Masking the data which are sometimes required but based on dropping
> prod user and adding Test user and application schema back
> ? Appreciate If someone can share any script used for sync
> user/password which can be main challenge as other Registration with
> OEM/Catalog can be handled easily.
>
> Tx
> Sanjay





--
http://www.freelists.org/webpage/oracle-l


