Re: Oracle multi Tb refresh for UAT from Prod

From: Sanjay Mishra
Date: Mon, 19 Nov 2018 17:11:37 +0000 (UTC)
Message-ID: <1881139261.3337873.1542647497542_at_mail.yahoo.com>



Mladen,
Thanks for your update. The Prod databases I am looking to cover range from 10-55 TB. Prod and UAT each run on their own Exadata cluster, so the network is not the issue; both use the fast connection to the tape library. The requirement is to keep the refresh window as short as possible, and RMAN restore was taking a minimum of 4-5 hours, which leaves some UAT environments unavailable for that long. I am looking at Snapclone as another alternative to test: once a TestMaster is built, a snapclone can be created in no more than a few minutes. It will definitely need much more disk space, since I have to keep one TestMaster for the existing snapclones plus another copy for the new TestMaster; otherwise, removing the existing TestMaster removes the snapclones and hence the UAT environments.
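For the snapclone path, what I have in mind is roughly the following (a sketch only, for the multitenant case with an Exadata sparse disk group; the PDB names and the +SPARSE destination are placeholders):

-- Build the TestMaster once (e.g. from an RMAN duplicate of Prod), then freeze it:
ALTER PLUGGABLE DATABASE uat_tm CLOSE IMMEDIATE;
ALTER PLUGGABLE DATABASE uat_tm OPEN READ ONLY;

-- Each UAT refresh is then just a thin snapshot off the TestMaster:
CREATE PLUGGABLE DATABASE uat_snap1 FROM uat_tm
  CREATE_FILE_DEST = '+SPARSE'
  SNAPSHOT COPY;
ALTER PLUGGABLE DATABASE uat_snap1 OPEN;

The catch is exactly the space issue above: every snapclone stays dependent on its TestMaster, so the old TestMaster cannot be dropped until its snapclones are gone, which is why two full copies are needed while cutting over to a new one.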

So I am checking two options:
1. RMAN restore, and calculating the time for the biggest environment
2. The Snapclone process, and calculating space and timing
Then I might use a mixed approach, RMAN or snapclone, based on each application's SLA requirement.

Tx
Sanjay

On Saturday, November 17, 2018, 2:22:00 AM EST, Mladen Gogala <gogala.mladen_at_gmail.com> wrote:

 Hi Sanjay,

As usual, there are only two methods: rman and SAN copy. I am a consultant and I don't have a database that I can call my own. I gave up the comfortable life of a DBA who is always on call, has to review projects, and whose advice can be overruled by a development manager more worried about deadlines than performance; I sold my SQL*Soul to consulting more than 6 years ago and have been enjoying the power of the dark side ever since.

When you use the term "multi TB", how much is "multi"? 10 Gb Ethernet can, under ideal conditions, copy/restore with rman at a speed of about 3.5 TB per hour (see the rough arithmetic and rman sketch below). With some decent equipment which supports SRDF or HUR you can be even faster. For the database nightmares sized 100+ TB, you will need some special equipment to copy over Ethernet. This is an example:

http://www.mellanox.com/page/products_dyn?product_family=206&mtag=connectx_4_en_ic

This is a Mellanox 100 Gb Ethernet adapter. You will also need an accompanying router and cables, as well as a network engineer who knows how to deal with that. And there aren't many of those.

https://www.cisco.com/c/en/us/solutions/service-provider/100-gigabit-solution/index.html

I have seen this only once, and it can achieve up to 32 TB per hour. Quite different from the 9600 baud modem attached to a VAX 4200 with which I started exploring networking, using the DECnet protocol. Basically, a VLDB cannot function on a small machine; you really, really need specialized hardware, either SAN or network, to do that. Something tells me that 100 Gb Ethernet will become the standard before I retire.
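Some rough arithmetic behind those figures:

10 Gb/s is about 1.25 GB/s, i.e. roughly 4.5 TB per hour of raw line rate, so ~3.5 TB/hour is close to the practical ceiling once backup/restore overhead is included; at that rate 10 TB restores in about 3 hours and 50 TB in closer to 15.
100 Gb/s is about 12.5 GB/s, i.e. roughly 45 TB per hour raw, so 32 TB/hour works out to around 70% of the theoretical maximum.

Getting anywhere near those rates also means spreading the rman restore over several channels, something like this (sketch only; channel count and device type depend entirely on the media manager and the backup layout):

# rman sketch -- not a drop-in script
run {
  allocate channel t1 device type sbt;
  allocate channel t2 device type sbt;
  allocate channel t3 device type sbt;
  allocate channel t4 device type sbt;
  restore database;
  recover database;
}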

Regards

On 11/16/18 11:21 AM, Sanjay Mishra (Redacted sender smishra_97 for DMARC) wrote:
> Can someone share the process used in your experience/organization
> where you have several multi-TB databases and need to frequently
> refresh UAT for performance testing? I am not so much looking at
> masking the data, which is sometimes required, but at dropping the
> prod users and adding the test users and application schemas back.
> I'd appreciate it if someone could share any script used to sync
> users/passwords, which can be the main challenge, as other steps
> such as registration with OEM/catalog can be handled easily.
>
> Tx
> Sanjay
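
On the user/password sync piece above: one rough sketch (purely illustrative; the account list and spool file name are placeholders) is to capture the UAT account definitions with dbms_metadata before the refresh and replay the spooled DDL after the prod copy lands:

-- Run on UAT *before* the refresh; the username list is a placeholder.
SET LONG 2000000 PAGESIZE 0 LINESIZE 400 TRIMSPOOL ON FEEDBACK OFF
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'SQLTERMINATOR', TRUE);

SPOOL recreate_uat_users.sql
SELECT DBMS_METADATA.GET_DDL('USER', username)
  FROM dba_users WHERE username IN ('UAT_APP', 'UAT_TESTER');
-- GET_GRANTED_DDL raises an error for an account with no grants of the given type,
-- so a real script would wrap these calls in PL/SQL with an exception handler.
SELECT DBMS_METADATA.GET_GRANTED_DDL('ROLE_GRANT', username)
  FROM dba_users WHERE username IN ('UAT_APP', 'UAT_TESTER');
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', username)
  FROM dba_users WHERE username IN ('UAT_APP', 'UAT_TESTER');
SELECT DBMS_METADATA.GET_GRANTED_DDL('OBJECT_GRANT', username)
  FROM dba_users WHERE username IN ('UAT_APP', 'UAT_TESTER');
SPOOL OFF

GET_DDL('USER', ...) emits CREATE USER ... IDENTIFIED BY VALUES '<hash>', so the UAT passwords come back without anyone needing to know them; after the restore, drop the prod-only accounts and run the spooled script.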

-- 
Mladen Gogala
Database Consultant
Tel: (347) 321-1217



  
--
http://www.freelists.org/webpage/oracle-l