
Re: Server Duplication

From: Vick <vramming_at_my-deja.com>
Date: Fri, 03 Sep 1999 16:49:00 GMT
Message-ID: <7qou5n$lm8$1@nnrp1.deja.com>


Stéphane,
I really appreciate your thorough response... how about this for an idea? If the DBs are all going to be exactly the same, on exactly the same servers (hardware), and the installs of NT & Oracle are all the same, what about building the first DB, archiving the logs and backing up the control file to trace, and then doing a normal shutdown? Then, I think, I could copy the datafiles, control files, logs, etc. to the new server at the NT OS level and open the DB. I could get all the services and listeners installed and configured before the copy...
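
(For what it's worth, a minimal Korn-shell sketch of that idea, Unix-flavoured for brevity; on NT the same SQL would be run from Server Manager and the copy done at the file-system level. The SID PROD and the paths are made-up examples, not anything from this thread:)

#!/bin/ksh
# Sketch only: take a cold, consistent copy of the source database.
# The ARCHIVE LOG CURRENT step is only relevant if the database runs in
# ARCHIVELOG mode.
export ORACLE_SID=PROD

# --- on the source server ---
svrmgrl <<EOF
connect internal
alter system archive log current;
alter database backup controlfile to trace;
shutdown normal
EOF

# With the instance down, copy the datafiles, online redo logs, control
# files and init.ora to the new server, keeping the same directory layout.

# --- on the target server ---
# If the paths match exactly, the copy can then simply be started up:
svrmgrl <<EOF
connect internal
startup
EOF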

Any ideas on how long it takes to transfer 100 GB across a 100Gb WAN (no other traffic on the WAN)?
Vickie
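
(Back of the envelope: 100 GB is roughly 800,000 megabits, so assuming the link is really 100 Mbit/s rather than 100 Gb, and assuming 60-80 Mbit/s of sustained throughput, the raw transfer is on the order of 10,000-13,000 seconds, i.e. roughly three to four hours per copy before any compression; over true gigabit ethernet with disks that can keep up, divide by about ten.)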

In article <37CF7B42.7487_at_oriolecorp.com>,   sfaroult_at_oriolecorp.com wrote:
> Vick wrote:
> >
> > Hello Oracle fans...
> > I have a really wild question for you.
> > My company is starting a new project-- and I need to make estimates
> > on time for construction/configuration of many Oracle DBs on many
> > different servers. HA! So I'm looking for some advice.
> > The basic idea is:
> > There is to be a 100 GB database installed as a basic DB with some
> > additional customized data added on each server. The base DB is
> > supposedly going to be coming from a master server through gigabit
> > ethernet.
> >
> > I'm thinking, having never seen this data, that this is going to be
> > an export/import deal. If it is -- how long will it take for the
> > import of the data? Will the tablespaces and tables need to be
> > created before the import? Is user info available from an import?
> > Or will users need to be created each time (should be scripted in
> > PLSQL)?
> >
> > I only have a bare idea of how big an opportunity this is and I'm
> > guessing that it could be a real disaster if done wrong.
> 
> Vick,
>
> IMHO exp/imp is about the worst option. It will take days to reimport
> a full, duly indexed 100 Gb database. If I understand you well, apart
> from some additional local data, the database is basically the same
> everywhere? First, you should take care to have all the local data
> stored in a special tablespace. You should then proceed by database
> copy, i.e. compress all the datafiles, init.ora, log and control
> files, etc., send them to another machine, uncompress them there,
> and, after having used the BACKUP CONTROLFILE TO TRACE command on the
> source database, rename your files if need be, regenerate your
> control file, and lo! There are a number of papers on the net
> explaining in detail how to proceed; from memory I think there's one
> at http://www.tusc.com, but I am sure you can find something similar
> elsewhere which explains step by step how to proceed. I was
> confronted with the problem myself some time ago: we had to duplicate
> a 20 Gb database a very large number of times (I think it has been
> done over 50 times), and this was done by a Korn-shell script I
> wrote. The script is unpublished because there are a number of things
> in it which are specific to the way those databases were installed
> (everything under a single mount point to ease administration - they
> were development databases) and to how the directory structures had
> been set up, but if you feel at ease with Korn-shell, and with the
> help of a paper as mentioned above, perhaps you can adapt it to your
> own case. I don't remember how long it took; in fact what takes
> longest is transferring the files over the network, everything else
> is done in 10 minutes. E-mail me if you want this script (warning:
> commented in English but screen output is in French). I plan to write
> a generic version to accommodate any OFA database, but I have more
> projects and ideas than time to carry them out ;-). Once your
> database is installed, it's easy to drop and recreate the tablespace
> for local data.
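
(A rough Korn-shell outline of the copy-and-regenerate procedure described above; the SID, staging directory and file names are made-up examples, and this is not the unpublished script mentioned in the post:)

#!/bin/ksh
# Sketch only: rebuild a copied database under new file names.
export ORACLE_SID=ORCL

# 1. Uncompress the shipped files (datafiles, redo logs, init.ora) into
#    their target directories.
for f in /clone_stage/*.Z; do
  uncompress "$f"
done

# 2. Take the script produced by ALTER DATABASE BACKUP CONTROLFILE TO
#    TRACE on the source and edit the file names to match the target
#    paths (keep NORESETLOGS, since the copy was taken after a clean
#    shutdown). The edited script does STARTUP NOMOUNT, CREATE
#    CONTROLFILE and (usually) ALTER DATABASE OPEN itself.
svrmgrl <<EOF
connect internal
@/clone_stage/create_controlfile_ORCL.sql
EOF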
> I have also used other techniques, in different conditions and on
> another assignment, where the problem was to create new databases
> which were to be replicated. The database was relatively small
> (4.5 Gb, perhaps only half full); my problem was that I was working
> over a WAN. Basically, the idea was to do a full export with ROWS=N
> to export the structure only. Then all the tables were dumped one by
> one using pdqout, our unloader (which generates flat files and
> SQL*Loader control files). Everything was written to a single
> directory, then tar, gzip and ftp. With the database created with an
> empty structure at the other end, I had to disable all the foreign
> keys (free scripts are available on our site, among others, to do
> this), then SQL*Loader was applied to all the files, then the
> constraints were re-enabled. To give you an idea, starting from
> scratch (no target database) I installed a database in Tokyo from
> Paris over a 64K line in less than an hour. In fact what took longest
> was running catalog.sql in Tokyo. But in your case I would recommend
> a file copy AND FORGET ABOUT IMP!
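
(For the WAN technique above, a bare-bones Korn-shell outline; the scott/tiger account, file names and table handling are made-up examples, and pdqout itself is Oriole's own unloader, not shown here:)

#!/bin/ksh
# Sketch only: ship structure plus flat files, reload with SQL*Loader.

# 1. On the source: export the structure only, no rows.
exp scott/tiger file=structure.dmp full=y rows=n

# 2. Unload each table to a flat file plus a SQL*Loader control file
#    (pdqout or any unloader; a SELECT spooled from SQL*Plus also works).

# 3. Bundle and ship.
tar cf clone.tar structure.dmp *.dat *.ctl
gzip clone.tar
# ftp clone.tar.gz to the target, then gunzip and untar it there.

# 4. On the target: import the empty structure into the new database.
imp scott/tiger file=structure.dmp full=y

# 5. Disable the foreign keys (one ALTER TABLE ... DISABLE CONSTRAINT per
#    FK, generated from DBA_CONSTRAINTS), then load every table:
for ctl in *.ctl; do
  sqlldr userid=scott/tiger control="$ctl"
done

# 6. Re-enable the constraints (ALTER TABLE ... ENABLE CONSTRAINT ...).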
> --
> Regards,
>
> Stéphane Faroult
> Oriole Corporation
> Performance Tools & Free Scripts
> ------------------------------------------------------------------
> http://www.oriolecorp.com, designed by Oracle DBAs for Oracle DBAs
> ------------------------------------------------------------------
>

Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.

Received on Fri Sep 03 1999 - 11:49:00 CDT
