Re: Minimize imp-effect on users

From: Karsten Farrell <kfarrell_at_belgariad.com>
Date: Fri, 01 Aug 2003 17:32:09 GMT
Message-ID: <MPG.199443ef5275c0ce989820@news.la.sbcglobal.net>


Hi Tobias Rangne, thanks for writing this:
> Hi!
>
> As quite a newbie on Oracle, I would very much appreciate it if one or
> some of the experienced posters in this newsgroup could give me a hint
> as to whether I am at least moving in the right direction, or if my
> idea is just plain stupid...
>
> Background:
> One part of our Oracle database consists of some tables that (until now)
> only had to be updated every 6-8 weeks from a different server (not on
> the same network). Now we have been given the opportunity to receive
> this data on a daily basis.
>
> The mechanism used today is quite an ugly one:
> Drop all objects that belong to the Oracle user holding the data (a
> read-only account used only for this: 'userx') and then run IMP on the
> dump file. This means the data is unavailable for the duration of the
> import (not to speak of what would happen if the import failed after
> the table drops...)
>
> After a 'little bit' of RTFM I came up with:
>
> 1. Create two new users, temp1 and temp2.
> 2. Import the file with FULL=Y into user temp1.
> 3. Create synonyms for all the tables and grant SELECT on them to userx.
>
> After this I could alternate between the users temp1 and temp2 every
> other day to perform the following:
> 1. Import into the user NOT used yesterday.
> 2. When the import is finished, just drop and recreate the synonyms
> (and the SELECT grants to userx) so they point at 'today's' temp user.
> 3. Drop the tables belonging to the temp user used yesterday.
>
> Still I think I would have the problem of something going wrong in the
> middle of the
> DROP SYNONYM1 ..
> DROP SYNONYM2 ..
> CREATE SYNONYM1 ..
> CREATE SYNONYM2 ..
> GRANT SELECT .. etc.
> sequence. (What I read in the manuals and my own testing makes me think
> that (at least I ;-) am not able to perform a bunch of DDL statements
> in a single transaction. Am I right?)
>
> But this would at least speed things up from a user point of view (and
> the import could run 'slow and easy' under a resource-limited profile
> for temp1 and temp2, so as not to slow things down for others).
>
> I have tried this in a test environment and everything seems to work,
> and I also think I have read the books that could be relevant to
> solutions like this (Concepts, Administrator's Guide, Reference, SQL
> Reference, and Backup and Recovery), but there might be much safer,
> faster, and better ways to accomplish this that I missed.
>
> Oh, sorry: 8.1.7.3 on Windows 2000, and (do I dare?) 'soon to be gone'
> 8.0.4 on Novell 4.11.
>
> Thanks in advance.
>
> Best regards and a nice weekend to all.
>
> Tobias (hopefully soon able to contribute to this group; I actually
> love to RTFM...)
>
> P.S. This is my first post to a newsgroup ever, so if I (not by
> intention) did cross-post, please just tell me how not to... D.S.
>
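
Before getting to the daily load itself, a quick note on your synonym-swap plan. You are right that DROP SYNONYM, CREATE SYNONYM, and GRANT are all DDL, and Oracle issues an implicit commit around every DDL statement, so the sequence cannot be bundled into one transaction. You can shrink the exposure window, though: CREATE OR REPLACE SYNONYM re-points a synonym in a single statement, with no gap between the drop and the create. A sketch only (the table names are invented, and it assumes the synonyms live in userx's schema and the script runs with the CREATE ANY SYNONYM privilege):

  -- Issue the grants first, so a synonym never points at a table
  -- that userx cannot yet read.
  GRANT SELECT ON temp1.orders    TO userx;
  GRANT SELECT ON temp1.customers TO userx;

  -- Re-point each synonym in a single DDL statement; CREATE OR REPLACE
  -- leaves no window between DROP SYNONYM and CREATE SYNONYM.
  CREATE OR REPLACE SYNONYM userx.orders    FOR temp1.orders;
  CREATE OR REPLACE SYNONYM userx.customers FOR temp1.customers;

Each statement still commits on its own, so a failure mid-script leaves some synonyms pointing at yesterday's copy, but every synonym stays valid at every instant.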

As for the import itself: without knowing how much control you have over the file you will receive daily, what I suggest might not be the most efficient approach.

If (big IF) you can specify the format of the received file, why not have it created as a SQL*Loader file (either CSV or fixed-width columns)? Then you could load it directly into your tables without the need for temp userids or for dropping and recreating all those objects every time.
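
For example, a minimal control file for a comma-separated feed might look like this (everything here is invented for illustration; TRUNCATE tells SQL*Loader to empty the table before each load):

  -- load_orders.ctl: a sketch only, not your actual layout
  LOAD DATA
  INFILE 'daily_feed.csv'
  TRUNCATE
  INTO TABLE userx.orders
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  (order_id,
   customer_id,
   order_date DATE "YYYY-MM-DD",
   amount)

which you would run with something like:

  sqlldr userid=loader/password control=load_orders.ctl log=load_orders.log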

If you have no control over the format of the received file, then you should be able to "get by" with a modified technique:

  1. Create a user account temp1.
  2. Import into temp1.
  3. Write scripts that INSERT into your target tables with SELECTs from the temp1 copies (steps 2 through 4 are sketched after this list).
  4. Delete all data from temp1.
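
For steps 2 through 4, the daily job could look something like this (a sketch only: the file, user, and table names are invented, and it assumes the feed completely replaces the target data, which is question 1 below):

  REM Step 2 (Windows batch): import the dump into the staging schema.
  REM ignore=y lets imp load into temp1's already-existing tables.
  imp system/manager file=daily.dmp fromuser=userx touser=temp1 ignore=y

  -- Steps 3 and 4 (SQL*Plus, run only if the import succeeded):
  -- replace the target data with the freshly imported copy.
  -- TRUNCATE is DDL and commits implicitly; it is faster than DELETE.
  TRUNCATE TABLE userx.orders;
  INSERT /*+ APPEND */ INTO userx.orders
    SELECT * FROM temp1.orders;
  COMMIT;

  -- Step 4: empty the staging copy so tomorrow's import starts clean.
  TRUNCATE TABLE temp1.orders;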

The latter technique assumes you don't have too many tables. But since you are probably already writing scripts for all your drop/create work, scripts like these wouldn't be too difficult.

There are several approaches to this problem. But suggesting a better solution would require additional information:

  1. Does the file you receive update or completely replace your target data?
  2. How many tables are we talking about here ... ballpark figure? How much data?
  3. You said the source and target are on different networks. How are you receiving the export file? If you are copying it with ftp or something like that, then you might be able to get away with an INSERT ... SELECT over a database link, since you can probably "reach" the source database if you can ftp from the source to the target server (a sketch of that follows).
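
Here is what that variant could look like (a sketch only; the link name, connect string, and table names are all invented):

  -- One-time setup on the target database: a link back to the source.
  CREATE DATABASE LINK src_link
    CONNECT TO userx IDENTIFIED BY password
    USING 'SRCDB';  -- tnsnames.ora alias for the source database

  -- Daily refresh, pulled straight across the link; no dump file,
  -- no imp, and no staging schema needed.
  TRUNCATE TABLE userx.orders;
  INSERT INTO userx.orders
    SELECT * FROM orders@src_link;
  COMMIT;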
--
[:%s/Karsten Farrell/Oracle DBA/g]
