Re: Minimizing downtime for 9i to 10g upgrade
Date: Fri, 12 Sep 2008 06:25:38 -0700 (PDT)
On Sep 11, 6:07 pm, Chuck <chuckh1958_nos..._at_gmail.com> wrote:
> I need to upgrade a 9i database to 10g but the business unit cannot
> tolerate any downtime. I've advised them the cleanest way is to upgrade
> the db in place which takes about 90 minutes. They are uncomfortable
> with that much down time. An exp/imp to a new 10g database would take
> even longer. Is there a simple, straightforward way to do a major
> release upgrade without down time?
> One thought I had was to create a 2nd database using 10g, export the
> data from 9i to 10g (consistent=y) with users still updating the 9i
> database. Then comes the part I'm not sure of. Is there some feature in
> Oracle that will let me convert the log files from binary to SQL (sort
> of like mysql's mysqldump utility)? If I know the export started at scn
> X, I could convert all committed sql from scn x+1 and then apply it to
> the new database. It wouldn't be zero down time, but it would certainly
> minimize it. Is this possible? Or do I need to be doing something above
> the database layer to accomplish what I want?
> Another thought is again copying the data to a new 10g database but
> using transportable tablespaces instead. I'm just not sure that they
> would be compatible between 9i and 10g. Again it's not a zero down time
> solution but might be a lot faster than having the db down for the
> entire time that dbua is running.
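To the LogMiner question in the quoted post: yes, Oracle has exactly this feature. LogMiner (the DBMS_LOGMNR package, present in both 9i and 10g) reconstructs committed SQL from the redo/archived logs, exposing it in the SQL_REDO column of V$LOGMNR_CONTENTS. A minimal sketch, where the archived-log path and the starting SCN (one past the export's consistent SCN) are hypothetical placeholders:

```sql
-- Sketch only: register one archived log and mine committed SQL
-- starting just past the export's consistent SCN (here 123456).
EXECUTE DBMS_LOGMNR.ADD_LOGFILE( -
  LOGFILENAME => '/arch/arch_1001.arc', OPTIONS => DBMS_LOGMNR.NEW);

EXECUTE DBMS_LOGMNR.START_LOGMNR( -
  STARTSCN => 123457, -
  OPTIONS  => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + -
              DBMS_LOGMNR.COMMITTED_DATA_ONLY);

-- SQL_REDO holds replayable statements for each committed change.
SELECT scn, sql_redo FROM v$logmnr_contents;

EXECUTE DBMS_LOGMNR.END_LOGMNR;
```

One caveat before betting the migration on it: 9i-era LogMiner does not support every datatype and operation (some LOB and object-type changes, for instance), so verify that your schema's changes are all mineable before using this to close the gap after the export.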
How do you ever apply patchsets and security patches? Surely, if the app can't tolerate downtime, it is also important enough that the quarterly Critical Patch Updates should have been applied in a timely manner.
That would likely have been four patchsets plus assorted one-off patches by now. How did you handle the downtime incurred while that maintenance was performed?
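On the transportable tablespace idea in the quoted post: transporting from a lower release into a higher one (9i into 10g) is supported, subject to the usual restrictions (the tablespace set must be self-contained, character sets must match, and cross-platform transport only arrives in 10g, so source and target must be the same platform). A rough sketch using the original exp/imp utilities, with hypothetical names (tablespace USERS, file paths):

```shell
# Sketch only. On the 9i source, make the set read-only first:
#   SQL> ALTER TABLESPACE users READ ONLY;
# then export the tablespace metadata (requires SYSDBA):
exp \"sys/password as sysdba\" transport_tablespace=y \
    tablespaces=users file=users.dmp

# Copy users.dmp and the tablespace's datafiles to the 10g host, then:
imp \"sys/password as sysdba\" transport_tablespace=y file=users.dmp \
    datafiles='/u01/oradata/users01.dbf'
```

The tablespaces stay read-only only for the metadata export and file copy, so the write outage can be much shorter than a full dbua run, though it is still not zero downtime.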
-bdbafh
Received on Fri Sep 12 2008 - 08:25:38 CDT