

Re: Supporting multiple oracle versions in a trigger

From: Jack Addington <>
Date: Sat, 22 Oct 2005 14:42:34 GMT
Message-ID: <uds6f.266731$tl2.87588@pd7tw3no>

"Jim Kennedy" <> wrote in message


> You are going to serialize on latches big time. ... we fixed their code
> and they accomplished the task in
> a different manner. We got a 5 X performance improvement and no one knew
> when their code was running. (the other clients could do work at the same
> time.)

This latches business is not simple - at least not from what I can find by googling. Tom's book can't get here fast enough now. Is your big concern with DDL, DML, or both in my situation? i.e., the stored procs that create the transposed snapshots, the stored procs that sync the transposed snapshots, or worse, the basic transactional activity on the core data header tables (the table of pointers, as you put it)? (Please see attachment and/or link at bottom.)

> You can also get Tales of the Oak Table. Nice
> story in there about the project where one developer tried a similar idea.
> Sounded fantastic, ultimately flexible, on demand reconfiguable. (result,
> it sucked and after millions of dollars the guy was fired and they went
> with
> a commercial app. One of those that used "old fashioned traditional
> methods".)

I will have a look at the other suggested material.

> Seriously, when you write a C program do you create a generic object or
> struct with a pointer to one of each type? Of course not. Why would you
> think it would be a good way to do things in a database. My question
> isn't
> really rhetorical it is based on decades of varied experience.

Not to be facetious (I realize you are making a point), but first of all, I wouldn't write a C program to store data. The database is my tool for supporting the data requirements, and my preference is to manage as much of the business logic as possible there. My data requirements are the collection, storage, and flexible extraction of custom data. One client is a research group collecting data from tests/measures, while another is a marketing company building/collecting survey data. While there are static datasets, the major selling feature is the ability to take a dataset and quickly modify, reuse, or alter it. My marketing client doesn't have the resources to create new tables, add/remove columns, and change their data entry forms.

The basic premise for the generic data capture is an EAV (entity-attribute-value) model. There is a decent overview that I liked, with similar product features: I can find other EAV modeling write-ups, but nothing that deals with database performance/tuning. My biggest challenge is the conversion between EAV and a logical model. I am happy with the performance so far, but the client base is small and the database stress is minimal. If you have the time, I would appreciate any ideas on scalability improvements.
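To make the EAV-to-logical conversion concrete, here is a minimal sketch of the pivot involved (using SQLite via Python purely for illustration; the `eav` table, columns, and attribute names are invented for the example, not my actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One row per (entity, attribute) pair: the EAV shape.
    CREATE TABLE eav (
        entity_id INTEGER,
        attribute TEXT,
        value     TEXT
    );
    INSERT INTO eav VALUES
        (1, 'name',  'widget'),
        (1, 'color', 'red'),
        (2, 'name',  'gadget'),
        (2, 'color', 'blue');
""")

# Classic conditional-aggregation pivot: one MAX(CASE ...) column per
# attribute. This is roughly what a "transposed snapshot" amounts to;
# every new attribute means another column in the query (or a
# regenerated snapshot table), which is where the maintenance cost lives.
rows = conn.execute("""
    SELECT entity_id,
           MAX(CASE WHEN attribute = 'name'  THEN value END) AS name,
           MAX(CASE WHEN attribute = 'color' THEN value END) AS color
    FROM eav
    GROUP BY entity_id
    ORDER BY entity_id
""").fetchall()

print(rows)  # [(1, 'widget', 'red'), (2, 'gadget', 'blue')]
```

In Oracle the same pivot would be done with CASE expressions (or generated into a snapshot table by the stored procs), and the cost of keeping those transposed snapshots in sync is exactly the part I'm asking about.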

Thanks for your time and for giving me answers I can followup on.

Jack

Received on Sat Oct 22 2005 - 09:42:34 CDT
