

Re: Default column value of MAX +1 of column - possible in Oracle 9i?

From: HansF <News.Hans_at_telus.net>
Date: Tue, 29 Mar 2005 18:15:39 GMT
Message-Id: <pan.2005.03.29.19.17.13.925109@telus.net>


On Tue, 29 Mar 2005 17:57:52 +0000, Kenneth Koenraadt wrote:

> Comments embedded.
>

>>
>>I do find that people who reject them out of hand have had limited or bad
>>experience with them, either from other RDBMSs or older versions of Oracle.  

>
> I do have some bad experience with them, e.g. I have sometimes seen
> them used as quick-and-dirty add-ons to an application instead of the
> correct solution ("patching through triggers"), which is to correct
> the application. But the triggers are not to blame for incorrect use
> of them, of course.

Understandable. That happens frequently when developers are in a hurry or don't understand the technology. You'd be surprised how few developers actually understand how to use an RDBMS.

>
> Whenever I have chosen development methods, triggers never seemed to
> be the first choice, as in the mentioned A) + B) example.
>
> Also notice that the trend goes towards APIs being as
> RDBMS-independent as possible, which leaves triggers more or less
> behind.

For this, I encourage reading Thomas Kyte's "Effective Oracle By Design". If companies have money to waste, then independence is OK, but ... well, Chapters 1-3 say it all!

>
>
>

>>I also find it quite amusing that people, especially OO specialists,
>>trumpet the glories of event-based programming in response to screen
>>events, but reject the same model in response to database events.  Which
>>seems silly, as many screens map directly to tables, and many screen
>>events (other than pretty-print) map directly to database events.

>
> Good point. I guess the RDBMS is still considered by system designers
> to be a strictly relational area in which OO methods and thoughts have
> no relevance.
>

Yup. Funny how people stop thinking once they learn patterns.
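
To make the analogy concrete, a rough sketch - ORDERS and ORDERS_AUDIT are made-up tables here: the trigger is the database-side "listener", firing on the insert/update "event" just like an OO handler fires on a button click.

-- Sketch only: ORDERS and ORDERS_AUDIT are hypothetical tables.
-- The trigger acts as an event listener on the table itself.
CREATE OR REPLACE TRIGGER orders_audit_trg
  AFTER INSERT OR UPDATE ON orders
  FOR EACH ROW
DECLARE
  l_action VARCHAR2(10);
BEGIN
  IF INSERTING THEN
    l_action := 'INSERT';
  ELSE
    l_action := 'UPDATE';
  END IF;

  INSERT INTO orders_audit (order_id, changed_on, action)
  VALUES (:NEW.order_id, SYSDATE, l_action);
END;
/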

>

>>> This separation generally is a good thing and holds true in some
>>> cases. But not all.
>>

>
>>
>>For the sake of argument, would you please identify situations where this
>>is NOT a good thing?  I'm truly interested - even a response by email is OK.

>
> What I meant was: the separation of GUI and business logic is
> probably always a good thing, but such separation is not always
> *maintained*. Either because the technology does not allow it, or
> (more commonly) simply because the GUI developers are handed the
> business logic. Either by explicit decision or as a result of not
> making appropriate decisions.
>

Aye, there's the rub.

>

>>
>>I concur with your assessment that this results from the changing roles and
>>responsibilities.  I just don't have to like the damage done in our
>>industry by the 'lowering of standards'.  The person who owns a high-end
>>Mercedes Benz could take the car to the corner garage for an oil change,
>>but probably will take it to a specialist ... I've often wondered why they
>>stopped doing the same with their business tools.

>
> I don't like that trend either. Very few DBAs do, I think.
>
> Interestingly enough, that trend will probably result in increased
> demand for DBA resources in the future. Fire fighting has always
> demanded more resources than fire prevention. So the sad trend of
> simply skipping a decent DB design will ensure more DBA jobs. And
> still we don't like it, because it is a simple waste of resources.
>
>
>>I agree that patching through triggers, rather than solving problems
>>through proper analysis, can end up with a less-than-optimal solution. In a
>>similar fashion to the "guns don't kill people, people with guns do"
>>argument, I have to ask - is the improper use of triggers the fault of the
>>technology?

>
> No.
>
>
>>
>>I also find that CONSTRAINTS are under-utilized.  Developers tend to code
>>constraints into the application, or worse, code constraints into triggers.
>>For the class of constraints that can be declared, this is absolute
>>stupidity ... proof by blatant assertion (TM) <g>

>
> I agree 100%. I love constraints (PKs and FKs, that is; I have some
> minor reservations about check constraints). They are as basic as the
> data model.
>
>
>
> - Kenneth Koenraadt
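
And just to make that declarative point concrete - a rough sketch, with made-up DEPT/EMP tables: every rule below is declared once to the database and enforced no matter which application touches the data, with no trigger and no application code.

-- Sketch only: DEPT and EMP are made-up tables.
-- PK, FK and check rules are all declared, not coded.
CREATE TABLE dept (
  deptno  NUMBER(4)     CONSTRAINT dept_pk PRIMARY KEY,
  dname   VARCHAR2(30)  NOT NULL
);

CREATE TABLE emp (
  empno   NUMBER(6)     CONSTRAINT emp_pk PRIMARY KEY,
  deptno  NUMBER(4)     CONSTRAINT emp_dept_fk REFERENCES dept,
  salary  NUMBER(8,2)   CONSTRAINT emp_sal_ck CHECK (salary > 0)
);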

Thus, I think we've exhausted most of our discussion - we seem to agree that triggers are not necessarily evil, but that poor implementation and thoughtless utilization have abused the poor things.

And another myth has been discussed, and any debunking will probably be ignored to allow the myth to properly mature.
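
P.S. Since the subject line asked about a MAX+1 default in 9i, a parting sketch for what it's worth (table, column and sequence names are made up): the usual answer is a sequence fed from a BEFORE INSERT trigger - not a SELECT MAX(id)+1 inside the trigger, which falls apart as soon as two sessions insert at the same time.

-- Sketch only: T, ID and T_SEQ are hypothetical names.
-- 9i has no identity columns, so a sequence does the numbering.
CREATE SEQUENCE t_seq;

CREATE OR REPLACE TRIGGER t_bi
  BEFORE INSERT ON t
  FOR EACH ROW
BEGIN
  IF :NEW.id IS NULL THEN
    SELECT t_seq.NEXTVAL INTO :NEW.id FROM dual;
  END IF;
END;
/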

/Hans

Received on Tue Mar 29 2005 - 12:15:39 CST

