Daniel Morgan wrote:
> My instinct would be to go for a duplicate table set loaded by an AFTER
> UPDATE / AFTER DELETE trigger.
>
> That way you have clear physical separation between current data and older
> data. Then I would build an application into which one entered a single
> date and that displayed information pertinent only to that date.
I disagree with you here, Daniel. This is about insurance products and the
total size is not large in today's terms. I'd guess the base table will likely
have an average row size of around 4K, the details around 10K, so with the 10x
factor the whole thing is under 150 GB - a moderate transaction system today.
- 10 record changes are not excessive and should be easily handled by the CBO.
Some thought might be given to the indexing scheme, and some consideration of
disk & tablespace balancing might be in order, but it's not a biggie. IMHO,
it's better to use tools such as the OEM Tuning Pack to manage performance
here.
- I suspect that this application is well suited to partitioning (likely list,
or range on a name column). Although that is a separate-cost option, the gains
in performance and administration will likely outweigh the cost in the long run.
- invariably someone needs a report from the production system, usually for
legal reasons, which needs selection across multiple subsets of effective
dates. Having done exactly what you propose, I found that the joins across
the multiple tables (master + history against the details) become complex and
fraught with the possibility of mistakes. Two days before the fraud trial is
not the time to realize the join was wrong!
- the simplest solution for me is using "Effective_Date" and "Expiry_Date"
columns. I'd even consider including Effective_Date as part of the key, but
that needs careful review before jumping on it. I also always try to add
"Created_DTTM", "Created_By", "LastUpd_DTTM" and "LastUpd_By" columns when I
can get away with it, and that's saved my nut several times ... DTTM is
Date/Time.
Received on Wed Apr 30 2003 - 00:04:17 CDT