Re: A new proof of the superiority of set oriented approaches: numerical/time series linear interpolation
Date: 2 May 2007 12:07:15 -0700
Message-ID: <1178132835.210921.48340_at_p77g2000hsh.googlegroups.com>
What I did is called conjugacy in mathematics. Could I recommend "Bypasses: A Simple Approach to Complexity" by Z.A. Melzak (ISBN10: 0-471-86854-X)?
The basic idea is that you can use a transform and its inverse to make a problem simple, writing the hard operation as the conjugate (F ∘ G ∘ F⁻¹), provided certain conditions are met. It is how logarithms and integral transforms work, but the idea is more general than analysis.
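Here is a toy sketch of the idea in Python, using the classic example: multiplication done "by bypass" through logarithms. The function names are mine, not Melzak's, and this is only an illustration of the shape (F, G, F⁻¹), not his general theory:

```python
import math

def conjugate(f, g, f_inv):
    """Build the hard operation as F^-1 composed with G composed with F:
    transform the inputs, do the easy operation G, transform back."""
    return lambda x, y: f_inv(g(f(x), f(y)))

# F = log, G = addition (easy in the transformed domain), F^-1 = exp.
multiply = conjugate(math.log, lambda a, b: a + b, math.exp)

print(round(multiply(6.0, 7.0), 6))
```

The same skeleton covers integral transforms: replace log/exp with the transform pair and addition with whatever operation is easy in the transformed domain.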
I suggested that the new OLAP operators could be used to easily convert your original de-normalized data into a format that is easier to work with. If I had wanted to do this with OLAP functions, I would have written it with a single "AVG() OVER()" instead. That would be very compact and still within the rules about using only set-oriented code.
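For readers who have not seen the windowed form, here is a minimal sketch of what one "AVG() OVER()" looks like, run here through SQLite's window functions (3.25+); the table and column names are made up for illustration, not taken from the original thread:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (t INTEGER PRIMARY KEY, val REAL)")
con.executemany("INSERT INTO readings VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0), (4, 40.0)])

# One set-oriented expression: a moving average over the current row
# and its neighbors, instead of a self-join or procedural loop.
rows = con.execute("""
    SELECT t,
           AVG(val) OVER (ORDER BY t
                          ROWS BETWEEN 1 PRECEDING
                               AND 1 FOLLOWING) AS avg_val
      FROM readings
     ORDER BY t
""").fetchall()
for t, a in rows:
    print(t, a)
```

The window frame clause is doing the work that the posted query spread over several correlated subqueries.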
>> .. he shows once more demonstrated he has no clue about RM. <<
Really? Then why did you use a technique that has been in SQL FOR SMARTIES since the first edition over ten years ago? Did you notice that your original data was a copy of a clipboard used to gather raw data, so the rows did not model a complete fact?
Time comes in durations (Snodgrass, et al), so each leg of the trip needs either a (start_time, end_time) pair or a known origin point and a displacement from it. You had the fact split across multiple rows. When your data model is this bad, you are forced into complex queries, such as the one you posted.
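A quick sketch of the (start_time, end_time) model, again in SQLite; the table and column names are hypothetical. Because each row is now a complete fact, a leg's duration is a simple per-row expression rather than a join that glues half-facts back together:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE trip_legs (
        leg_nbr    INTEGER PRIMARY KEY,
        start_time TEXT NOT NULL,
        end_time   TEXT NOT NULL,
        CHECK (start_time < end_time)  -- a duration, not a point
    )
""")
con.executemany("INSERT INTO trip_legs VALUES (?, ?, ?)",
                [(1, "2007-05-02 08:00", "2007-05-02 09:30"),
                 (2, "2007-05-02 09:45", "2007-05-02 11:00")])

# Duration in minutes, one row per leg.
rows = con.execute("""
    SELECT leg_nbr,
           (julianday(end_time) - julianday(start_time)) * 24 * 60
               AS minutes
      FROM trip_legs
     ORDER BY leg_nbr
""").fetchall()
for leg, minutes in rows:
    print(leg, round(minutes))
```

The CHECK constraint is part of the point: the row cannot model half a fact.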
I have just started playing with conjugacy for the RM, so I do not have a lot of examples, much less a general theory, yet. If anyone wants to talk about it, I probably ought to start another thread.