Re: Object-relational impedance
Date: Tue, 4 Mar 2008 23:19:42 +0100
> On 2008-03-04, Dmitry A. Kazakov <mailbox_at_dmitry-kazakov.de> wrote:
>> On Tue, 4 Mar 2008 17:58:02 +0000, Eric wrote:
>>
>>> On 2008-03-04, Dmitry A. Kazakov <mailbox_at_dmitry-kazakov.de> wrote:
>>>> On Tue, 4 Mar 2008 15:41:40 +0000, Eric wrote:
>>>>
>>>>> On 2008-03-04, Dmitry A. Kazakov <mailbox_at_dmitry-kazakov.de> wrote:
>>>>>> On Mon, 3 Mar 2008 23:03:41 +0000, Eric wrote:
>>>>>>
>>>>>>> No, RDBs partition data so that it is sensibly and easily available to
>>>>>>> any possible application. So if you use OO you are saying "there will
>>>>>>> never be any other application that will need my data".
>>>>>>
>>>>>> No, it is engineering which says so. It translates as "put the requirements
>>>>>> first," or simpler "pigs do not fly."
>>>>>
>>>>> So no-one ever says "we should be able to get that stuff out of the xyz
>>>>> application and combine it with our data so that we can..."!
>>>>
>>>> You should plan this use case in advance. That would be a requirement. A
>>>> system can only do things it was designed for. (This applies to RDBMS as
>>>> well). For each application exist things it cannot do. That implies: either
>>>> A) there will never be any other application that will ask to do these, or
>>>> B) the application is incorrect (= does not fulfill the requirements).
>>>>
>>> So you will always know, in advance, what all the possible future
>>> applications will want! I hope you are not crazy enough to believe that.
>>
>> No, I am. When I am looking for a solution I have to know what is the
>> problem. Is that crazy? Further, dealing with a generalized problem I shall
>> consider what would be the consequences of such generalization. There is
>> always a price to pay. You certainly have heard about computability, NP
>> problems and such stuff. But just going from 1ms to 100µs makes a huge
>> difference.
> Everything has a price. You have to choose. What I see is someone taking
> only the short-term view.
Huh, now, after all these cries that a DB is not about persistence... What makes the temporal aspect so relevant, then?
>>> So you are left with minimising "things it cannot do". I guess that
>>> means you should have something which can make the data available to any
>>> application that asks, according to any logically possible criterion.
>>> Did you know that this is what an RDBMS does?
>>
>> No it does not, when "asking" is defined as diffuse as in the natural
>> language. There exist certain limitations on what and how can be asked.
> That's what I said - logically possible criteria.
>> These limitations should be specified as functional and non-functional >> requirements.
> If possible. What I meant was that you should minimise the limitations
> on both the expected and the unknown futures.
No optimum exists under these conditions.
>> If you prefer to buy a cat in the bag named RDBMS (or
>> whatever), that's up to you. I merely state that there is always something
>> in any bag.
> Cat? What cat? But actually, see what I said above about price.
You said that a price has to be paid. Right, but the question is the performance/price ratio. You can buy a bigger car, but it will require more gasoline and be more difficult to park. Software development is expensive.
>> As for the bag RDBMS, among the things it contains are
>> object-relational impedance,
> You made this one up because you don't understand.
What did I make up? That the impedance exists, or that it does not?
> OK, it's not the perfect language, but what is? And it is possible to
> have an RDBMS that doesn't use it.
>> poor performance,
> Relative to what? Where are the tests? Do you install an RDBMS product
> and just go with whatever myths you have heard lately, or do you get a
> product specialist to sort it out?
Come on, show me a nearest-neighbour search in ten-dimensional space implemented in an RDBMS. What would its complexity be? You should clearly understand that it is possible to break the neck of *any* indexing method. This refutes the argument about "any logically possible criterion."
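To make the point concrete, here is a minimal sketch (the data set and dimensions are made up for illustration): a nearest-neighbour query over 10-dimensional points reduces to a full scan, because Euclidean distance is not an ordering on any single column, so an ordinary B-tree index cannot serve it.

```python
import math
import random

def nearest_neighbour(points, query):
    """Brute-force nearest neighbour: the only plan a plain B-tree
    index can offer here is a full table scan, since distance in ten
    dimensions does not reduce to a range over any one attribute."""
    return min(points, key=lambda p: math.dist(p, query))

random.seed(42)
DIM = 10
points = [tuple(random.random() for _ in range(DIM)) for _ in range(1000)]
query = tuple(0.5 for _ in range(DIM))

best = nearest_neighbour(points, query)
print(best, math.dist(best, query))
```

The scan is O(n) per query; spatial structures (k-d trees and the like) degrade toward the same O(n) as the dimensionality grows, which is the "break the neck of any indexing method" effect.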
>> unpredictable behavior,
> Please explain. Unless you're talking about bugs, but everything has
And what means are available to prevent bugs? How many RDBMSes support static analysis? SQL is practically untyped. Design by contract, how? Code reuse is close to none; well, code is evil, so why should we reuse it? Upper bounds on memory footprint? On response times?
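As one illustration of "practically untyped" (SQLite is an extreme case, chosen only because it ships with Python; server RDBMSes are stricter, but type errors in SQL still surface at runtime, not at analysis time):

```python
import sqlite3

# The column is declared INTEGER, yet a string is inserted without any
# error: SQLite's type affinity stores the mismatched value as TEXT.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (n INTEGER)")
con.execute("INSERT INTO t VALUES (?)", ("not a number",))
row = con.execute("SELECT n, typeof(n) FROM t").fetchone()
print(row)  # ('not a number', 'text') -- the mismatch shows up only at runtime
```

No compiler, no contract, nothing rejects this statically; compare that with any language where `n : Integer := "not a number";` would not even compile.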
>> maintenance costs,
> Everything has those too. Again, I have to assume that you take only the
> short-term view.
No, I mean long-term maintenance costs.
> What else would you like to make up?
Actually I don't want to concentrate on a critique of RDBMS. It is hardware to me. I would buy one in case I needed it.
My objection is rather to the data-centric view, which is IMO the reason for the object-relational impedance. BTW, I see nothing wrong in RA, which is in my view fully independent of the notion of data. RA would fit nicely into OO as a set of types with corresponding operations.
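A minimal sketch of what that could look like (all names and the sample relations are my own invention, not any existing library): a relation as an ordinary typed value whose operations are the RA operators.

```python
from itertools import product

class Relation:
    """A relation as an OO type: restriction, projection and natural
    join are just methods returning new Relation values."""

    def __init__(self, attrs, tuples):
        self.attrs = tuple(attrs)
        self.tuples = {tuple(t) for t in tuples}

    def select(self, pred):  # restriction by an arbitrary predicate
        return Relation(self.attrs,
                        {t for t in self.tuples
                         if pred(dict(zip(self.attrs, t)))})

    def project(self, attrs):  # projection onto a subset of attributes
        idx = [self.attrs.index(a) for a in attrs]
        return Relation(attrs,
                        {tuple(t[i] for i in idx) for t in self.tuples})

    def join(self, other):  # natural join on the common attributes
        common = [a for a in self.attrs if a in other.attrs]
        rest = [a for a in other.attrs if a not in common]
        out = set()
        for t, u in product(self.tuples, other.tuples):
            dt, du = dict(zip(self.attrs, t)), dict(zip(other.attrs, u))
            if all(dt[a] == du[a] for a in common):
                out.add(t + tuple(du[a] for a in rest))
        return Relation(self.attrs + tuple(rest), out)

emp = Relation(("name", "dept"), [("alice", 1), ("bob", 2)])
dept = Relation(("dept", "title"), [(1, "control"), (2, "acquisition")])
print(emp.join(dept).project(("name", "title")).tuples)
```

Nothing here presumes a database, a schema or persistence: the algebra stands on its own as types plus operations, which is exactly the claim.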
>>> Perhaps not, since you have also said that "data are irrelevant".
>>
>> Yes, I did. I am working mainly in the area of industrial data acquisition
>> and control. It might sound funny, but being so close to "data" one starts
>> to better understand why data are irrelevant.
> Aha! Your data is transient, and what you are mostly doing is
> transforming it.
Well, one viewed it this way in the 50-60s, I guess. But it is a long time since this data-centric view of the system as a huge signal filter was dropped. The model does not scale and is inadequate (event-controlled and non-numeric things, GUI, etc.).
> I at least have no problem with using OO programming
> for that. Also, that explains your short-term view. But what do you do
> with the data (presumably transformed) that does get kept for longer?
> Put it somewhere that will be available for a variety of expected and
> unexpected uses? But we were here before!
Yes, here we go again. Data are meaningless if the usage is unexpected. Nobody can use a CD-ROM in a wind-up phonograph, deaf people notably.
The system does not keep anything; it exists and behaves. Deployment of a DB there is always problematic. There has been much ongoing effort in this area in recent years, mainly to standardize the schemas. That is not enough, because the relational model does not fit. Channels are largely event-controlled, with time stamps, so you cannot make any reasonable relations beyond (time, value) without data corruption. The queries would be like "give me the oil temperature profile when the velocity was out of range for longer than 10s before the event E." You need various interpolation methods, calculated channels, and channels simulated from previously recorded measurements. Channels are created and destroyed; their properties change. The end effect is that when DBs are used at all, it is only marginally. I remember an amusing customer requirement: "we want to be able to run our tests even if the DB server is off-line." (:-))
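A sketch of such a query over raw (time, value) event channels (the channel data, the limit, and the 10 s threshold are invented for illustration): values hold until the next event, i.e. step interpolation, which is one reason a flat relational table of uniform samples misrepresents this kind of data.

```python
import bisect

# Hypothetical event channels: (time, value) pairs, value holds until
# the next event. Events in the two channels are not aligned in time.
velocity = [(0, 5.0), (20, 12.0), (45, 6.0), (80, 13.0), (86, 4.0)]
oil_temp = [(0, 70.0), (30, 85.0), (60, 90.0), (84, 95.0)]
LIMIT = 10.0

def value_at(channel, t):
    """Step-interpolated value of an event channel at time t."""
    i = bisect.bisect_right([e[0] for e in channel], t) - 1
    return channel[max(i, 0)][1]

def out_of_range_intervals(channel, limit, min_len, horizon):
    """Intervals where the channel exceeds limit for longer than min_len.
    Scanning event boundaries suffices, since values only change there."""
    times = [t for t, _ in channel] + [horizon]
    out, start = [], None
    for t0 in times:
        if t0 < horizon and value_at(channel, t0) > limit:
            start = t0 if start is None else start
        else:
            if start is not None and t0 - start > min_len:
                out.append((start, t0))
            start = None
    return out

# "Oil temperature profile while velocity was out of range for > 10 s":
for a, b in out_of_range_intervals(velocity, LIMIT, 10, 100):
    profile = [(t, v) for t, v in oil_temp if a <= t < b]
    print((a, b), profile or [(a, value_at(oil_temp, a))])
```

Note that answering even this simple query needed interpolation and interval logic that plain relational joins over (time, value) rows do not express; the temperature events falling inside an interval may even be empty, forcing an interpolated fallback.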
-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de