Re: What is the logic of storing XML in a Database?

From: Bob Badour <>
Date: Wed, 28 Mar 2007 15:13:27 GMT
Message-ID: <rCvOh.16501$>

David Cressey wrote:
> "Bernard Peek" <> wrote in message

>>From my experience of replacing legacy formats with XML the main benefits

> It depends on what you include as a "legacy format". Some XML evangelizers
> are calling SQL tables a "legacy format". I don't believe them for an
> instant, but it's worth making this explicit.

>>Data can be validated before it's transmitted. Validation against a schema
>>will trap most major errors. It will trap most of the minor errors that
>>would normally require action by an expensive and extremely bored human
>>being.

>>Therefore it reduces processing costs and staff turnover.

To amplify David's response, I would observe that having a formal specification to describe the content has benefits. Bernard, though, presents the above as an absolute advantage of XML.

Certainly, one can validate XML. The XML folks have, in fact, invented several such formal specification languages. My question is: what advantage do any of DTDs and XML Schema have over their predecessors, including COBOL copy books, regular expressions, BNF grammars, SQL schemas, AWK, Perl, etc.?

Compared to what came before, the XML alternatives seem less functional and extremely bloated.
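To make the comparison concrete, here is a sketch of the kind of pre-XML validation being alluded to. The record layout (id, name, ISO date) is hypothetical, not from any real interchange standard; the point is that a one-line regular expression does the job that a DTD or XML Schema would spread over dozens of lines of markup.

```python
import re

# Hypothetical record layout: id,name,date -- validated with a single
# regular expression, a technique that predates DTDs and XML Schema.
record_re = re.compile(r"^\d+,[A-Za-z. ]+,\d{4}-\d{2}-\d{2}$")

def is_valid(line: str) -> bool:
    """Return True when the line matches the expected record layout."""
    return record_re.match(line) is not None

print(is_valid("42,Bob Badour,2007-03-28"))   # True
print(is_valid("42,Bob Badour,not-a-date"))   # False
```

The equivalent XML Schema would need element declarations, type definitions, and namespace boilerplate to express the same three constraints.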

>>Errors are rejected by a machine. That usually makes it the sender's
>>responsibility to check and correct the data. Making that unambiguous saves
>>a lot of time and endless arguments between business partners.

If the receiver fails to check the data, the receiver is an idiot.

> and a DBMS doesn't do this?

Neither cures stupidity.

>>Code to handle XML is standardised and therefore doesn't need to be
>>rewritten for each individual application. This makes it more reliable and
>>cheaper to develop and maintain.

> and a DBMS interface language doesn't do this?

BNF didn't do this? Regular expressions didn't do this? COBOL copy books didn't do this? AWK didn't do this? Perl didn't do this? XML invented several new "standards" forcing folks to rewrite everything multiple times in any case.

"Oh, you use DTD? We use XML Schema."

>>It is difficult to extend CSV systems beyond the simple flat-file system
>>with a single record type.

Horseshit. I have seen no shortage of CSV files with multiple record types.
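The classic layout looks something like this (the invoice data is made up, but the convention is real and long predates XML): the first field tags each line as a header, detail, or trailer record, giving one flat file several record types.

```python
import csv
import io

# A sketch of a multi-record-type CSV: the first field tags each line
# as H(eader), D(etail), or T(railer). Data is hypothetical.
data = """H,INV-1001,2007-03-28
D,widget,3,9.99
D,gadget,1,24.50
T,2
"""

records = {"H": [], "D": [], "T": []}
for row in csv.reader(io.StringIO(data)):
    records[row[0]].append(row[1:])

print(len(records["D"]))    # two detail lines
print(records["T"][0][0])   # trailer carries the detail count: '2'
```

The trailer's record count doubles as a cheap integrity check, another pre-XML validation trick.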

>>Traditionally, at least in the systems I've
>>worked with, the solution is to denormalise the data from more than one
>>table. Therefore CSV is usually more verbose than XML and can take up much
>>more storage space. (The storage space argument isn't one I usually have a
>>lot of time for - it's not usually worth bothering with.)

> CSV is useful (among other things) for unloading an SQL table into a text
> file,
> transporting the text file into a completely different environment, and
> loading the data into another SQL table. There are better ways, but CSV
> sometimes works where the better ways are unavailable.
> You can deal with normalization problems by simply unloading separate tables
> to separate CSV files. Your resulting CSV files will typically be smaller
> than XML files.

UUENCODED CSV files will typically be smaller than XML files.
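Even without uuencoding, the raw size difference is easy to demonstrate. A sketch, with made-up rows and illustrative element names, rendering the same data both ways:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical rows; field names and XML element tags are illustrative.
rows = [("1", "widget", "9.99"), ("2", "gadget", "24.50")]

# CSV rendering.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
csv_text = buf.getvalue()

# The same data as element-per-field XML.
root = ET.Element("parts")
for pid, name, price in rows:
    part = ET.SubElement(root, "part")
    ET.SubElement(part, "id").text = pid
    ET.SubElement(part, "name").text = name
    ET.SubElement(part, "price").text = price
xml_text = ET.tostring(root, encoding="unicode")

print(len(csv_text), len(xml_text))  # the CSV is a fraction of the XML's size
```

The XML tags repeat the field names on every row, which is where the bloat comes from; CSV states them at most once, in an optional header line.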

>>XML data is not generally manually edited, this is a huge advantage. Fixing
>>manually prepared data files soaks up vast amounts of time and effort. It's
>>more likely that XML files will be generated and read by automated systems
>>than by someone typing data. That makes XML data much more reliable than
Wasn't the fact that XML is plain text that humans can understand even in a text editor one of the early marketing pitches? I could swear it was.

> You can do the same thing with CSV. I've done it.

One can automate just about everything from playing video games to driving tractors to clustering articles by relevance.