Re: OI and 'business intelligence' and reality

From: mountain man <hobbit_at_southern_seaweed.com.op>
Date: Tue, 03 May 2005 11:19:29 GMT
Message-ID: <57Jde.1657$31.351_at_news-server.bigpond.net.au>


"Kenneth Downs" <knode.wants.this_at_see.sigblock> wrote in message news:i46ck2-jce.ln1_at_pluto.downsfam.net...
> mountain man wrote:
>
>>
>> No, in fact I think you missed the point.
>> The definition was simply
>> OI = Sum (Data) + Sum(SoftwareCode)
>>
>> The domains are capable of union.
>> Why wouldn't they be?
>>
>> Get the RDBMS data file and throw it in a directory.
>> Now you have the data side of the domain.
>>
>> Get the OS software and the RDBMS software
>> and all the application software components, and
>> throw them into the same directory.
>>
>> Reasonably simple exercise. Reality check.
>>
>> The sum of this represents the present instance
>> of the DATA for an organisation and the SOFTWARE
>> at various layers in use at that organisation.
>>
>>
>
> But what are you summing? Bytes?

  1. I assert OI to be present in some measure (positive or negative) in every element of data and every line of program code, even though this measure might be exceedingly small in some instances.
  2. When an instance of the system (data + code) is running ([r]dbms assumed), it follows from 1 that a total measure (in respect of the organisation's investment in development) can be obtained. NOTE: that is all; I am not claiming a formula for measuring a value.
  3. This total amount of OI is bounded by the union of all data (of course including the schema) and all software (in fact all lines of code) that the organisation has developed at that time. NB: it has all been assembled into a directory above.
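
To restate points 1 to 3 in rough notation (a sketch only; the per-element measure mu below is my own placeholder, and I do not claim to be able to compute it):

  OI(t) = Sum over d in Data(t) of mu(d)  +  Sum over c in Code(t) of mu(c)

where Data(t) is every element of data (schema included) and Code(t) every line of software the organisation holds at time t, and mu may be negative (bugs, bad data). The quantity of interest between two reviews is then the change OI(t2) - OI(t1), not any absolute value.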

> Seems you can only measure the usefulness
> of any commodity by its contribution to the bottom line.

That's correct, and such an assessment often requires considerable experience and knowledge of the specific environment: on the one hand the schema of the database, and on the other the application software programs.

The bottom line is intelligence, not data.

It is the (artificial) intelligence written into the code and schema to support the organisation that is being summed (in principle) to arrive at the total.

> Putting some
> programs into that directory can REDUCE the bottom line.

Of course it can.
Negative intelligence (e.g. bugs) exists.

> To have a workable concept you have to better define the operator (+) and
> the operands (code) and (data).

I am not yet ready for a workable concept.
I am trying to develop a principle.
Let me express the principle in another manner.

Suppose we were actually to go to some organisation, collect all data and code as described above into a single directory, and then examine it critically from the perspective of IT management of that organisation.

It represents the present instance of development. It is the lifeblood of the organisation (from the IT perspective).

Suppose now that we returned to that organisation after a period of time had elapsed and performed the same review of this directory.

Now we would naturally ask what the difference is between the earlier state of the directory and its present state, and we could determine which elements have changed.
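
As a rough illustration only (the directory path, function names and use of file hashes here are my own; nothing in this sketch measures OI itself, it merely identifies which elements changed), the before/after comparison could be as mechanical as hashing every file in the directory at each review and diffing the two snapshots:

  import hashlib
  import os

  def snapshot(root):
      """Map each file under root to a hash of its contents."""
      state = {}
      for dirpath, _dirs, files in os.walk(root):
          for name in files:
              path = os.path.join(dirpath, name)
              with open(path, "rb") as f:
                  state[os.path.relpath(path, root)] = hashlib.sha256(f.read()).hexdigest()
      return state

  def diff(before, after):
      """Return the elements added, removed and changed between two reviews."""
      added   = sorted(set(after) - set(before))
      removed = sorted(set(before) - set(after))
      changed = sorted(p for p in set(before) & set(after) if before[p] != after[p])
      return added, removed, changed

  # first review:   state_1 = snapshot("/org/oi-directory")
  # second review:  state_2 = snapshot("/org/oi-directory")
  # developments:   diff(state_1, state_2)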

These changes represent developments implemented by the organisation. Let's have a look at them.

Say we find there have been two separate developments in this intervening period of time.

  1. Schema change: a column has been added to the client table to reflect a credit control status, and a credit control status table has been added to the database. This table holds credit control codes, descriptions and a processing flag.
  2. Application change: lines have been added to a number of application programs so that, if a client's credit control status carries a processing flag of "NO", further work, data entry and processing for that client is inhibited via an error message directing all users in the organisation to the accounts/finance section. (A rough sketch of both changes follows this list.)
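
Purely as a sketch of what such a development looks like in data and code terms (all table, column and function names here are invented for illustration, not taken from any real system), the two changes might amount to something like this:

  import sqlite3

  conn = sqlite3.connect(":memory:")   # stand-in for the organisation's database

  # Development 1: the schema change.
  conn.executescript("""
      CREATE TABLE client (client_id INTEGER PRIMARY KEY, name TEXT);

      CREATE TABLE credit_control_status (
          code        TEXT PRIMARY KEY,
          description TEXT,
          processing  TEXT CHECK (processing IN ('YES', 'NO'))
      );

      ALTER TABLE client ADD COLUMN credit_control_status
          TEXT REFERENCES credit_control_status(code);
  """)

  # Development 2: the check added (in spirit) to each affected application program.
  def ensure_client_may_proceed(conn, client_id):
      row = conn.execute(
          """SELECT s.processing
               FROM client c
               LEFT JOIN credit_control_status s
                 ON c.credit_control_status = s.code
              WHERE c.client_id = ?""", (client_id,)).fetchone()
      if row and row[0] == "NO":
          raise RuntimeError("Client is on credit hold - "
                             "please contact the accounts/finance section.")

On the view I am putting forward, every one of those added schema elements and lines of code is a small increment of the organisation's OI (and a buggy one is a decrement).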

NOTE: The above example paints a fairly major development; however, the principle I am trying to outline holds no matter how small the development during the intervening period is. For example, the entire development could have been the reformatting of a report.

In all instances, the development increases the (artificial) intelligence of the organisation. Of course, as you noted, some developments are failures, in which case the change of OI is negative.

Note that I do not claim to measure anything in the absolute sense, as all of this is relative only to the organisation. It is the organisation which funds development for the purpose of constructing a system of "intelligence" that may be relied upon and gradually incremented.

Reality exists with data and code heavily intertwined. Both of these contribute to OI, but again, I do not have to allocate the relative abundance or be specific, because at the moment I am dealing only with the principle of the concept of OI.

Organisations, and business in general, are tending towards the specialisation of their processes through the development of their computer (database-related) systems.

I contend, from a theoretical perspective (and from the perspective of IT management of that organisation), that what is actually being developed here may validly be termed "artificial intelligence", or OI.

Am I making myself understood here,
or is this still viewed as metaphysics?

Pete Brown
Falls Creek
Oz
www.mountainman.com.au