Re: Data type question

From: Teresa Robinson <usenet_at_teresarobinson.net>
Date: Fri, 19 Aug 2005 12:16:03 -0400
Message-ID: <ra0cg1hst9ibefd2216d6vgo2qmukqlequ_at_4ax.com>


"Ed Prochak" <ed.prochak_at_magicinterface.com> said on 18 Aug 2005 19:22:06 -0700, in comp.databases.oracle.tools...:

>What is the business need?
>Must all the changes go as one transaction? Or are they independent?
>If they are independent, you may want a way to checkpoint where your
>updates left off so that you do not need to go back and reprocess the
>temp table.
>
>The best solution solves the business need first.
> Ed

The program is for my boss, myself, or someone else who works with us to run when loading data into the db. This is the process:

  1. Raw data is loaded into one table, which is a permanent dump for all data.
  2. The raw data is processed out of the dump table and inserted into other tables according to what the data is.
  3. Errors in this step are written to two error tables: one gets the whole row, the other gets details.
  4. Errors are processed: either corrected and inserted into the other tables, or saved for processing later.
  5. A few times a year we will get a huge file with all the raw data that has been supplied before, with some exceptions:
     - existing records that have changed from what we have in our db,
     - new records, and
     - deleted records, which are records we have in the raw dump table that are not in the huge file.
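Assuming the huge file gets loaded into a staging table and that it shares a key with the permanent dump table, the three categories in step 5 can be picked out with set operations. All table and column names below (RAW_DUMP, STAGE, REC_ID, etc.) are made up for illustration:

```sql
-- New records: in the new file but not yet in our dump
SELECT s.rec_id
FROM   stage s
WHERE  NOT EXISTS (SELECT 1 FROM raw_dump d WHERE d.rec_id = s.rec_id);

-- Deleted records: in our dump but absent from the new file
SELECT d.rec_id
FROM   raw_dump d
WHERE  NOT EXISTS (SELECT 1 FROM stage s WHERE s.rec_id = d.rec_id);

-- Changed records: same key, different content
SELECT rec_id, city, state, zip FROM stage
MINUS
SELECT rec_id, city, state, zip FROM raw_dump;
```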

This is where my question comes in. During processing of this huge file, the changed records must be validated on various points, either in the code or by the user (myself or my boss): what the table/columns require (datatype, nullability) and also reference points such as valid geographic data. During this process we will have a subset of records whose data will be used to update one particular table, one of the ones in (2) above. There is no way to tell how many records this subset will contain; it could be as many as several thousand. Some of these records go together, such as city/state/zip: they would be separate records, but they would all update one record in the receiving table. All the others would each update one record.
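As an illustration of the reference-point checks, a lookup against a geography table can flag staged rows that fail validation before any human review. The names here (STAGE, VALID_GEO, ERROR_FLAG) are hypothetical:

```sql
-- Flag staged rows whose city/state/zip combination has no match
-- in a hypothetical reference table of valid geographic data.
UPDATE stage s
SET    s.error_flag = 'Y'
WHERE  NOT EXISTS (SELECT 1
                   FROM   valid_geo g
                   WHERE  g.city  = s.city
                   AND    g.state = s.state
                   AND    g.zip   = s.zip);
```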

In my VB code, I would usually hold this info in an array of a user-defined type that matches the tables the info comes out of and goes into. I would loop through this array and do the insertions/updates that way, making it all go at once.

I don't know how to do this with Forms (6i), but I have seen examples of the Record datatype and also the Varray datatype, which look like they might work. The other alternatives, building a SQL statement as the process goes or doing each update as the user moves from one record to the next, just don't appeal to me. There is also the possibility of using one column in the table being processed as a flag of sorts, which I am already doing for records that are marked as errors: I could set that flag to 'Y', meaning "update this record," and then write a routine that updates every record where UPDATE_RECORD = 'Y'.
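For what it's worth, the PL/SQL analogue of the VB typed-array approach would be an index-by table of records declared against the receiving table with %ROWTYPE, filled during validation, then walked in one pass. This is only a sketch; the table and column names (RECEIVING_TABLE, REC_ID, city/state/zip) are assumptions, not your actual schema:

```sql
DECLARE
  -- %ROWTYPE gives a record matching the receiving table's columns,
  -- much like a VB user-defined type matching the table layout.
  TYPE t_updates IS TABLE OF receiving_table%ROWTYPE
    INDEX BY BINARY_INTEGER;
  v_updates t_updates;
BEGIN
  -- ... fill v_updates while validating, combining related rows
  -- (e.g. city/state/zip) that target one receiving record ...
  FOR i IN 1 .. v_updates.COUNT LOOP
    UPDATE receiving_table
    SET    city  = v_updates(i).city,
           state = v_updates(i).state,
           zip   = v_updates(i).zip
    WHERE  rec_id = v_updates(i).rec_id;
  END LOOP;
  COMMIT;  -- one commit makes the whole batch a single transaction
END;
/
```

Committing once at the end keeps the batch atomic; if the updates are independent, committing per record (or using the UPDATE_RECORD = 'Y' flag as a checkpoint) would let a failed run resume where it left off.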

I want to put as little stress as possible on the db and keep processing time as short as possible. I'm not sure I've answered your question about the business need, but I'll be happy to respond further. Sorry to be so long-winded...

Thanks!

-- 
Teresa Robinson
Staff Programmer Analyst
Anteon Corporation
trobinson at anteon dot com

Received on Fri Aug 19 2005 - 18:16:03 CEST
