Interface

From: Rui Anastácio <coreto03_at_axa-seguros.pt>
Date: Thu, 2 Sep 1999 14:49:54 +0100
Message-ID: <7qlvka$qc$1_at_duke.telepac.pt>



Hi !

I'm working on an interface between machines. I have to pass data from more than 30 tables related to some entity from Oracle to an IBM mainframe each time a new entity is added or changed. I could write triggers on insert and on update to handle these situations: each time a trigger fires, I could mark the record as changed. But it's a large number of tables, so I'm using a different approach.
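For comparison, this is a minimal sketch of what one such trigger could look like, assuming a hypothetical CHANGED_FLAG column on the table (the names are illustrative, not my real schema):

    CREATE OR REPLACE TRIGGER entity_master_mark_changed
    BEFORE INSERT OR UPDATE ON entity_master
    FOR EACH ROW
    BEGIN
      -- Flag the row so a nightly job can pick it up later.
      :NEW.changed_flag := 'Y';
    END;
    /

Multiplying this by 30+ tables is what I want to avoid.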

At the end of the day I will select * from the master table and build a text file (600 bytes per record) with all the entity information, even if no changes were made (a mirror of the data). The problem is that this file is big: more than 15 MB.

Since 95% of the data has no changes and is old, it's a waste of time to process it. I'm thinking about an auxiliary table to hold this data and a method to determine which data to pass, something like the following (a PL/SQL sketch follows the list):

    set all records of the auxiliary table to "deleted" (if a record is not found in the master table it means it was deleted)

    fetch the master table entity record
    fetch all other entity data
    form the complete record to export
    go to the auxiliary table and see if the complete record exists:

      if it doesn't, then insert it and mark it as "to transfer" (new)
      if it does and it's different, then mark it as "to transfer" (changed)
      if it exists and it's equal, then mark it as "not to pass" (exists but no change)
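A minimal sketch of that compare-and-flag pass, assuming a hypothetical AUX_EXPORT(entity_id, flat_record, status) table and a hypothetical function BUILD_FLAT_RECORD that concatenates the 600-byte record from the detail tables (all names are illustrative):

    DECLARE
      v_flat  aux_export.flat_record%TYPE;
    BEGIN
      -- Step 1: assume everything was deleted until it is seen again.
      UPDATE aux_export SET status = 'DELETED';

      FOR m IN (SELECT entity_id FROM entity_master) LOOP
        v_flat := build_flat_record(m.entity_id);

        -- Step 2: existing entity -- flag only real changes for transfer.
        UPDATE aux_export a
           SET a.status      = DECODE(a.flat_record, v_flat, 'NO_CHANGE', 'TRANSFER'),
               a.flat_record = v_flat
         WHERE a.entity_id = m.entity_id;

        -- Step 3: new entity -- insert it and mark it for transfer.
        IF SQL%ROWCOUNT = 0 THEN
          INSERT INTO aux_export (entity_id, flat_record, status)
          VALUES (m.entity_id, v_flat, 'TRANSFER');
        END IF;
      END LOOP;

      COMMIT;
    END;
    /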

The first time, I will pass all the data to the file and the auxiliary table. From that point on, only the records not marked "not to pass" will be saved to the export file and processed by the IBM mainframe.
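The nightly export would then just pick the flagged rows, something like this (again with the illustrative names above):

    SELECT flat_record
      FROM aux_export
     WHERE status <> 'NO_CHANGE';   -- everything new, changed or deleted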

I would like to know your opinion, suggestions, comments, and improvements. Is there a better way to solve this kind of problem?

Regards
Rui Anastácio
