Data export to flat file (merged 3) (Oracle Database 11g Enterprise Edition Release 11.2.0.3.0, Linux)
Data export to flat file (merged 3) [message #645253] Tue, 01 December 2015 10:15
raghx
Messages: 35
Registered: March 2006
Location: Bangalore
Member
Hi All,
We have around 100-odd tables from which we have to export the data into CSV files, first as an initial load and then as a daily delta.

We have also thought of an extraction approach, outlined below (a rough sketch follows the list):
1. Introduce a timestamp column in all of the 100-odd tables.
2. Update all the previous/historical records with timestamp_col = sysdate - 2, and refer to this value during the initial-load activity.
3. Put insert and update triggers on all tables to maintain the timestamp column.
4. Write a SELECT with WHERE timestamp_col = sysdate - 1 to extract the previous day's data.
5. Spool the results and generate a flat file.
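Something along these lines is what we have in mind. The ORDERS table, its columns, and the trigger name below are only placeholders, and the extract uses a range predicate rather than the equality from step 4, since the timestamp carries a time component:

-- Step 1 (placeholder table ORDERS; column name LAST_UPD is made up)
ALTER TABLE orders ADD (last_upd TIMESTAMP);

-- Step 2: stamp the historical rows for the initial load
UPDATE orders SET last_upd = SYSDATE - 2;
COMMIT;

-- Step 3: keep the column current on every insert/update
CREATE OR REPLACE TRIGGER orders_last_upd_trg
  BEFORE INSERT OR UPDATE ON orders
  FOR EACH ROW
BEGIN
  :NEW.last_upd := SYSTIMESTAMP;
END;
/

-- Steps 4 and 5: spool yesterday's changes to a CSV file
SET HEADING OFF PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON LINESIZE 1000
SPOOL orders_delta.csv
SELECT order_id || ',' || order_status || ',' ||
       TO_CHAR(last_upd, 'YYYY-MM-DD HH24:MI:SS')
  FROM orders
 WHERE last_upd >= TRUNC(SYSDATE) - 1
   AND last_upd <  TRUNC(SYSDATE);
SPOOL OFF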

The worry is that some of these tables hold millions of records. Kindly provide your inputs on the following:
1. How can we handle the existing records when the new column is introduced? I mean, is there a good way to populate the new column?
2. What is the best approach to extract the delta, apart from the above solution?


Kind Regards,
Re: Data export to flat file (merged 3) [message #645259 is a reply to message #645253] Tue, 01 December 2015 10:32
Littlefoot
Messages: 21807
Registered: June 2005
Location: Croatia, Europe
Senior Member
Account Moderator
What problem are you trying to solve? I mean, what will you do with millions of records stored in CSV files? Maybe there's another approach to the problem (such as data pump export/import) ...
Re: Data export to flat file (merged 3) [message #645260 is a reply to message #645259] Tue, 01 December 2015 10:59
raghx
Messages: 35
Registered: March 2006
Location: Bangalore
Member
These CSV files will be SFTP'd to a data warehouse application which consumes the data.

Yes, I had briefly gone through Data Pump. But I thought that it could be used for the initial load, and we would just write a SQL script to generate the delta.
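For instance, the initial load for a single table might be something along these lines (the credentials, directory object, and table name below are just placeholders):

expdp app_user/password DIRECTORY=dpump_dir DUMPFILE=orders_init.dmp LOGFILE=orders_init.log TABLES=orders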

Should this be fine?
Re: Data export to flat file (merged 3) [message #645261 is a reply to message #645260] Tue, 01 December 2015 11:18
Michel Cadot
Messages: 68643
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator

Data Pump may also be right for the delta; have a look at the QUERY clause.
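For example (the directory, table, and column names below are only illustrative), a daily delta export can be driven by a parameter file so the quotes in QUERY survive the shell. Contents of orders_delta.par:

DIRECTORY=dpump_dir
DUMPFILE=orders_delta.dmp
LOGFILE=orders_delta.log
TABLES=orders
QUERY=orders:"WHERE last_upd >= TRUNC(SYSDATE)-1 AND last_upd < TRUNC(SYSDATE)"

Then:

expdp app_user/password PARFILE=orders_delta.par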

Re: Data export to flat file (merged 3) [message #645262 is a reply to message #645253] Tue, 01 December 2015 16:54
John Watson
Messages: 8930
Registered: January 2010
Location: Global Village
Senior Member
You have EE licences, so I would use the Change Data Capture facility,
http://docs.oracle.com/cd/E11882_01/server.112/e25554/cdc.htm#DWHSG016
It is based on Streams, which is officially deprecated in release 12 because Larry wants you to buy GoldenGate instead, but it is fully supported. From the docs:
If you are currently using Change Data Capture, then you will be able to continue to do so for the foreseeable future. However, Change Data Capture will not be further enhanced, and will only be supported based on the current, documented functionality.
I've found CDC to be rock solid reliable and the best way to ensure that every change is captured exactly once. You won't miss anything, you won't do anything twice.
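To give a flavour of the subscriber side, it is roughly this (the change set, source table, and column names below are invented for illustration; the publisher must already have created the change set and change table):

BEGIN
  -- create a subscription against an existing change set
  DBMS_CDC_SUBSCRIBE.CREATE_SUBSCRIPTION(
    change_set_name   => 'ORDERS_CHANGE_SET',
    description       => 'Daily CSV feed',
    subscription_name => 'ORDERS_SUB');

  -- subscribe to the columns of interest; Oracle creates a subscriber view
  DBMS_CDC_SUBSCRIBE.SUBSCRIBE(
    subscription_name => 'ORDERS_SUB',
    source_schema     => 'APP',
    source_table      => 'ORDERS',
    column_list       => 'ORDER_ID, ORDER_STATUS',
    subscriber_view   => 'ORDERS_SUB_VW');

  DBMS_CDC_SUBSCRIBE.ACTIVATE_SUBSCRIPTION(subscription_name => 'ORDERS_SUB');
END;
/

-- Each extraction cycle: move the window forward, spool the view, purge the window,
-- so every change is read exactly once
BEGIN
  DBMS_CDC_SUBSCRIBE.EXTEND_WINDOW(subscription_name => 'ORDERS_SUB');
END;
/
SELECT * FROM orders_sub_vw;
BEGIN
  DBMS_CDC_SUBSCRIBE.PURGE_WINDOW(subscription_name => 'ORDERS_SUB');
END;
/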
Re: Data export to flat file (merged 3) [message #645406 is a reply to message #645262] Sat, 05 December 2015 00:34
raghx
Messages: 35
Registered: March 2006
Location: Bangalore
Member
Yeah, we have a CDC implementation on other new apps. But this being legacy, they don't like to invest. :)
Re: Data export to flat file (merged 3) [message #645419 is a reply to message #645253] Sat, 05 December 2015 07:23
Solomon Yakobson
Messages: 3273
Registered: January 2010
Location: Connecticut, USA
Senior Member
Version??? Starting with 11g, adding a NOT NULL column with a deterministic default value takes no time. When you issue:

ALTER TABLE TBL
  ADD new_column TIMESTAMP DEFAULT deterministic-expression NOT NULL


Oracle is smart enough not to make any table data changes. It only updates the data dictionary, noting that the default value of column new_column is the value of deterministic-expression. Compare timing:

SQL> set timing on
SQL> drop table tbl purge
  2  /

Table dropped.

Elapsed: 00:00:00.02
SQL> create table tbl as select level l from dual connect by level <= 1000000
  2  /

Table created.

Elapsed: 00:00:00.61
SQL> alter table tbl
  2    add tstamp timestamp
  3  /

Table altered.

Elapsed: 00:00:00.02
SQL> update tbl
  2     set tstamp = timestamp '2015-12-01 00:00:00'
  3  /

1000000 rows updated.

Elapsed: 00:00:52.46
SQL> alter table tbl
  2    modify tstamp default systimestamp not null
  3  /

Table altered.

Elapsed: 00:00:00.06
SQL> insert
  2    into tbl(l)
  3    values(0)
  4  /

1 row created.

Elapsed: 00:00:00.01
SQL> select  *
  2    from  tbl
  3    where l <= 3
  4  /

         L TSTAMP
---------- ---------------------------------------------------------------------------
         1 01-DEC-15 12.00.00.000000 AM
         2 01-DEC-15 12.00.00.000000 AM
         3 01-DEC-15 12.00.00.000000 AM
         0 05-DEC-15 08.20.25.981000 AM

Elapsed: 00:00:00.05
SQL> drop table tbl purge
  2  /

Table dropped.

Elapsed: 00:00:00.22
SQL> create table tbl as select level l from dual connect by level <= 1000000
  2  /

Table created.

Elapsed: 00:00:00.63
SQL> alter table tbl
  2    add tstamp timestamp default timestamp '2015-12-01 00:00:00' not null
  3  /

Table altered.

Elapsed: 00:00:00.02
SQL> alter table tbl
  2    modify tstamp timestamp default systimestamp
  3  /

Table altered.

Elapsed: 00:00:00.01
SQL> insert
  2    into tbl(l)
  3    values(0)
  4  /

1 row created.

Elapsed: 00:00:00.01
SQL> select  *
  2    from  tbl
  3    where l <= 3
  4  /

         L TSTAMP
---------- -----------------------------------
         1 01-DEC-15 12.00.00.000000 AM
         2 01-DEC-15 12.00.00.000000 AM
         3 01-DEC-15 12.00.00.000000 AM
         0 05-DEC-15 08.20.27.244000 AM

Elapsed: 00:00:00.09
SQL> 


SY.