Split record in SQL*Loader [message #186778] Wed, 09 August 2006 08:17
hansgruijs
Messages: 10
Registered: March 2006
Location: The Netherlands (Amsterda...
Junior Member
Hello,

Is there a solution to the following SQL*Loader problem:

I have a data-file like:
123HHH345KKK

and a table like:
Naam Null? Type
----------------------- -------- --------------
ID NUMBER
NAME VARCHAR2(3)

I want the data record to be split into two records:
( 123, HHH)
and
( 345, KKK)

Is this possible in SQL*Loader?

Best regards, Hans
Re: Split record in SQL*Loader [message #186853 is a reply to message #186778] Wed, 09 August 2006 21:40
rleishman
Messages: 3724
Registered: October 2005
Location: Melbourne, Australia
Senior Member
Try including two INTO TABLE clauses in your CTL file that both load the same table. Use POSITION clauses (or named FILLER columns) so that the second clause picks up the record from character 7.
INTO TABLE tab
(col1 POSITION(1:3) CHAR
,col2 POSITION(4:6) CHAR
)
INTO TABLE tab
(col1 POSITION(7:9) CHAR
,col2 POSITION(10:12) CHAR
)



Ross Leishman
Re: Split record in SQL*Loader [message #186940 is a reply to message #186853] Thu, 10 August 2006 04:10
hansgruijs
Messages: 10
Registered: March 2006
Location: The Netherlands (Amsterda...
Junior Member
Thanks for your reply, but it is not exactly what I meant.

I don't know the length of the records in my data file; it varies, but it is always a multiple of 12 characters.
Let's say a record is 72 characters long: then I have to split it into 6 pieces and insert 6 records into my table.

regards, Hans
Re: Split record in SQL*Loader [message #187106 is a reply to message #186940] Thu, 10 August 2006 21:41
rleishman
Messages: 3724
Registered: October 2005
Location: Melbourne, Australia
Senior Member
Run the file through a filter to split each row into records. Perl is good for this sort of thing.
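A minimal sketch of that filter approach in Python rather than Perl (the file names and the 6-character record width are assumptions based on the sample data in this thread): it reads the data file and writes one fixed-width record per physical line, ready for a plain SQL*Loader control file.

```python
# Sketch of a pre-load filter: split each physical line of the data
# file into consecutive 6-character records (3-char id + 3-char name).
import sys

def split_records(line, width=6):
    """Return the consecutive fixed-width chunks of one physical line."""
    line = line.rstrip("\r\n")
    return [line[i:i + width] for i in range(0, len(line), width)]

def filter_file(src, dst, width=6):
    """Copy src to dst, writing one fixed-width record per output line."""
    for line in src:
        for chunk in split_records(line, width):
            if chunk:  # ignore empty trailing chunks on short lines
                dst.write(chunk + "\n")

if __name__ == "__main__":
    # e.g.  python split.py < test.dat > test_split.dat
    filter_file(sys.stdin, sys.stdout)
```

Because each output line is now one record, the control file no longer needs FIX or multiple INTO clauses.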

Ross Leishman
Re: Split record in SQL*Loader [message #187123 is a reply to message #186940] Fri, 11 August 2006 00:39
Barbara Boehmer
Messages: 8636
Registered: November 2002
Location: California, USA
Senior Member
You can use FIX to specify the length of each record, as shown below.

-- test.dat:
123HHH345KKK678AAA901BBB234CCC567DDD


-- test.ctl:
LOAD DATA
INFILE 'test.dat' "FIX 6"
INTO TABLE test_tab
FIELDS
(id POSITION (1:3),
name POSITION (4:6))


SCOTT@10gXE> CREATE TABLE test_tab
  2    (id   NUMBER,
  3  	name VARCHAR2 (3))
  4  /

Table created.

SCOTT@10gXE> HOST SQLLDR scott/tiger CONTROL=test.ctl

SCOTT@10gXE> SELECT * FROM test_tab
  2  /

        ID NAM
---------- ---
       123 HHH
       345 KKK
       678 AAA
       901 BBB
       234 CCC
       567 DDD

6 rows selected.

SCOTT@10gXE>


Re: Split record in SQL*Loader [message #187148 is a reply to message #187123] Fri, 11 August 2006 02:19
hansgruijs
Messages: 10
Registered: March 2006
Location: The Netherlands (Amsterda...
Junior Member
Barbara,
Wow, great stuff!
This is exactly what I was looking for.
Thanks, Hans

Ross,
Also thanks, but that was already my backup-scenario.
Regards, Hans
Re: Split record in SQL*Loader [message #187154 is a reply to message #187148] Fri, 11 August 2006 02:35
rleishman
Messages: 3724
Registered: October 2005
Location: Melbourne, Australia
Senior Member
Hmmm, haven't seen that one (have seen RECSIZE though), and the doco's a bit thin on it as well. Does it cope with the linefeed characters as being optional record delimiters as Hans requires? (Sorry, too bone-lazy to test myself).

Ross Leishman
Re: Split record in SQL*Loader [message #187253 is a reply to message #187154] Fri, 11 August 2006 09:02
Barbara Boehmer
Messages: 8636
Registered: November 2002
Location: California, USA
Senior Member
It counts the linefeeds as part of the fixed size, so if the data is on more than one line, it would cause an offset, and I can't think of an easy workaround for that. I can't picture a solution with RECSIZE either, but if you can post one, please share.

If the data is on more than one line, I would probably load it into a staging table, then use SQL to parse and insert, as shown below. Whether you choose to parse before loading using Perl or parse after loading using SQL depends on your preferences and what you are most familiar with. If it is a one-time thing, it is probably quicker to use whatever you know best. If it is to be run repeatedly or automated, then it would be worth testing and comparing both.

-- test.dat:
123HHH345KKK
678AAA901BBB234CCC567DDD


-- test.ctl
LOAD DATA
INFILE 'test.dat'
INTO TABLE stage_tab
FIELDS TERMINATED BY WHITESPACE
(all_in_one)


SCOTT@10gXE> CREATE TABLE stage_tab
  2    (all_in_one VARCHAR2 (4000))
  3  /

Table created.

SCOTT@10gXE> HOST SQLLDR scott/tiger CONTROL=test.ctl

SCOTT@10gXE> COLUMN all_in_one FORMAT A60
SCOTT@10gXE> SELECT * FROM stage_tab
  2  /

ALL_IN_ONE
------------------------------------------------------------
123HHH345KKK
678AAA901BBB234CCC567DDD

SCOTT@10gXE> CREATE TABLE test_tab
  2    (id   NUMBER,
  3  	name VARCHAR2 (3))
  4  /

Table created.

SCOTT@10gXE> INSERT INTO test_tab (id, name)
  2  SELECT SUBSTR (all_in_one, (t.rn * 6) + 1 , 3),
  3  	    SUBSTR (all_in_one, (t.rn * 6) + 4, 3)
  4  FROM   stage_tab,
  5  	    (SELECT ROWNUM - 1 rn
  6  	     FROM   DUAL
  7  	     CONNECT BY LEVEL <=
  8  		     (SELECT MAX (LENGTH (all_in_one)) / 6
  9  		      FROM   stage_tab)) t
 10  WHERE  SUBSTR (all_in_one, (t.rn * 6) + 1, 6) IS NOT NULL
 11  /

6 rows created.

SCOTT@10gXE> SELECT * FROM test_tab
  2  /

        ID NAM
---------- ---
       123 HHH
       678 AAA
       345 KKK
       901 BBB
       234 CCC
       567 DDD

6 rows selected.

SCOTT@10gXE> 

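For anyone reading along, the SUBSTR arithmetic in the INSERT above can be cross-checked with a short Python sketch (a hypothetical helper, not part of the load; its 6-character stride mirrors the rn*6+1 and rn*6+4 offsets in the SQL):

```python
# Mirror of the SQL parse: walk one staged row in 6-character steps,
# taking characters 1-3 as the numeric id and 4-6 as the name.
def parse_row(all_in_one, width=6):
    """Split one staged row into (id, name) pairs, as the INSERT does."""
    pairs = []
    for rn in range(len(all_in_one) // width):
        chunk = all_in_one[rn * width:(rn + 1) * width]
        pairs.append((int(chunk[:3]), chunk[3:]))
    return pairs
```

Running it on the first staged row gives the same pairs the query produces for that row.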

