Re: Loading words and SQL*Loader

From: DA Morgan <damorgan_at_psoug.org>
Date: Sun, 28 Jan 2007 09:31:05 -0800
Message-ID: <1170005461.929356_at_bubbleator.drizzle.com>


SQLJ wrote:
> Thank you everybody for your suggestions.
>
> Line1, line2, line3, ... are actually sequential numbers 1, 2, 3 and
> so on.
>
> I don't think it will make much of a difference if I use the utl_file
> package instead of SQL*Loader to read the lines, as I still need to parse
> every line and run a separate INSERT for every number. When the file is
> large (30000 to 50000 lines), the entire processing takes a while and is
> CPU intensive. I hoped that SQL*Loader could somehow do it more
> efficiently so I could avoid parsing lines in PL/SQL and running lots of
> INSERTs. If there is no other way, I guess I need to look for ways to
> optimize the inserts, maybe drop the indexes on the target table and
> recreate them after the load, or use nologging options.
>
> Jil
>
> On Jan 26, 12:16 pm, "Jimmu" <J..._at_cccd.edu> wrote:

>> Hi,
>> Why not use a simple PL/SQL script to read in, parse, and insert the
>> values instead of trying to shoehorn it into SQL*Loader?
>> Jim
>>
>> On Jan 25, 11:04 pm, "SQLJ" <sqlju..._at_gmail.com> wrote:
>>
>>> Hi,
>>> I have a text file which looks like this:
>>> Line1 Number1 Number2 Number3
>>> Line2 Number4
>>> Line3 Number5 Number6
>>> There could be any number of lines in the file, and each line can have
>>> a variable number of numbers. The numbers are separated by single
>>> spaces. The maximum number of numbers in a single line is probably
>>> less than 100, but this is not known for sure.
>>> Is there a way to use SQL*Loader to load each number as a separate
>>> value into a table having a structure like CREATE TABLE mytab (id int,
>>> some_number number)?
>>> The method I am using right now is loading the file lines into a
>>> staging table and then looping through all records and parsing them.
>>> With large data files this method appears to be very slow, and the
>>> parsing is CPU intensive. Are there any alternatives?
>>> Thanks,
>>> Jil
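
Below is a minimal sketch of the bulk PL/SQL route discussed above; it is not taken from the thread itself. It assumes the raw lines have already been loaded into a hypothetical staging table stage_lines with a single column line_text VARCHAR2(4000), that the first space-separated token on each line is the sequential id, that the target is the mytab table from the original post, and that the database is 10g or later so REGEXP_SUBSTR is available. Collecting the parsed values and inserting them with one FORALL avoids issuing a separate INSERT statement per number:

-- Sketch only: stage_lines(line_text VARCHAR2(4000)) is a hypothetical
-- staging table already populated with the raw file lines.
DECLARE
  TYPE t_ids  IS TABLE OF mytab.id%TYPE;
  TYPE t_nums IS TABLE OF mytab.some_number%TYPE;
  l_ids   t_ids  := t_ids();
  l_nums  t_nums := t_nums();
  l_id    mytab.id%TYPE;
  l_token VARCHAR2(100);
  l_pos   PLS_INTEGER;
BEGIN
  FOR r IN (SELECT line_text FROM stage_lines) LOOP
    -- first space-separated token on the line is the sequential id
    l_id  := TO_NUMBER(REGEXP_SUBSTR(r.line_text, '[^ ]+', 1, 1));
    l_pos := 2;
    LOOP
      -- n-th space-separated token; NULL once the line is exhausted
      l_token := REGEXP_SUBSTR(r.line_text, '[^ ]+', 1, l_pos);
      EXIT WHEN l_token IS NULL;
      l_ids.EXTEND;
      l_nums.EXTEND;
      l_ids(l_ids.LAST)   := l_id;
      l_nums(l_nums.LAST) := TO_NUMBER(l_token);
      l_pos := l_pos + 1;
    END LOOP;
  END LOOP;
  -- one bulk insert instead of one INSERT statement per number
  FORALL i IN 1 .. l_ids.COUNT
    INSERT INTO mytab (id, some_number) VALUES (l_ids(i), l_nums(i));
  COMMIT;
END;
/

For very large files the collections should be flushed with a FORALL every few thousand lines instead of being built for the whole file, and dropping the indexes on mytab before the load (as mentioned above) is an independent optimization.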

  1. Please don't top post. Scroll to the bottom to reply.
  2. 30-50K rows is nothing and should not take more than a second even if you ask it to play a bit of Bach. It is not uncommon to load millions and billions of rows.
  3. What you describe is possible with SQL*Loader but also possible with external tables: Consider both.

-- 
Daniel A. Morgan
University of Washington
damorgan_at_x.washington.edu (replace x with u to respond)
Puget Sound Oracle Users Group
www.psoug.org
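
A rough sketch of the external-table route in point 3 follows; it is not from the posts above, and the directory path, file name, and column sizes are made up. The idea is to let the database read the file in place as one VARCHAR2 column per line and have a single INSERT ... SELECT split the tokens in SQL (again assuming 10g or later for REGEXP_SUBSTR, and at most roughly 100 numbers per line):

-- Hypothetical directory object and file name
CREATE OR REPLACE DIRECTORY data_dir AS '/path/to/files';

CREATE TABLE ext_lines (
  line_text VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'      -- '|' never occurs, so each whole line lands in line_text
    MISSING FIELD VALUES ARE NULL
    (line_text CHAR(4000))
  )
  LOCATION ('numbers.txt')        -- hypothetical file name
)
REJECT LIMIT UNLIMITED;

-- First token of each line is the id, tokens 2..101 are the values.
INSERT /*+ APPEND */ INTO mytab (id, some_number)
SELECT TO_NUMBER(REGEXP_SUBSTR(e.line_text, '[^ ]+', 1, 1)),
       TO_NUMBER(REGEXP_SUBSTR(e.line_text, '[^ ]+', 1, t.pos))
FROM   ext_lines e,
       (SELECT LEVEL + 1 AS pos FROM dual CONNECT BY LEVEL <= 100) t
WHERE  REGEXP_SUBSTR(e.line_text, '[^ ]+', 1, t.pos) IS NOT NULL;

The row generator produces candidate token positions 2 through 101, and the WHERE clause discards positions beyond what a given line actually contains; raise the 100 if longer lines are possible. Dropping the APPEND hint gives a conventional insert if direct-path loading is not wanted.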
Received on Sun Jan 28 2007 - 18:31:05 CET
