Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: Automate Load of Multiple csv files

From: April <PrivateBenjamin_at_hushmail.com>
Date: 4 Mar 2004 12:08:17 -0800
Message-ID: <54df0379.0403041208.4c16c8d1@posting.google.com>


"Niall Litchfield" <n-litchfield_at_audit-commission.gov.uk> wrote in message news:<4045fce1$0$19223$ed9e5944_at_reading.news.pipex.net>...
> "April" <PrivateBenjamin_at_hushmail.com> wrote in message
> news:54df0379.0403030727.189dcd34_at_posting.google.com...
> > Hi Daniel,
> >
> > I have got to the point where I have the oracle function working and
> > scheduled as a job. But I have the source files located on the
> > database server. The next step is to move them to a mapped network
> > share. I understand that you can use a UNC name in utl_file_dir for
> > the location of the files, but where do you specify the NT userid and
> > password that will be required to access the share?
> >
> > Some people have been suggesting that I use NT scheduler and create a
> > batch file that maps the network drive and invokes sqlplus to execute
> > the stored procedure. Shouldn't dbms_jobs in combination with
> > utl_file_dir accomplish the same task? (Oracle 7.3.4)
> >
> > By the way I will be upgrading to 9i, but may not be able to
> > coordinate the upgrade with the requirement for a scheduled job. So I
> > am currently stuck with 7.3.4.
> >
> > Thanks a lot for your help!
>
>
> Hi
>
> UTL_FILE executes in the security context of the user that runs the Oracle
> software - this will almost certainly be local system and thus will not have
> network access. I think you should be thinking in terms of an OS scheduled
> task to copy and optionally rename the files and an oracle scheduled job to
> load them.
>
> There is an article by Norman Dunbar on
> http://www.jlcomp.demon.co.uk/faq/utl_file.html which covers exactly this
> issue, but I disagree strongly with the suggestion to change the security
> context of the Oracle service that it contains (sorry Norm).

Hi,

Thanks to everyone for their help!

I took Niall's advice and scheduled an NT job that ftps the files to the database server, and Daniel's advice to use a header file listing the files to process (the header file is created as part of the ftp job). The stored procedure uses utl_file to read the header file and to open each listed file, then dbms_sql to execute the DML statements that file contains.

For anyone who may need a process like this, here is the stored procedure.

CREATE OR REPLACE PROCEDURE test_file_read
-- PURPOSE: open text files containing DML statements and execute those
--          statements on the server.
-- METHOD:  UTL_FILE reads a header file listing the files to process and
--          each listed file; DBMS_SQL runs the dynamic SQL statements.
--          The files are ftp'd to the db server from a network share by
--          an NT scheduled job, which also creates the header file.
is
    v_headerfile    utl_file.file_type;
    v_filetoprocess utl_file.file_type;
    v_data          varchar2(1022);
    v_dir           varchar2(100);
    v_cursorid      number;
    v_filename      varchar2(1022);
    v_result        integer;
begin
    v_dir := 'D:\Ecms';

    -- 'header.txt' is a placeholder name for the header file
    v_headerfile := utl_file.fopen(v_dir, 'header.txt', 'r');
    loop
        begin
            utl_file.get_line(v_headerfile, v_filename);
        exception when no_data_found then exit;  -- end of header file
        end;

        -- open the listed file and execute each DML statement in it
        v_filetoprocess := utl_file.fopen(v_dir, v_filename, 'r');
        loop
            begin
                utl_file.get_line(v_filetoprocess, v_data);
            exception when no_data_found then exit;  -- end of this file
            end;
            v_cursorid := dbms_sql.open_cursor;
            dbms_sql.parse(v_cursorid, v_data, dbms_sql.native);
            v_result := dbms_sql.execute(v_cursorid);
            dbms_sql.close_cursor(v_cursorid);
        end loop;
        utl_file.fclose(v_filetoprocess);
    end loop;
    utl_file.fclose(v_headerfile);
end test_file_read;
/
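For anyone wondering how the "scheduled as a job" part mentioned earlier works, a minimal sketch using dbms_job (available on 7.3) is below; the start date and the daily interval are placeholders to adjust for your own schedule:

```sql
-- Hypothetical scheduling sketch: submit test_file_read as a database job
-- that runs once a day. sysdate and 'sysdate + 1' are placeholder values.
declare
    v_job number;  -- job id assigned by dbms_job
begin
    dbms_job.submit(job       => v_job,
                    what      => 'test_file_read;',
                    next_date => sysdate,
                    interval  => 'sysdate + 1');
    commit;  -- the job is not visible to the job queue until commit
end;
/
```

Note that job_queue_processes must be set to a non-zero value for the job queue to run anything.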

Received on Thu Mar 04 2004 - 14:08:17 CST

