How to decrease time of inserting large data (Oracle 9i)
Re: How to decrease time of inserting large data [message #423901 is a reply to message #423896]
Tue, 29 September 2009 03:11   |
JRowbottom
Messages: 5933 Registered: June 2006 Location: Sunny North Yorkshire, ho...
Senior Member
Top causes I can think of, based on the total absence of information provided:
1) You've got triggers on that table that run for each row.
2) You're using a call to a function in the SQL.
3) You're doing the insert from a very poorly performing piece of SQL.
Re: How to decrease time of inserting large data [message #423903 is a reply to message #423902]
Tue, 29 September 2009 03:18   |
Frank
Messages: 7901 Registered: March 2000
Senior Member
One step back:
- Where do you get your data from? What is your source? Is it a temporary table (as in a "Global Temporary Table" in Oracle's definition)?
- Are you saying that moving the data from the temporary table to the staging tables takes 6 hours for 25 thousand rows?
- You are not running on any kind of weird, out-of-date hardware?
- Like JRowbottom said: check the tables for triggers (see the query below)
- Post your code!!
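To check for triggers, a quick query against the data dictionary is enough; YOUR_STAGING_TABLE below is just a placeholder for whatever your staging table is really called:
select trigger_name, status
from   user_triggers
where  table_name = 'YOUR_STAGING_TABLE';  -- placeholder: put the real table name here, in upper case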
Re: How to decrease time of inserting large data [message #423904 is a reply to message #423903]
Tue, 29 September 2009 03:26   |
carillpower
Messages: 48 Registered: April 2009 Location: malaysia
Member
create table A(
emplid varchar2(30),
last_name varchar2(30),
male_first_name varchar2(30),
female_first_name varchar2(30),
unisex_name varchar2(30))
tablespace ENROLL_DATA01
pctused 0
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 16K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
logging
nocache
noparallel
monitoring;
create unique index A_ind on A(emplid)
logging
tablespace ENROLL_DATA01
pctused 0
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 24K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
noparallel;
commit;
exec dbms_stats.gather_table_stats('BENAEADM','A',ESTIMATE_PERCENT=>dbms_stats.auto_sample_size,METHOD_OPT=>'for all columns size auto',CASCADE=>True);
@table_create.sql;
exec table_create(1,'AABERG', 'AARON', 'Aaliyah','Abie');
exec table_create(2,'AABY', 'Aaron', 'AARON','Adair' );
exec table_create(3,'AADLAND', 'AARON', 'ABBEY','Addison' );
exec table_create(4,'AAGAARD', 'Abdiel', 'Abbey','Adriel' );
exec table_create(5,'AAKRE', 'ABDUL', 'Abbie','Africa' );
exec table_create(6,'AALAND', 'Abdullah', 'ABBIE','Afton' );
exec table_create(7,'AALBERS', 'ABE', 'Abbigail','Aidan' );
exec table_create(8,'AALDERINK', 'Abel', 'Abby','Aiken' );
.
.
.
.
exec table_create(25902,'FLEAK', 'Darion', 'Sarai','Storm' );
P/S: table_create.sql compiles a procedure that has an insert statement based on the input
Thanks
Re: How to decrease time of inserting large data [message #423905 is a reply to message #423902]
Tue, 29 September 2009 03:35   |
JRowbottom
Messages: 5933 Registered: June 2006 Location: Sunny North Yorkshire, ho...
Senior Member
[quote]P/S: table_create.sql compiles a procedure that has an insert statement based on the input[/quote]
That's your problem right there. If this is compiling (or even just running) a PL/SQL block for each row, then that's going to slow things down a lot.
Can you show us the code from the script table_create.sql?
All you should need for this is:
insert into A values (1,'AABERG', 'AARON', 'Aaliyah','Abie');
insert into A values (2,'AABY', 'Aaron', 'AARON','Adair' );
insert into A values (3,'AADLAND', 'AARON', 'ABBEY','Addison' );
...
Re: How to decrease time of inserting large data [message #423908 is a reply to message #423907]
Tue, 29 September 2009 03:57   |
JRowbottom
Messages: 5933 Registered: June 2006 Location: Sunny North Yorkshire, ho...
Senior Member
It could be worse I suppose - you could be recompiling the procedure each time.
You don't need this procedure at all.
Try rewriting your script the way I suggested in my previous post - replace the calls to table_create with direct insert statements.
Additionally, as EMPLID is a value you set, you can leave the unique index off until after you've done the insert - if there are any duplicate EMPLID values, you can just renumber them.
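In other words, something along these lines (a sketch only, using the table, index, and tablespace names from your script):
-- do all the plain inserts first
insert into A values (1,'AABERG', 'AARON', 'Aaliyah','Abie');
-- ... and so on for the remaining rows ...
commit;
-- then build the unique index once, after the load
create unique index A_ind on A(emplid)
tablespace ENROLL_DATA01;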
Re: How to decrease time of inserting large data [message #423914 is a reply to message #423908]
Tue, 29 September 2009 04:16   |
carillpower
Messages: 48 Registered: April 2009 Location: malaysia
Member
Ok, now I can see the problem doesn't come from where I assumed. Let's say I've made changes to the code...
1- I insert data from another table into the staging table
2- I update the existing data by adding the other data from an outside source...
Like this...
create table A (
emplid varchar2(30),
last_name varchar2(30),
male_first_name varchar2(30),
female_first_name varchar2(30),
unisex_name varchar2(30))
tablespace ENROLL_DATA01
pctused 0
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 16K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
logging
nocache
noparallel
monitoring;
--alter table A nologging;
insert into A (emplid) select distinct(emplid) from ps_job@cbpsprdp;
create unique index A_ind on A(emplid)
logging
tablespace ENROLL_DATA01
pctused 0
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 24K
MINEXTENTS 1
MAXEXTENTS 2147483645
PCTINCREASE 0
BUFFER_POOL DEFAULT
)
noparallel;
commit;
exec dbms_stats.gather_table_stats('BENAEADM','A',ESTIMATE_PERCENT=>dbms_stats.auto_sample_size,METHOD_OPT=>'for all columns size auto',CASCADE=>True);
@table_create.sql;
exec table_create(1,'AABERG', 'AARON', 'Aaliyah','Abie');
exec table_create(2,'AABY', 'Aaron', 'AARON','Adair' );
exec table_create(3,'AADLAND', 'AARON', 'ABBEY','Addison' );
exec table_create(4,'AAGAARD', 'Abdiel', 'Abbey','Adriel' );
exec table_create(5,'AAKRE', 'ABDUL', 'Abbie','Africa' );
exec table_create(6,'AALAND', 'Abdullah', 'ABBIE','Afton' );
exec table_create(7,'AALBERS', 'ABE', 'Abbigail','Aidan' );
exec table_create(8,'AALDERINK', 'Abel', 'Abby','Aiken' );
exec table_create(9,'AALUND', 'ABEL', 'ABBY','Akia' );
exec table_create(10,'AAMODT', 'Abraham', 'Abigail','Akiva' );
exec table_create(11,'AAMOT', 'ABRAHAM', 'ABIGAIL','Aldis' );
.
.
.
exec table_create(25902,'FLEAK', 'Darion', 'Sarai','Storm' );
P/S: table_create.sql is a procedure like this:
create or replace
PROCEDURE table_create(num_row number, l_name VARCHAR, m_fname VARCHAR2, f_fname VARCHAR2, u_fname varchar2) IS
  v_row A%ROWTYPE;
BEGIN
  v_row.last_name         := l_name;
  v_row.male_first_name   := m_fname;
  v_row.female_first_name := f_fname;
  v_row.unisex_name       := u_fname;
  update A
  set    last_name         = l_name,
         male_first_name   = m_fname,
         female_first_name = f_fname,
         unisex_name       = u_fname
  where  emplid = (select emplid from
                    (select row_number() over (order by emplid) as n, emplid
                     from A)
                   where n = num_row);
END table_create;
/
Do you have any suggestions for changes I could make to improve this?
Thanks a lot anyway
Re: How to decrease time of inserting large data [message #423916 is a reply to message #423896]
Tue, 29 September 2009 04:24   |
cookiemonster
Messages: 13963 Registered: September 2008 Location: Rainy Manchester
Senior Member
Using staging tables rarely improves performance.
I suspect you want a single insert/select or possibly a CTAS (create table as select) but since you haven't supplied the full code it's impossible to tell for sure.
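For example, the create-and-load could be collapsed into one CTAS along these lines (a sketch only, reusing the source table and db link from your script, so check it against your real requirements):
create table A
tablespace ENROLL_DATA01
as
select emplid,
       cast(null as varchar2(30)) as last_name,
       cast(null as varchar2(30)) as male_first_name,
       cast(null as varchar2(30)) as female_first_name,
       cast(null as varchar2(30)) as unisex_name
from  (select distinct emplid from ps_job@cbpsprdp);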
And next time you post code can you please use code tags - see the orafaq forum guide if you're not sure how.
Re: How to decrease time of inserting large data [message #423928 is a reply to message #423914]
Tue, 29 September 2009 05:34   |
JRowbottom
Messages: 5933 Registered: June 2006 Location: Sunny North Yorkshire, ho...
Senior Member
Yes, and this is the third (and last) time I'm going to tell you.
Stop calling this table_create procedure once per row, and instead just write a straightforward insert statement.
I.e. replace all the lines like this:
exec table_create(1,'AABERG', 'AARON', 'Aaliyah','Abie');
with lines like this:
insert into A values (1,'AABERG', 'AARON', 'Aaliyah','Abie');
Then go back and fill in any missing records with blank ids.
Or, if you do want to pre-populate the table with ids, then use update statements, and replace the calls to table_create with:
update A
set last_name = 'AABERG'
,male_first_name = 'AARON'
,female_first_name = 'Aaliyah'
,unisex_name = 'Abie'
where emplid = 1;
Just stop making these calls to the procedure.
Re: How to decrease time of inserting large data [message #424615 is a reply to message #423928]
Sun, 04 October 2009 20:36   |
carillpower
Messages: 48 Registered: April 2009 Location: malaysia
Member
Hi,
Thanks for your suggestion...
I've changed my code to this:
set serveroutput on size 1000000
set echo off
set head off
set feed off
set pause off
--drop table scrambled_chubb_empl_tbl;
select 'Process starts at '||to_char(sysdate,'MM-DD-YYYY HH:Mi:SS') from dual ;
create table scrambled_chubb_empl_tbl (
num_id number(30),
emplid varchar2(30),
last_name varchar2(30),
male_first_name varchar2(30),
female_first_name varchar2(30),
unisex_name varchar2(30))
tablespace ENROLL_DATA01
pctused 0
pctfree 10
initrans 1
maxtrans 255
storage (
INITIAL 32K
)
logging
nocache
noparallel
monitoring;
insert into scrambled_chubb_empl_tbl (num_id, last_name, male_first_name, female_first_name, unisex_name) values (1,'AABERG', 'AARON', 'Aaliyah','Abie');
insert into scrambled_chubb_empl_tbl (num_id, last_name, male_first_name, female_first_name, unisex_name) values (2,'AABY', 'Aaron', 'AARON','Adair' );
insert into scrambled_chubb_empl_tbl (num_id, last_name, male_first_name, female_first_name, unisex_name) values (3,'AADLAND', 'AARON', 'ABBEY','Addison' );
.
.
.
insert into scrambled_chubb_empl_tbl (num_id, last_name, male_first_name, female_first_name, unisex_name) values (26311,'FORCINO', 'Genaro', 'TANGELA', 'Keran' );
commit;
@table_create.sql;
exec table_create;
In the table_create procedure, it is like this:
create or replace
PROCEDURE table_create IS
  cursor c1 is
    select count(*) num_all from scrambled_chubb_empl_tbl;
  rec     c1%rowtype;
  num_row number(30) := 1;
  num_com number(30) := 4000;
BEGIN
  open c1;
  fetch c1 into rec;
  while num_row <= rec.num_all loop
    -- pick the num_row-th distinct emplid (ordered by emplid) from ps_job over the db link
    update scrambled_chubb_empl_tbl
    set    emplid = (select distinct emplid from (
                       select emplid,
                              count(distinct emplid) c_emp,
                              row_number() over (order by emplid) as n
                       from   ps_job@cbpsprdp
                       group by emplid)
                     where n = num_row)
    where  num_id = num_row;
    num_row := num_row + 1;
    -- commit every 4000 rows
    if num_row = num_com then
      commit;
      num_com := num_com + 4000;
    end if;
  end loop;
  close c1;
  dbms_output.put_line(num_row - 1 || ' Data have been updated');
END table_create;
/
The problem now is... when it moves to the procedure step, it shows an error about a dbms_lock issue... is there any suggestion for this?
Thanks a lot
Re: How to decrease time of inserting large data [message #424658 is a reply to message #423896]
Mon, 05 October 2009 03:56  |
cookiemonster
Messages: 13963 Registered: September 2008 Location: Rainy Manchester
Senior Member
1) Your code would be a lot more readable if you used code tags - see the orafaq forum guide if you're not sure how.
2) When you get errors it helps to copy and paste the full error message and number.
3) Nothing in the code uses dbms_lock. So if you're getting an error on that then there is some other code causing it.
4) You'd be better off getting rid of the loops and count(*)s and doing a single update statement that updates all the rows.
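For point 4, a single statement along these lines is the sort of thing meant (a sketch only, based on the column names and source in the code you posted, so check it against your real requirements):
update scrambled_chubb_empl_tbl t
set    t.emplid = (select e.emplid
                   from  (select emplid,
                                 row_number() over (order by emplid) as n
                          from  (select distinct emplid
                                 from   ps_job@cbpsprdp)) e
                   where  e.n = t.num_id);
commit;
One statement and one commit at the end replaces the whole cursor loop and the per-row commit logic.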