Dynamic Merge (10.2.0.4.0)
Dynamic Merge [message #443286]
Sun, 14 February 2010 22:49
maheshmhs (Member)
Messages: 93 Registered: October 2008 Location: Nepal
Hi all.
I have to create a table with around 450 columns. However, the 450 column names are not static: the names and data types are stored in another table, so I have to create the columns by reading values from that table. This is not a problem, as it can be done using a cursor.
The problem is that I have to populate this table from another table which contains around 3 million records, and different logic is applied for each column. Since there are 450 columns, simply populating it didn't work; the statement never finished executing and had to be interrupted. I then tried creating several tables with around 90 columns each and merging them into the final table. Again, the merge didn't complete and had to be stopped.
Is there any other way of doing this? I have run out of ideas.
Any hints would be highly useful.
Thank You
--maheshmhs
Re: Dynamic Merge [message #443299 is a reply to message #443287]
Mon, 15 February 2010 00:44
maheshmhs (Member)
Messages: 93 Registered: October 2008 Location: Nepal
Thank you BlackSwan for your willingness to help.
But I must confess that the design is out of my scope: I have been asked to find the best way of doing this without altering the final structure of the table.
Also, the table is already in third normal form. A single column is the primary key, and all columns depend on the primary key only.
Any other hints are highly appreciated.
Thanks
Re: Dynamic Merge [message #443352 is a reply to message #443286]
Mon, 15 February 2010 04:42
JRowbottom (Senior Member)
Messages: 5933 Registered: June 2006 Location: Sunny North Yorkshire, ho...
It's possible that there are performance improvements that could be made in your code, but we can't know that, because we don't have your code.
On the other hand, if you're reading 3,000,000 records, and doing 450 calculations per record, then that's 1,350,000,000 calculations.
If the average calculation can be done in 1/1000 of a second then you're still looking at 375 hours, not including the time taken to do the inserts.
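That estimate is easy to verify. A quick sanity check of the arithmetic (my own illustration, not part of the original posts):

```sql
-- 3,000,000 rows x 450 calculations, at roughly 1 ms each:
select 3000000 * 450               as total_calcs,    -- 1,350,000,000
       3000000 * 450 / 1000        as total_seconds,  -- 1,350,000
       3000000 * 450 / 1000 / 3600 as total_hours     -- 375
from dual;
```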
Re: Dynamic Merge [message #443367 is a reply to message #443286]
Mon, 15 February 2010 06:08
maheshmhs (Member)
Messages: 93 Registered: October 2008 Location: Nepal
Hello!
Let's say below is the source table:
CREATE TABLE zzz_main_src_test
(
  nmemid    VARCHAR2(10),
  colnum    VARCHAR2(10),
  col_value VARCHAR2(10)
);
Now it is populated as below:
BEGIN
  FOR a IN 1..6667 LOOP
    FOR b IN 1..450 LOOP
      EXECUTE IMMEDIATE 'Insert into zzz_main_src_test(nmemid, colnum, col_value) values('''||a||''','''||b||''','||Round(Dbms_Random.value()*100,4)||')';
    END LOOP;
    COMMIT;
  END LOOP;
END;
/
The final table is created as:
DECLARE
  CURSOR cur_collist IS
    SELECT DISTINCT 'C_'||colnum AS colname FROM zzz_main_src_test;
BEGIN
  BEGIN
    EXECUTE IMMEDIATE 'DROP TABLE zzz_450col_test PURGE';
  EXCEPTION
    WHEN OTHERS THEN NULL;
  END;
  EXECUTE IMMEDIATE 'CREATE TABLE zzz_450col_test (nmemid VARCHAR2(10))';
  FOR a IN cur_collist LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE zzz_450col_test ADD '||a.colname||' VARCHAR2(10)';
  END LOOP;
END;
/
Now I have to populate zzz_450col_test using col_value of zzz_main_src_test based on colnum. So there are going to be 6667 rows in zzz_450col_test.
This is just sample data.
Any suggestions?
Re: Dynamic Merge [message #443383 is a reply to message #443286]
Mon, 15 February 2010 07:28
cookiemonster (Senior Member)
Messages: 13964 Registered: September 2008 Location: Rainy Manchester
Not really, since it only shows the first part of what you are doing.
It shows how you generate the table structure, but gives us no clue as to how you are then populating it with data.
Also, we have no idea what you are trying to achieve with this, and without that information there really isn't anything we can suggest to improve things.
As JRowbottom observed above, the amount of work you say you need to do is always going to take ages, even if you were running it on the most powerful server known to man.
I wouldn't be looking for ways to speed up the creation/population of this table; I'd be looking for ways to bypass having to create it at all. But unless you give us a lot more detail on why you are trying to do this, neither I nor anyone else can give you any advice on how to achieve it.
Re: Dynamic Merge [message #443384 is a reply to message #443375]
Mon, 15 February 2010 07:32
Alessandro Rossi (Senior Member)
Messages: 166 Registered: September 2008 Location: Rome
There's no way to grasp what you'd like to do with that script; the problem has not yet been explained.
To explain a problem, you should cover all of the following:
- What you have as the starting environment: describe the entities and relations involved in your problem.
- What data model you use: describe the tables you used to represent that environment, and their columns.
- What you want as the output: a table, a string, a particular report, or whatever you want to have after the elaboration of the solution.
If you follow these requirements, any reader can easily tell whether they are able to find the solution for you.
Bye Alessandro
[Updated on: Mon, 15 February 2010 07:33]
Re: Dynamic Merge [message #443406 is a reply to message #443401]
Mon, 15 February 2010 09:10
cookiemonster (Senior Member)
Messages: 13964 Registered: September 2008 Location: Rainy Manchester
BlackSwan wrote on Mon, 15 February 2010 14:50:
> EXECUTE IMMEDIATE 'Insert into zzz_main_src_test(nmemid, colnum, col_value) values('''||a||''','''||b||''','||Round(Dbms_Random.value()*100,4)||')';
> Why do you make a table with 450 columns & then only INSERT 3 columns into each/every row?
If you re-read the example code, you'll notice that table only has three columns. It's being used to store the column list for the other table (zzz_450col_test), which does have 450 columns (451, actually).
Re: Dynamic Merge [message #443423 is a reply to message #443367]
Mon, 15 February 2010 11:35
Alessandro Rossi (Senior Member)
Messages: 166 Registered: September 2008 Location: Rome
maheshmhs wrote on Mon, 15 February 2010 13:08:
> Now I have to populate zzz_450col_test using col_value of zzz_main_src_test based on colnum. So there are going to be 6667 rows in zzz_450col_test.
> Any suggestions?
So you're looking for a pivot query!
Oracle 11g has a dedicated PIVOT clause for this; on 10g and 9i it's possible to do the same task in a few steps.
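For reference, on 11g the built-in clause would look roughly like this against the vals table defined in the example data below (a sketch only: it is not available on the OP's 10.2.0.4, and the IN list still has to be written out or generated for all 450 columns):

```sql
-- 11g-only sketch: pivot three of the 450 columns explicitly
select *
from vals
pivot (
  max(val) for col_name in ('C001' as c001, 'C002' as c002, 'C003' as c003)
)
order by id;
```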
Suppose your data is as follows:
create table keys as
select rownum as id
from dual
connect by rownum <= 6666
/
create table cols as
select 'C'||lpad(rownum,3,'0') as col_name
from dual
connect by rownum <= 450
/
create table vals as
select id,col_name,dbms_random.value as val
from keys
cross join cols
/
create index vals_dx on vals (
id,col_name
)
/
You now need a query to pivot the data in table vals across the columns described in table cols; the index on vals can be useful during the population phase.
To do so you also need a method for string aggregation. In this example I opted for the collection-based one, which uses the following collection type and function.
create or replace type string_table as table of varchar2(4000)
/
create or replace function tab2clob (
  tab          in string_table,
  delimitatore in varchar2
) return clob
as
  out_string clob;
begin
  if ( cardinality(tab) > 0 ) then
    for i in tab.first .. tab.last loop
      if ( i != tab.first ) then
        if ( delimitatore is not null ) then
          dbms_lob.writeappend(out_string, length(delimitatore), delimitatore);
        end if;
      else
        -- first element: allocate the temporary CLOB
        dbms_lob.createtemporary(out_string, true);
      end if;
      if ( tab(i) is not null ) then
        dbms_lob.writeappend(out_string, length(tab(i)), tab(i));
      end if;
    end loop;
    return out_string;
  else
    return null;
  end if;
end;
/
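A quick check of the helper (assuming the type and function above compiled cleanly), concatenating three names with a comma delimiter:

```sql
select tab2clob(string_table('C001','C002','C003'), ', ') as csv
from dual;
-- CSV
-- ----------------
-- C001, C002, C003
```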
Now a pivot query can be created in two ways.
select /*with a lot of RAM for SORT_AREA this would be really efficient*/
'select id,
'||tab2clob(
cast(collect(
' max(decode(col_name,'''||col_name||''',val)) as '||col_name
) as string_table),
',
'
)||'
from vals
group by id'
from cols
/
select /*Immediately fetches the first rows and doesn't require too much RAM to be efficient*/
'select id,
'||tab2clob(
cast(collect(
' (select val from vals v where v.id=k.id and v.col_name='''||col_name||''') as '||col_name
) as string_table),
',
'
)||'
from keys k
group by id'
from cols
/
Which outputs something like:
select id,
max(decode(col_name,'C001',val)) as C001,
max(decode(col_name,'C002',val)) as C002,
max(decode(col_name,'C003',val)) as C003,
...
...
...
max(decode(col_name,'C449',val)) as C449,
max(decode(col_name,'C450',val)) as C450
from vals
group by id
/
Or
select id,
(select val from vals v where v.id=k.id and v.col_name='C001') as C001,
(select val from vals v where v.id=k.id and v.col_name='C002') as C002,
(select val from vals v where v.id=k.id and v.col_name='C003') as C003,
...
...
...
(select val from vals v where v.id=k.id and v.col_name='C449') as C449,
(select val from vals v where v.id=k.id and v.col_name='C450') as C450
from keys k
group by id
/
Your 450-column table can then be created with the "create table as select" method using one of these queries. Obviously it will take a long time, but there is a chance of creating a table of 3,000,000 records within a day on a decent system.
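Wiring the generated text into a CTAS would look something like this (a sketch: the target name cols_450 is my own, and the column list shown is abbreviated; in practice you would spool the full statement produced by one of the generator queries above):

```sql
create table cols_450 as
select id,
       max(decode(col_name, 'C001', val)) as c001,
       max(decode(col_name, 'C002', val)) as c002
       -- ... the remaining generated column expressions ...
from vals
group by id;
```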
Bye Alessandro
Re: Dynamic Merge [message #443456 is a reply to message #443286]
Mon, 15 February 2010 22:18
maheshmhs (Member)
Messages: 93 Registered: October 2008 Location: Nepal
Thank you all!
Thank you, Alessandro Rossi!
Admittedly, the problem I posted isn't the actual problem I had to solve; I couldn't post the real one due to its complexity and some security issues.
But the solution I got from Alessandro Rossi is sufficient for me. Thank you for your effort in writing down the code.
And you know what, it took only 160.738 seconds on my machine for the final CTAS to complete.
Thank you again.