Re: Select Distinct over Database Link
On 20 Aug 2002 04:41:20 -0700, gwhubert_at_hotmail.com (Gene Hubert) wrote:
It seems that your statement

insert into mylocaltable
select distinct mypkey from monster_at_mydblink;

first fetches all 500 million rows over the database link and only then does the DISTINCT sort on your local DB. Check the execution plan to confirm.
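On 8i you can check this with EXPLAIN PLAN and query PLAN_TABLE directly (DBMS_XPLAN only arrived later, in 9i). A minimal sketch, assuming PLAN_TABLE has been created with utlxplan.sql and using a made-up statement id; `monster_at_mydblink` here stands for `monster@mydblink` as elsewhere in this archive:

explain plan set statement_id = 'dblink_test' for
insert into mylocaltable
select distinct mypkey from monster_at_mydblink;

-- A SORT UNIQUE step on the local side sitting above a REMOTE
-- operation means every row crosses the link before the distinct.
select lpad(' ', 2 * (level - 1)) || operation || ' ' ||
       options || ' ' || object_name as plan_step
  from plan_table
 where statement_id = 'dblink_test'
 start with id = 0 and statement_id = 'dblink_test'
connect by prior id = parent_id and statement_id = 'dblink_test';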
Anyway, an easy solution is to create a PL/SQL block to do the job, like this:
declare
  cursor mycur is
    select distinct mypkey
    from monster_at_mydblink;
  COMMIT_FREQUENCY constant binary_integer := 100000;
  i binary_integer := 0;
begin
  for mycur_rec in mycur loop
    insert into mylocaltable values (mycur_rec.mypkey);
    i := i + 1;
    -- commit in batches to keep rollback segment usage small
    if mod(i, COMMIT_FREQUENCY) = 0 then
      commit;
    end if;
  end loop;
  commit;  -- pick up the final partial batch
end;
/
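If the fetch across the link is still the bottleneck, another approach is to do the DISTINCT on the remote side first, so only the roughly 1 million distinct keys cross the network instead of 500 million rows. This is only a sketch: it assumes you can create objects on the remote database, and `mypkey_stage` is a hypothetical staging table name:

-- Run this ON THE REMOTE database (where monster lives):
create table mypkey_stage as
  select distinct mypkey from monster;

-- Then, back on the local database, pull only the distinct keys:
insert into mylocaltable
  select mypkey from mypkey_stage_at_mydblink;
commit;

-- Clean up on the remote side when done:
drop table mypkey_stage;

The point of the staging table is that the sort runs where the data lives; the database link then carries only the already-deduplicated keys.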
>There are no indexes on the local table. I have tried "create table
>as select" with no improvement.
>
>Thanks,
>Gene Hubert
>
>TurkBear <jgreco1_at_mn.rr.com> wrote in message news:<s0n2muco83iqa72f38idma3fe6cfg96i3i_at_4ax.com>...
>> If you have any indexes on mylocaltable drop them before the insert..
>> have you tried
>> create table newlocaltable as select distinct mypkey from monster_at_mydblink; ?
>> Is it as fast as the select distinct mypkey from monster_at_mydblink;
>>
>> Just some thoughts...
>>
>> gwhubert_at_hotmail.com (Gene Hubert) wrote:
>>
>> >Oracle8i Enterprise Edition Release 8.1.7.2.0 - Production
>> >SunOS 5.6
>> >
>> >I'm having a problem with a query over a database link.
>> >If (in SQLPlus) I interactively do:
>> >
>> >select distinct mypkey from monster_at_mydblink;
>> >
>> >The result set (almost 1 million records) scrolls across my screen in
>> >about 15 minutes. monster_at_mydblink has a bit under 500 million
>> >records in it.
>> >
>> >Now if I do:
>> >
>> >insert into mylocaltable
>> >select distinct mypkey from monster_at_mydblink;
>> >
>> >The query hangs or runs out of temp space if I let it go long enough.
>> >I've tried the DRIVING_SITE hint but to no effect. It acts like all
>> >500 million records are going over the network to the database where I
>> >am logged on.
>> >
>> >I'm looking to get the latter query to run in some reasonable time,
>> >say 30 minutes more or less. Thanks for any assistance,
>> >
>> >Gene Hubert
>> >Durham, NC
Received on Tue Aug 20 2002 - 07:26:35 CDT