Bulk collect : impact in performance [message #317583]
Fri, 02 May 2008 01:50
hidnana
Messages: 87 Registered: November 2007 Location: chennai
Member
Is bulk collect a better option for selecting about 4.5 million records into an array?
Does it improve performance, or will it choke the server?
Is there a better way to do it?
I would also like to know the key advantages of using FORALL instead of FOR.
Re: Bulk collect : impact in performance [message #317587 is a reply to message #317583]
Fri, 02 May 2008 02:02
pablolee
Messages: 2882 Registered: May 2007 Location: Scotland
Senior Member
Quote: Is bulk collect a better option to select about 4.5 millions of records into an array.

Better than what?

Quote: Does it improve the performance or will it choke the server ?

Improve in comparison to what?

Quote: Is there a better way to do it ?

A better way to do what? (Please do not say "to load 4.5 million rows into an array"; I know that. WHY do you feel that you need to load 4.5 million rows into an array? What is it that you need to do once those values are IN the array? That's the important bit.)

Quote: Also want to know the key advantages in using FORALL instead of FOR.

You have been pointed to the documentation many times, so you know the link.
Re: Bulk collect : impact in performance [message #317590 is a reply to message #317587]
Fri, 02 May 2008 02:11
hidnana
Messages: 87 Registered: November 2007 Location: chennai
Member
I meant: is that better than processing the rows one by one in a loop?
This is a DWH (data warehouse) scenario: there is a huge table whose rows need to be processed, with calculations done on the columns of those rows. Hence the need to load 4.5 million rows into an array.
Re: Bulk collect : impact in performance [message #317863 is a reply to message #317600]
Sun, 04 May 2008 07:41
rleishman
Messages: 3728 Registered: October 2005 Location: Melbourne, Australia
Senior Member
A 4.5 million row array is going to take up a lot of memory - probably more than you have. When that happens, the array starts paging to disk. Then when you go to use the array, parts of it are on disk and parts are in memory. Worst case, you keep trying to access rows that are not in memory and end up doing a lot of I/O. In that case, you would have been much better off selecting rows from the table as you needed them.
A much better way is to BULK COLLECT rows (say) 1000 at a time, do whatever you need to do with them (process them, write them to another table, whatever) and repeat. This way, you never have more than 1000 rows in memory at a time.
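Roughly, the batching pattern described above looks like this. This is only a sketch: big_table, result_table, and the columns are made-up names for illustration, and the calculation is a placeholder for whatever processing you actually need.

```sql
DECLARE
  CURSOR c_src IS
    SELECT id, amount FROM big_table;        -- hypothetical source table

  TYPE t_row_tab IS TABLE OF c_src%ROWTYPE;
  l_rows t_row_tab;
BEGIN
  OPEN c_src;
  LOOP
    -- Fetch at most 1000 rows into memory at a time
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;

    -- Do the per-row calculations on this batch
    FOR i IN 1 .. l_rows.COUNT LOOP
      l_rows(i).amount := l_rows(i).amount * 1.1;   -- placeholder calculation
    END LOOP;

    -- FORALL sends the whole batch to SQL in one round trip,
    -- instead of one INSERT per row as a plain FOR loop would
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO result_table VALUES l_rows(i);    -- hypothetical target table
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```

Note the exit test is `l_rows.COUNT = 0` rather than `c_src%NOTFOUND`: with a LIMIT fetch, %NOTFOUND goes true on the final, partially filled batch, and testing it first would skip processing those last rows.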
See this article.
Ross Leishman