Data Purging [message #576311] Sun, 03 February 2013 02:00
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
Hi,
I have a list of 500 tables and I want to delete data from them based on a condition: data from before 2008 needs to be deleted. Each table has a column on which the deletion is based. Can you please provide code which does this efficiently and fast? Bulk collect is preferable.

Thanks,
VEDHA
Re: Data Purging [message #576312 is a reply to message #576311] Sun, 03 February 2013 02:35
Littlefoot
Messages: 18826
Registered: June 2005
Location: Croatia, Europe
Senior Member
Account Moderator
DELETE FROM table_name WHERE date_column < DATE '2008-01-01'
seems to be the most efficient. As you probably don't want to type (or copy/paste) the same statement 500 times, write SQL which will create that script for you, such as
select 'delete from ' || table_name || ' where date_column < date ''2008-01-01'';' from user_tables;

That would be OK if all tables contain the same DATE_COLUMN. If its name differs from table to table, then you need to develop it a little bit further.
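For example, assuming the candidate column names are known in advance and are genuine DATE columns (DATE_COLUMN and CREATED_DT below are only placeholders), USER_TAB_COLUMNS can drive the script generation:
select 'delete from ' || c.table_name || ' where ' || c.column_name
       || ' < date ''2008-01-01'';'
from   user_tab_columns c
where  c.column_name in ('DATE_COLUMN', 'CREATED_DT')   -- substitute your real column names
and    c.data_type = 'DATE';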

Also, foreign key constraints might complicate the whole process a lot.
Re: Data Purging [message #576313 is a reply to message #576311] Sun, 03 February 2013 03:01
Michel Cadot
Messages: 57611
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Or in PL/SQL:
begin
  for rec in (select table_name from user_tables) loop
    execute immediate 'delete from ' || rec.table_name || ' where date_column < date ''2008-01-01''';
  end loop;
end;
/

With the same remarks as Littlefoot's.

Regards
Michel
Re: Data Purging [message #576328 is a reply to message #576313] Sun, 03 February 2013 08:54
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
Thank you for your reply, but a normal delete will take a very long time since I have 90 million records in each table. Can you give me the code using bulk collect, please?

Re: Data Purging [message #576329 is a reply to message #576328] Sun, 03 February 2013 08:59
BlackSwan
Messages: 21945
Registered: January 2009
Senior Member
> Can you give me the code using bulk collect pls.
PL/SQL will never be faster than plain SQL.
Re: Data Purging [message #576330 is a reply to message #576329] Sun, 03 February 2013 09:01
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
>PL/SQL will never be faster than plain SQL.
Yes, that is OK, but I have to write it using bulk collect. Please help me, friend. Expecting a reply from you.
Re: Data Purging [message #576331 is a reply to message #576313] Sun, 03 February 2013 09:11
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
Can you please give me the code using bulk collect?
Re: Data Purging [message #576332 is a reply to message #576330] Sun, 03 February 2013 09:12
BlackSwan
Messages: 21945
Registered: January 2009
Senior Member
>That is OK but i have to write it using Bulk collect.
explain why you "have to write it using Bulk collect".
You should write the code that is most efficient, which is NOT BULK COLLECT.
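If you still want to benchmark a bulk collect version against the plain DELETE, a rough sketch might look like the following (T and DT are placeholder table and column names; it does the same work as the single statement, just batch by batch, so expect it to be slower):
declare
  type t_rowid_tab is table of rowid;
  l_rowids t_rowid_tab;
  cursor c is
    select rowid from t where dt < date '2008-01-01';
begin
  open c;
  loop
    fetch c bulk collect into l_rowids limit 10000;   -- fetch a batch of rowids
    exit when l_rowids.count = 0;
    forall i in 1 .. l_rowids.count                   -- delete the whole batch in one call
      delete from t where rowid = l_rowids(i);
    commit;   -- committing across the open cursor risks ORA-01555 on a busy table
  end loop;
  close c;
end;
/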
Re: Data Purging [message #576333 is a reply to message #576332] Sun, 03 February 2013 09:15
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
Hmm, I agree with you. I need to explain to some people the difference between efficient and non-efficient (bulk collect) code. Can you provide me with both the efficient way and the non-efficient (bulk collect) way, so that I can compare them and submit my test results? Please do me this favour.

Re: Data Purging [message #576334 is a reply to message #576332] Sun, 03 February 2013 09:17
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
Also, when I ran a simple plain delete statement to delete 90 million records, it took 2 hours. I have 500 tables, so this will not work, right?
Re: Data Purging [message #576335 is a reply to message #576333] Sun, 03 February 2013 09:17
BlackSwan
Messages: 21945
Registered: January 2009
Senior Member
NICE TRY.


Please read and follow the forum guidelines, to enable us to help you:

http://www.orafaq.com/forum/t/88153/0/
Re: Data Purging [message #576336 is a reply to message #576335] Sun, 03 February 2013 09:21
vedha_wipro
Messages: 7
Registered: January 2012
Location: BANGALORE
Junior Member
What do you mean by "nice try"? I have asked you a question and I haven't got a solution from you yet. What data do I need to provide? Please give me the most efficient code you believe will solve my problem; leave bulk collect aside.
Re: Data Purging [message #576337 is a reply to message #576336] Sun, 03 February 2013 09:24
BlackSwan
Messages: 21945
Registered: January 2009
Senior Member
try #7 from Posting Guidelines
Re: Data Purging [message #576380 is a reply to message #576337] Mon, 04 February 2013 05:12
sss111ind
Messages: 427
Registered: April 2012
Location: India
Senior Member

http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:5033906925164
Re: Data Purging [message #576628 is a reply to message #576311] Wed, 06 February 2013 13:53
alan.kendall@nfl.com
Messages: 163
Registered: June 2012
Location: Culver City, California
Senior Member
I have been using the following SQL to do slow purges of 500,000 rows per night on large tables.
ENWEBS1S > select count(*) from site.email_audit;

  COUNT(*)
----------
    461008

ENWEBS1S > @CURSOR_DELETE_PURGE_SITE_EMAIL_AUDIT.sql
100000 lines deleted
100000 lines deleted
100000 lines deleted
100000 lines deleted
460912 rows attempted to delete.
0 errors.
460912 lines deleted.

PL/SQL procedure successfully completed.

ENWEBS1S > select count(*) from site.email_audit;

  COUNT(*)
----------
        96

The full SQL follows.
set termout on
SET SERVEROUTPUT ON SIZE 1000000 FORMAT WRAPPED
DECLARE
   v_sqlerrm number;
   ncount    number;
   nerrcount number;
   nrows     number;
   CURSOR C1 IS
      select rowid
      from   SITE.EMAIL_AUDIT
      where  status = 'SENT'
      and    uuid is not null
      and    date_requested < sysdate - 3
      and    rownum < 500001;
BEGIN
   ncount    := 0;
   nerrcount := 0;
   nrows     := 0;
   FOR III IN C1 LOOP
      ncount := ncount + 1;
      /* DBMS_OUTPUT.PUT_LINE(to_char(III.rowid)) */
      begin
         delete from SITE.EMAIL_AUDIT where rowid = III.rowid;
      EXCEPTION                 -- exception handlers begin
         WHEN ZERO_DIVIDE THEN  -- handles 'division by zero' error
            dbms_output.put_line('zero division.');
         WHEN OTHERS THEN       -- handles all other errors
            nerrcount := nerrcount + 1;
            if trunc(nerrcount/100000) = nerrcount/100000 then
               dbms_output.put_line('100000 errors');
            END IF;
      end;
      if trunc(ncount/10) = ncount/10 then
         commit;                /* you can comment out the commits to do some tests then rollback */
      END IF;
      if trunc(ncount/100000) = ncount/100000 then
         DBMS_OUTPUT.PUT_LINE('100000 lines deleted');
      END IF;
   END LOOP;
   commit;                      /* you can comment out the commits to do some tests then rollback */
   dbms_output.put_line(to_char(ncount)    || ' rows attempted to delete.');
   dbms_output.put_line(to_char(nerrcount) || ' errors.');
   nrows := ncount - nerrcount;
   dbms_output.put_line(to_char(nrows)     || ' lines deleted.');
END;
/
Re: Data Purging [message #576635 is a reply to message #576628] Wed, 06 February 2013 14:39
Michel Cadot
Messages: 57611
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Note the WHEN OTHERS handler: it silently swallows every error, which hides real problems.

Regards
Michel
Re: Data Purging [message #576753 is a reply to message #576635] Thu, 07 February 2013 14:01
Bill B
Messages: 1049
Registered: December 2004
Senior Member
Using a loop is the slowest possible way to delete the rows. The fastest is

delete from SITE.EMAIL_AUDIT where status='SENT' and uuid is not null and date_requested < sysdate-3;
commit;
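
If a single huge DELETE holds locks and undo for too long, a chunked variant is sometimes used as a compromise; this is only a sketch of that idea, not something tested in this thread:
begin
  loop
    delete from SITE.EMAIL_AUDIT
    where  status = 'SENT'
    and    uuid is not null
    and    date_requested < sysdate - 3
    and    rownum <= 10000;          -- delete at most 10000 rows per pass
    exit when sql%rowcount = 0;
    commit;                          -- release locks and undo between batches
  end loop;
  commit;
end;
/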
Re: Data Purging [message #576758 is a reply to message #576753] Thu, 07 February 2013 14:31
alan.kendall@nfl.com
Messages: 163
Registered: June 2012
Location: Culver City, California
Senior Member
It is the slowest, but it causes the fewest locks on tables that are being updated frequently. I can run the following script in another session, and if locking occurs, I see it and kill my purging process. I can run the deletes during the day and permanently schedule them at night with no worries. I used to delete and commit 1000 rows at a time but had locking, so I reduced the deletes to 100 rows, still had locking, so now I delete 10 rows at a time and the locking went away.
ENWEBP1P > @active
 SID SERIAL# MACHINE      LOGIN           DB     SEC_WAIT USERNAME  EVENT
---- ------- ------------ --------------- ------ -------- --------- ------------------
1954    4761 US\KENDALLA- 07-FEB-13 12:11 NWEBP2        0 ENWEBP2P  gc current request
1832   14525 prodapp04    06-FEB-13 21:39 NWEBP4        0 SITE_USER SQL*Net message to
1010     517 US\KENDALLA- 07-FEB-13 12:09 NWEBP3        0 ENWEBP3P  gc current request

The full SQL follows:
set lines 200
set pages 50
set trunc off
set wrap off
column username print
column username format a11
column program format a60
column machine format a18
column sid format 999999
col sec_wait format 9999999
column serial# format 999999
column login format a15
col db for a10
col sid print
col serial# print
SELECT sid,
       serial#,
       machine,
       To_char(s.logon_time, 'DD-MON-RR HH24:MI') login,
       i.instance_name    db,
       s.seconds_in_wait   sec_wait,
       s.username,
       s.event,
       s.status,
       s.program,
       s.machine,
       s.MODULE,
       s.terminal
FROM   gv$session s,
       gv$instance i
WHERE  i.inst_id = s.inst_id
       AND s.status = 'ACTIVE'
       AND s.username IS NOT NULL
       AND s.wait_class <> 'Idle'
--and seconds_in_wait>0
ORDER  BY seconds_in_wait
/ 

I find being able to do purges on critical tables during normal working hours to be a big benefit.
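
For what it's worth, on 10g and later gv$session also exposes BLOCKING_SESSION, so a narrower check that lists only the sessions that are actually blocked right now could look like this (my own addition, not part of the script above):
select inst_id, sid, serial#, username, event,
       blocking_instance, blocking_session, seconds_in_wait
from   gv$session
where  blocking_session is not null
order  by seconds_in_wait desc;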