Re: Adding column and updating a huge partitioned table.

From: Kenny Payton <>
Date: Wed, 7 Jan 2015 22:14:53 -0500
Message-Id: <>

Do you have global indexes? If not the following might work nicely.

Add empty column to the table
Stop DML on partition A
CTAS partition A to create a new table B, populating the new column with the function result as part of the CTAS select
Create necessary indexes on new table B
Generate stats on table B
exchange partition A with table B
Drop table B (after the exchange it holds the data from the old partition, while the partition holds the CTAS result. Verify before dropping)

Repeat for each partition. Use parallel during CTAS and create indexes as necessary.
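A rough sketch of those steps in Oracle SQL, shown for one partition. All names here are hypothetical (table BIG_TAB, partition P01, staging table BIG_TAB_STAGE, function NEW_COL_FN, columns COL1/COL2), and the parallel degree is just an example:

```sql
-- 1. Add the empty column once, on the partitioned table.
ALTER TABLE big_tab ADD (new_col NUMBER);

-- Per partition (stop DML on it first), e.g. P01:

-- 2. CTAS the partition, populating the new column via the function.
--    List the existing columns explicitly so new_col appears only once.
CREATE TABLE big_tab_stage PARALLEL 8 NOLOGGING AS
SELECT col1, col2, /* ... remaining existing columns ... */
       new_col_fn(col1) AS new_col
FROM   big_tab PARTITION (p01);

-- 3. Create the necessary indexes on the staging table.
CREATE INDEX big_tab_stage_ix ON big_tab_stage (new_col) PARALLEL 8;

-- 4. Gather stats on the staging table.
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'BIG_TAB_STAGE');

-- 5. Swap segments: partition P01 and the staging table trade places.
ALTER TABLE big_tab EXCHANGE PARTITION p01 WITH TABLE big_tab_stage
  INCLUDING INDEXES WITHOUT VALIDATION;

-- 6. Verify the exchange, then drop the staging table,
--    which now holds the old (pre-function) partition data.
DROP TABLE big_tab_stage;
```

Since the exchange is a data-dictionary operation rather than a data move, each partition's downtime is roughly the CTAS plus index build, not a long UPDATE.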

Depending on the size of the newly added column and the free space in your blocks, this approach would also prevent row chaining.


> On Jan 7, 2015, at 9:46 PM, Michael Cunningham <> wrote:
> It sounds crazy to me, but I thought I'd give it a try.
> I have a table that is over 1.2 TB with 64 hash partitions.
> If you have a great method for this I'd love to hear it. Oracle enterprise on Linux 6.5 with SSD storage.
> I need to add a column and populate the column with data from a function. It's a data warehouse so I can prevent DML during this time (one partition at a time), but cannot prevent DML during the update of all 64 partitions at once.
> I also need an index on this new column.
> I sure would appreciate all the input I can get on this topic because I feel like I'm missing something and there must be a better way of doing this.
> During a test it took me 6 hours to update the data in a single partition in a test environment. That comes out to around 700 hours for all 64 partitions. Please help.
> --
> Michael Cunningham

Received on Thu Jan 08 2015 - 04:14:53 CET
