
DBA Blogs

Oracle Database: Script to Purge aud$ Table Using dbms_audit_mgmt Package

Pythian Group - Tue, 2014-09-02 07:59

With increasing requirements on database security, auditing is now part of most production databases. The growth of the Oracle database audit table "aud$" always adds spice to a DBA's life. Since the growth of this table directly impacts database performance, it holds a special place in every DBA's heart.

Traditionally we follow several methods to purge old data from this table, and most of them require application downtime. Oracle introduced a package named "DBMS_AUDIT_MGMT", which gives the DBA much more control over the management of audit records.

I tested this package on my 11gR2 test database. The package can also be used on earlier releases, but those versions require back-port patches (Patch 6989148 and Patch 6996030); on recent releases it is installed by default. I set the audit_trail parameter to db_extended and enabled database auditing. I then moved the audit table and its associated LOB segments to a dedicated tablespace named "AUDTBS", and confirmed that auditing continued to work after the tablespace change.

Thanks to MOS notes 1362997.1 and 1508787.1, I successfully tested this purge procedure. I configured database scheduler jobs to run once every 12 hours and purge data older than 7 days from the aud$ table. Here is the script (purge_job.sql) I used to configure the required jobs.

Script: purge_job.sql ==> Run this script as the SYS database user.

prompt start of the script
set serveroutput on
prompt Change based on our customization done
update dam_config_param$ set string_value='AUDTBS' where audit_trail_type#=1 and param_id=22;
commit;

prompt First Step: init cleanup (if not already)

begin
  if not dbms_audit_mgmt.is_cleanup_initialized(dbms_audit_mgmt.AUDIT_TRAIL_AUD_STD)
  then
    dbms_output.put_line('Calling DBMS_AUDIT_MGMT.INIT_CLEANUP');
    dbms_audit_mgmt.init_cleanup(
      audit_trail_type         => dbms_audit_mgmt.AUDIT_TRAIL_AUD_STD,
      default_cleanup_interval => 12);
  else
    dbms_output.put_line('Cleanup for STD was already initialized');
  end if;
end;
/

prompt revert back to default values again
update dam_config_param$ set string_value='SYSAUX' where audit_trail_type#=1 and param_id=22;
commit;

prompt set last archive timestamp to older than 7 days

begin
  dbms_audit_mgmt.set_last_archive_timestamp(
    audit_trail_type  => dbms_audit_mgmt.AUDIT_TRAIL_AUD_STD,
    last_archive_time => sysdate - 7);
end;
/

prompt setup a purge job

begin
  dbms_audit_mgmt.create_purge_job(
    audit_trail_type           => dbms_audit_mgmt.AUDIT_TRAIL_AUD_STD,
    audit_trail_purge_interval => 12,
    audit_trail_purge_name     => 'Standard_Audit_Trail_PJ',
    use_last_arch_timestamp    => TRUE);
end;
/

prompt call DBMS_AUDIT_MGMT.SET_LAST_ARCHIVE_TIMESTAMP regularly to advance the last archive timestamp

create or replace procedure set_archive_retention
(retention in number default 7) as
begin
  dbms_audit_mgmt.set_last_archive_timestamp(
    audit_trail_type  => dbms_audit_mgmt.AUDIT_TRAIL_AUD_STD,
    last_archive_time => sysdate - retention);
end;
/

begin
  dbms_scheduler.create_job (
    job_name            => 'advance_archive_timestamp',
    job_type            => 'STORED_PROCEDURE',
    job_action          => 'SET_ARCHIVE_RETENTION',
    number_of_arguments => 1,
    start_date          => SYSDATE,
    repeat_interval     => 'freq=hourly;interval=12',
    enabled             => FALSE,
    auto_drop           => FALSE);

  dbms_scheduler.set_job_argument_value (
    job_name          => 'advance_archive_timestamp',
    argument_position => 1,
    argument_value    => 7);

  dbms_scheduler.enable('advance_archive_timestamp');

  dbms_scheduler.run_job (
    job_name            => 'advance_archive_timestamp',
    use_current_session => FALSE);
end;
/

prompt End of the script

To verify the purge status and the status of the configured jobs, execute the following queries.

SQL> select min(NTIMESTAMP#) from aud$;
SQL> select LAST_START_DATE, LAST_RUN_DURATION, NEXT_RUN_DATE from dba_scheduler_jobs where job_name='STANDARD_AUDIT_TRAIL_PJ';
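If you also want to confirm the cleanup configuration itself, the audit management settings and the last archive timestamp the purge job will honor can be inspected through the data dictionary. The views below exist on 11gR2; this is a supplementary check, not part of the original script:

```sql
SQL> select parameter_name, parameter_value
     from dba_audit_mgmt_config_params
     where audit_trail = 'STANDARD AUDIT TRAIL';

SQL> select audit_trail, last_archive_ts
     from dba_audit_mgmt_last_arch_ts;
```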

We can certainly customize this script based on the requirements, which differ for each database. But test it on a cloned database before configuring these purge jobs on the production database.


Categories: DBA Blogs

Azure Storage: Creating, Maintaining, and Deleting SQL Server Backups

Pythian Group - Tue, 2014-09-02 07:50

This post covers how to create, monitor, maintain, and automatically delete SQL Server backups.

What is Azure Storage and why should you use it?

Microsoft Windows Azure is Microsoft’s cloud offering for offsite storage. It offers the ability to seamlessly enable massive storage, Virtual Servers, SQL Server database instances, and many other options without having to worry about the hardware or maintenance in house.

Many companies are currently using Azure as offsite storage for their nightly production backups. A company chooses one of the 15 datacenters that Microsoft has around the world. This datacenter automatically and transparently maintains three copies of each backup file, and also replicates to a second datacenter in a different geographic location. The replication is not real-time, but in general there will always be six copies of each backup file available in case of an emergency.

In the event the Primary datacenter fails, Microsoft will decide when or if to failover to the Secondary datacenter. However, in the coming months they plan to roll out an API which would allow individual clients to make that decision.

Note that SQL Server 2012 SP1 CU6 or later is required.

The current pricing is about $90 per month per TB of storage used.

Accessing the Azure front end
To access the Azure front end:

  • Open Internet Explorer and navigate to the Azure portal.
    • You will be prompted to log in with a Microsoft MSDN account.
    • The Azure administrator in your company should have granted this account access.
  • Click on the Azure Portal icon to bring up the Azure Main Page for your account.
  • Click on the Storage icon on the left.
  • Drill down into your storage account to open the Storage Main Page.
  • Click on the Containers tab.
  • Drill down into your container. This is a list of all of the backups being written to the Production container.

The backups are ordered by their name, and unfortunately there is no way to sort by Date Modified or other field. You can see more backups than are listed on the front page by clicking on the arrow at the bottom left.

Checking the status of an Azure backup file
To check the status of a backup to Azure storage, you have two options:
1. The Azure Portal
2. SQL Server Management Studio

To use the Azure Portal, navigate to the container the backup is being written to, and find it in the list of files. If the size is 1 TB, then the backup is currently running. Using the Azure Portal, you can only see if a backup file is being created. You cannot see how much time is remaining until it is done.

To use SQL Server Management Studio, open SSMS and connect to the instance running the backups. Run the following command to get percent done & estimated time to completion:

SELECT r.session_id,
       CONVERT(NUMERIC(6,2), r.percent_complete) AS [PercentComplete],
       CONVERT(VARCHAR(20), DATEADD(ms, r.estimated_completion_time, GETDATE()), 20) AS [ETA Completion Time],
       CONVERT(NUMERIC(10,2), r.total_elapsed_time/1000.0/60.0) AS [Elapsed Min],
       CONVERT(NUMERIC(10,2), r.estimated_completion_time/1000.0/60.0) AS [ETA Min],
       CONVERT(NUMERIC(10,2), r.estimated_completion_time/1000.0/60.0/60.0) AS [ETA Hours],
       CONVERT(VARCHAR(1000), (SELECT SUBSTRING(text, r.statement_start_offset/2,
           CASE WHEN r.statement_end_offset = -1 THEN 1000
                ELSE (r.statement_end_offset - r.statement_start_offset)/2 END)
         FROM sys.dm_exec_sql_text(sql_handle))) AS [SQL Statement]
FROM sys.dm_exec_requests r
WHERE r.command LIKE 'BACKUP%'

Backing up to an Azure storage container
In order to write a backup file to a Microsoft Azure storage container, two items are needed:

      • A SQL Server credential to connect to the Azure Account
        • This requires the Account Name and a Storage Key.
      • The container URL

To find the Azure Account Name and Storage Key, navigate to the Azure Main Page. At the bottom of the page, there is a "Manage Access Keys" icon. Clicking on it brings up the Account Name and the Primary & Secondary Access Keys.

1. Copy the Account Name and one of the Access Keys. DO NOT REGENERATE THE KEYS.
2. Copy the URL.
3. Open SQL Server Management Studio and connect to the RLPRODMSSQL01 instance.
4. From a new query window, run the following command:

CREATE CREDENTIAL AzureBackups
WITH IDENTITY = 'Account Name',
SECRET = 'Storage Key'

5. Run this same command on all instances that will back up to this container.
6. Run the following command to back up a database to the Azure Storage container:

BACKUP DATABASE [Database Name]
TO URL = 'Container URL + Backup File Name'
WITH CREDENTIAL = 'AzureBackups'
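For illustration only, here is what that backup command might look like with a hypothetical storage account (myaccount), container (prodbackups), and database name; substitute your own values:

```sql
BACKUP DATABASE [AdventureWorks]
TO URL = 'https://myaccount.blob.core.windows.net/prodbackups/AdventureWorks_20140902.bak'
WITH CREDENTIAL = 'AzureBackups', COMPRESSION, STATS = 5;
```

The URL is simply the container URL with the desired backup file name appended; COMPRESSION and STATS are optional but useful for reducing transfer size and tracking progress.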

Restoring from an Azure Storage Container
To restore from an Azure Storage Container, two items are needed:

1. A credential to connect to the Azure Account (See steps 1-4 of Backing up to an Azure Storage Container)
2. The backup file URL

To get the backup file URL, navigate to the container where the backup file is stored. The URL is to the left of the backup name.

1. Copy the URL.
2. Run the following command on the instance you want to restore the database onto:

RESTORE DATABASE [Database Name]
FROM URL = 'Backup File URL'
WITH CREDENTIAL = 'AzureBackups'

Deleting SQL Server backups from Azure storage

In SQL Server 2012, Azure storage is not fully integrated with Maintenance Plans, and deleting old backups is a manual process. This causes issues, because there is no way to quickly delete a batch of backups, and if this is forgotten for a few days, the cost of storage begins to rise quickly.

I have written the code below to create an executable that connects to the Azure storage container and deletes any backups older than x days.

In addition, the code can check for any backups that have a “locked lease”, break the lease, and then delete them.

The parameters for the executable are:

  • Parameter 1 – MS Azure Account Name (string)
  • Parameter 2 – MS Azure Storage Key (string)
  • Parameter 3 – Azure Container Name (string)
  • Parameter 4 – Number of days backups to retain (positive integer)
  • Parameter 5 – File type to delete (.bak, .trn, etc.)
  • Parameter 6 – Delete backups with locked lease? (True/False)
    • Note that a True value for parameter 6 will cause the executable to ignore parameters 4 and 5.
    • This is meant to be run after a failed backup job.
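As an illustration (the account name, key, and container below are placeholders, not values from this post), a run that deletes .bak files older than 7 days would look like this:

```
DeleteAzureBackups.exe MyAccount MyStorageKey MyContainer 7 .bak False
```

And a cleanup of backups with locked leases after a failed backup job, where parameters 4 and 5 are ignored:

```
DeleteAzureBackups.exe MyAccount MyStorageKey MyContainer 0 .bak True
```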

In order to work, the executable will need the Windows Azure Storage Client Library.

Alternatively, you can download and run the executable using this ZIP file.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

namespace DeleteAzureBackups
{
    class Program
    {
        static void Main(string[] args)
        {
            // Check for the help switch before validating the argument count,
            // otherwise the help text would be unreachable.
            if (args.Length > 0 && (args[0] == "?" || args[0] == "help" || args[0] == "h"))
            {
                Console.WriteLine("Pythian Azure Backup Delete Utility");
                Console.WriteLine("Parameter 1 : String : MS Azure Account Name");
                Console.WriteLine("Parameter 2 : String : MS Azure Account Key");
                Console.WriteLine("Parameter 3 : String : Container Name");
                Console.WriteLine("Parameter 4 : Positive Integer : Number of days to retain backups");
                Console.WriteLine("Parameter 5 : String : File type to delete (.bak, .trn, etc.)");
                Console.WriteLine("Parameter 6 : True/False : Delete backups with locked leases (will ignore Parameters 4 & 5)");
                return;
            }

            if (args.Length != 6) { Console.WriteLine("Please run with correct number of parameters. Type ? for help."); return; }

            try
            {
                // Account name and key.
                string accountName = args[0].ToLower();  // Account Name
                string accountKey = args[1];             // Account Key
                string containerName = args[2];          // Container Name
                int numberOfDays = Int16.Parse(args[3]); // Number of days before deleting
                string fileType = args[4];               // File type to delete
                bool deleteLockedBlobs = bool.Parse(args[5]);

                CloudBlobContainer container = openConnection(accountName, accountKey, containerName);
                if (container == null) { return; }

                if (!deleteLockedBlobs)
                { deleteAzureBackups(container, numberOfDays, fileType); }
                else
                { unlockLeasedBlobs(container); }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }

        static CloudBlobContainer openConnection(string accountName, string accountKey, string containerName)
        {
            try
            {
                // Get a reference to the storage account, with authentication credentials.
                StorageCredentials credentials = new StorageCredentials(accountName, accountKey);
                CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, true);

                // Create a new client object.
                CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

                // Retrieve a reference to a container.
                CloudBlobContainer container = blobClient.GetContainerReference(containerName);

                return container;
            }
            catch (StorageException ex)
            {
                Console.WriteLine("Failed to create connection to MS Azure Storage.");
                Console.WriteLine(ex.Message);
                return null;
            }
        }

        static void deleteAzureBackups(CloudBlobContainer container, int numberOfDays, string fileType)
        {
            DateTimeOffset now = DateTimeOffset.Now;

            foreach (IListBlobItem item in container.ListBlobs(null, false))
            {
                CloudPageBlob blob = (CloudPageBlob)item;

                // If the date the blob was last modified is more than x days out
                // and it has the requested extension, then it gets deleted.
                if ((now - blob.Properties.LastModified.Value).Days >= numberOfDays &&
                    blob.Name.Substring(blob.Name.Length - fileType.Length) == fileType)
                {
                    deleteBlob(blob);
                }
            }
        }

        static void unlockLeasedBlobs(CloudBlobContainer container)
        {
            foreach (IListBlobItem item in container.ListBlobs(null, false))
            {
                CloudPageBlob blob = (CloudPageBlob)item;

                if (blob.Properties.LeaseStatus == LeaseStatus.Locked)
                {
                    try
                    {
                        Console.WriteLine("Breaking lease on {0} blob.", blob.Name);
                        blob.BreakLease(new TimeSpan(), null, null, null);
                        Console.WriteLine("Successfully broken lease on {0} blob.", blob.Name);

                        deleteBlob(blob);
                    }
                    catch (StorageException ex)
                    {
                        Console.WriteLine("Failed to break lease on {0} blob.", blob.Name);
                        Console.WriteLine(ex.Message);
                    }
                }
            }
        }

        static void deleteBlob(CloudPageBlob blob)
        {
            try
            {
                Console.WriteLine("Attempting to delete {0}", blob.Name);
                blob.Delete();
                Console.WriteLine("Successfully deleted {0}", blob.Name);
            }
            catch (StorageException ex)
            {
                Console.WriteLine("Failed to delete {0}.", blob.Name);
                Console.WriteLine(ex.Message);
            }
        }
    }
}

Categories: DBA Blogs

Webcast - Oracle Multitenant Option

On their road to adoption of Oracle Database 12c, many are considering the Multitenant Database Architecture, where a multitenant container database can hold many pluggable databases, enabling...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Windows 7 error “key not valid for use in specified state”

The Oracle Instructor - Sun, 2014-08-31 00:37

When you see that error upon trying to install or upgrade something on your Windows 7 64-bit machine, chances are that it is caused by a Windows Security update that you need to uninstall. There is probably no point in messing around with the registry or the application that you want to upgrade. Instead, remove the Windows update KB2918614 like this:

Open the control panel, then click Windows Update.

Click Update History and then Installed Updates.

Scroll down to Microsoft Windows, look for KB2918614, and uninstall it (I had already removed it before I took the screenshot).

Finally, hide that update so you don't get it installed again later on.

I'm using a corporate notebook with automatic Windows security updates coming in from time to time, and I encountered this problem while trying to upgrade VirtualBox to version 4.3.12. It is not a VirtualBox issue, though; other installs or upgrades may fail for the same reason. For me, this was a serious problem, because I rely on virtual machines for many demonstrations. Kudos to the forums! They helped me resolve the problem within a day. Thank you once again, guys! :-)
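As an alternative to clicking through the control panel, the same update can usually be removed from an elevated command prompt with the built-in wusa utility (the KB number is the one named above):

```
wusa /uninstall /kb:2918614
```

A reboot may be required afterwards; hiding the update so it does not reinstall still has to be done through Windows Update.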

Tagged: #KB2918614
Categories: DBA Blogs

My Speaking Schedule for Oracle Open World 2014

Galo Balda's Blog - Wed, 2014-08-27 12:22

A quick post to let you know about the two presentations that I’ll be doing at Oracle Open World 2014.

Session ID:        UGF4482
Session Title:     "Getting Started with SQL Pattern Matching in Oracle Database 12c"
Venue / Room:      Moscone South – 301
Date and Time:     9/28/14, 13:30 – 14:15

Session ID:        CON4493
Session Title:     "Regular Expressions in Oracle Database 101"
Venue / Room:      Moscone South – 303
Date and Time:     10/2/14, 13:15 – 14:00

As usual, you might have to check before the session to make sure the room has not changed.

I hope to see you there.

Filed under: 12C, Open World, Oracle, Regular Expressions, Row Pattern Matching, SQL Tagged: 12C, Open World, Oracle, Regular Expressions, Row Pattern Matching, SQL
Categories: DBA Blogs

Hot off the press : Latest Release of Oracle Enterprise Manager 12c (R4)

Pankaj Chandiramani - Tue, 2014-06-03 06:53

Read more here about the PRESS RELEASE:  Oracle Delivers Latest Release of Oracle Enterprise Manager 12c

Service Catalog for Database and Middleware as a Service; Enhanced
Database and Middleware Management Help Drive Enterprise-Scale Private
Cloud Adoption

In the coming weeks, I will be covering topics including:

  1. DbaaS Service Catalog incorporating High Availability and Disaster Recovery

  2. New Rapid Start kit

  3. Other new Features 

Stay Tuned !

Categories: DBA Blogs

Interesting info-graphics on Data-center / DB-Manageability

Pankaj Chandiramani - Mon, 2014-05-19 04:21


Categories: DBA Blogs

Tackling the challenge of provisioning databases in an agile datacenter

Pankaj Chandiramani - Wed, 2014-05-14 01:03

One of the key tasks that a DBA performs repeatedly is provisioning of databases, which also happens to be one of the top 10 database challenges per the IOUG survey.

Most of the challenge comes in the form of either a lack of standardization or the process being long and error-prone. This is where Enterprise Manager 12c can help, by making provisioning a standardized process using profiles and lock-downs, plus role and access separation: a lead DBA can lock certain properties of a database (such as the character set, Oracle Home location, or SGA), and junior DBAs can't change them during provisioning. The image below describes the solution.

In short:

  • It's fast

  • It's easy

  • And you have complete control over the lifecycle of your dev and production resources.

I actually wanted to show step-by-step details on how to provision an 11204 RAC database using the Provisioning feature of DBLM, but today I saw a great post by MaaZ Anjum that does the same, so I am going to refer you to his blog here:

Patch and Provision in EM12c: #5 Provision a Real Application Cluster Database


Categories: DBA Blogs

Nationwide Deploys Database Applications 600% Faster

Pankaj Chandiramani - Mon, 2014-04-28 03:37


Heath Carfrey of Nationwide, a leading global insurance and financial services organization, discusses how Nationwide saves time and effort in database provisioning with Oracle Enterprise Manager.

Key points:

  1. Provisioning databases using profiles (aka gold images)

  2. Automated patching

  3. Config/compliance tracking

Categories: DBA Blogs

EMCLI setup

Pankaj Chandiramani - Mon, 2014-04-28 02:15

A quick note on how to install EMCLI, which is used for various CLI operations against EM. I wanted to test some database provisioning automation via EMCLI, so I set it up as follows.

To set up EMCLI on the host, follow these steps:

1. Download emcliadvancedkit.jar from the OMS using the URL https://<omshost>:<omsport>/em/public_lib_download/emcli/kit/emcliadvancedkit.jar
2. Set your JAVA_HOME environment variable and ensure that it is part of your PATH. You must be running Java 1.6.0_43 or greater. For example:
   setenv JAVA_HOME /usr/local/packages/j2sdk
   setenv PATH $JAVA_HOME/bin:$PATH
3. You can install EMCLI with the scripting option in any directory, either on the machine on which the OMS is running or on any machine on your network (download emcliadvancedkit.jar to that machine first):
   java -jar emcliadvancedkit.jar client -install_dir=<emcli client dir>
4. Run "emcli help sync" from the EMCLI home (the directory where you installed EMCLI) for instructions on how to use the "sync" verb to configure the client for a particular OMS.
5. Navigate to the Setup menu, then Command Line Interface. See the Enterprise Manager Command Line Tools Download page for details on setting up EMCLI.
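Putting the steps together, an end-to-end session might look like the sketch below. The install directory, host name, and port are placeholders of my own choosing; the "emcli setup" verb with -url, -username, and -trustall is the documented way to point the client at an OMS:

```
setenv JAVA_HOME /usr/local/packages/j2sdk
setenv PATH $JAVA_HOME/bin:$PATH
java -jar emcliadvancedkit.jar client -install_dir=/u01/app/emcli
cd /u01/app/emcli
./emcli help sync
./emcli setup -url=https://omshost.example.com:7799/em -username=sysman -trustall
```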

Categories: DBA Blogs

Webcast: Database Cloning in Minutes using Oracle Enterprise Manager 12c Database as a Service Snap Clone

Pankaj Chandiramani - Thu, 2014-04-17 04:02

Since the demands from the business for IT services are non-stop, creating copies of production databases in order to develop, test, and deploy new applications can be labor intensive and time consuming. Users may also need to preserve private copies of the database, so that they can go back to a point prior to when a change was made in order to diagnose potential issues. Using Snap Clone, users can create multiple snapshots of the database and "time travel" across these snapshots to access data from any point in time.

Join us for an in-depth technical webcast and learn how Snap Clone, a capability of Oracle Cloud Management Pack for Oracle Database, can fundamentally improve the efficiency and agility of administrators and QA engineers while saving CAPEX on storage. Benefits include:

  • Agile provisioning (~ 2 minutes to provision a 1 TB database)

  • Over 90% storage savings

  • Reduced administrative overhead from integrated lifecycle management


April 24 — 10:00 a.m. PT | 1:00 p.m. ET

May 8 — 7:00 a.m. PT | 10:00 a.m. ET | 4:00 p.m. CET

May 22 — 10:00 a.m. PT | 1:00 p.m. ET

Categories: DBA Blogs
