
Feed aggregator

Advanced Oracle Troubleshooting v2.5 (with 12c stuff too)

Tanel Poder - Fri, 2015-10-09 01:58

It took a while (1.5 years since my last class – I’ve been busy!), but I am ready with my Advanced Oracle Troubleshooting training (version 2.5) that has plenty of updates, including some more modern DB kernel tracing & ASH stuff and of course Oracle 12c topics!

The online training will take place on 16-20 November & 14-18 December 2015 (Part 1 and Part 2).

The latest TOC is below:

Seminar registration details:

A notable improvement in AOT v2.5: attendees will now get downloadable video recordings after the sessions for personal use! So, no crappy streaming with a 14-day expiry date - you can download the video MP4 files straight to your computer or tablet and keep them forever!

I won’t be doing any other classes this year, but there will be some more (pleasant) surprises coming next year ;-)

See you soon!

NB! After a 1.5 year break, this year’s only Advanced Oracle Troubleshooting training class (updated with Oracle 12c content) takes place on 16-20 November & 14-18 December 2015, so sign up now if you plan to attend this year!


Oracle Database Developer Choice Awards (2nd)

Denes Kubicek - Thu, 2015-10-08 23:49
I was just looking at the voting results for ORDS and APEX. In the ORDS category Dietmar Aust is leading, and in the APEX category it is Morten Braten. So far so good. These men deserve the award for sure - they did a lot for the community, and most of us are using (or have used) their solutions. Think of Morten's PL/SQL Utility Library or Dietmar's PDF Printing...

The only thing I have a problem with is the absolutely microscopic number of votes. The most votes in the ORDS category went to Dietmar - 134. In APEX it is 98 votes for Morten. Even Jari Laine, who is very well known in the APEX community and has helped numerous people with his solutions, has received altogether 103 votes - obviously almost half of them as down-voting. From my point of view this is incredible. But if I think back and compare with my own experience, then this is actually normal. In the last seven years I have received more than 9,000 emails from the community asking either for access to my workspace or my Demo Application, or asking for an APEX solution of some kind. As a result I have created almost 4,000 accounts and answered 95% of those emails. But altogether, maybe 50 recipients responded to say thank you.


Shame on us. We are lazy and cynical. We are constantly talking about how big and how strong the APEX community is. Maybe that is because there are a few people out there like Dietmar, Jari, Jürgen Schuster, Morten and a few others. The rest seem to be there only when they need something.

Categories: Development

OUAF Feature: New Help Engine

Anthony Shorten - Thu, 2015-10-08 19:26

Oracle Utilities Application Framework V4. includes a new help engine and changes to the organization of help. The Oracle Help for Web engine is now used to display help within the product. This engine has the following exciting features for customers:

  • The engine provides a more consistent, cross-browser interface for displaying help.
  • A new toolbar allows printing, bookmarking (in the browser) and even giving feedback on the help.

  • All of the documentation for the product in one place. Framework and product documentation are now published together and available from the same interface.

Example Help

  • More flexible and detailed indexing.

  • Flexible searching across all products installed in an environment, including Boolean searches and scoring.


This new engine is only available to customers using Oracle WebLogic. Customers using IBM WebSphere will use the legacy help engine.

OUAF Feature: Component Installs

Anthony Shorten - Thu, 2015-10-08 18:39

One of the new features of the Oracle Utilities Application Framework is Component Installation. The Oracle Utilities Application Framework has a multi-channel, multi-tiered architecture. In the past, installing individual components in a distributed manner required additional manual effort. The Oracle Utilities Application Framework V4. release introduces a new channel (or role) based installation capability with the following features:

  • Ability to identify the channels to install on a particular installation. The component installation has three distinct channels that can be installed together in combination and/or separately:
    • Online - Online Web Application (includes Business Application Server)
    • Integration - Web Services and Message Driven Bean (includes Business Application Server)
    • Batch - Batch Threadpool/submitter/Cluster
  • When roles are chosen, only those components are available for configuration, and only their menu options are visible during the installation process. The product utilities also recognize only the active roles in the installation.
  • A new Environment Identifier has been introduced which allows disparate installations to be combined into a virtual environment. This is used by the patching utilities and also Oracle Enterprise Manager to collate targets into an Oracle Utilities Environment target. For example, you can have a number of installations across multiple machines that can be combined and managed as an environment. This identifier can be altered after installation to connect or disconnect an installation from an environment.
  • It is possible to change the roles AFTER installation. This represents maximum flexibility. To enable or disable a role, simply start the configureEnv utility and indicate the changed roles. If a new role is added, the configureEnv utility will prompt for the configuration settings for the new role. If you have removed a role, the components will be disabled. Customers using native installations, directly or via Oracle Enterprise Manager, may have to manually deploy or undeploy the generated files from the server domain.
  • If a Batch Server is the only role, the parameters for the J2EE Web Application Server are not needed.

There are new settings in the ENVIRON.INI that cover the new facilities:

  • ENVIRONMENT_ID - Generated Environment Identifier
  • SERVER_ROLES - List of active server roles for this installation
  • SERVER_ROLE_BATCH - Whether the BATCH server role is a valid role for this environment (used internally)
  • SERVER_ROLE_INTEGRATION - Whether the INTEGRATION server role is a valid role for this environment (used internally)
  • SERVER_ROLE_ONLINE - Whether the ONLINE server role is a valid role for this environment (used internally)
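As an illustration only, a fragment of ENVIRON.INI with these settings might look like the following. The setting names come from the list above, but the value formats here are assumptions for illustration, not taken from the product documentation:

```ini
; Hypothetical ENVIRON.INI fragment - setting names are from this post,
; the values shown are illustrative assumptions only
ENVIRONMENT_ID=1234567890
SERVER_ROLES=online,batch
SERVER_ROLE_ONLINE=true
SERVER_ROLE_BATCH=true
SERVER_ROLE_INTEGRATION=false
```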

This is part of a larger enhancement that will simplify the overall deployment options available for installation now and in the future.

Refer to the Installation Guide and Server Administration Guide for more details of the settings.

SQL Server Fast Food (or Objects)

Pythian Group - Thu, 2015-10-08 13:55


In an environment with a high number of databases on one server, or on many servers, something as simple as adding a user account can be time-consuming. You have the option of using the GUI in SQL Server Management Studio (SSMS); if it were a rush to get something in place for 8 or 10 databases, I can see possibly doing that to get it done. You could also do this with a bit of typing using T-SQL and a cursor, or that famed, undocumented procedure sp_MSForeachdb.

I recently had a request from a customer that fell into the above scenario, and since I used PowerShell to handle the request, I wanted to show how I went about getting it done. I think this is a situation where either T-SQL or PowerShell would work; I just picked the one I wanted to use.

Breaking this down, these are the basic steps I had to perform:

  1. Check for the login
  2. Create user
  3. Create role
  4. Assign INSERT and UPDATE to the role
  5. Add the user to the database role

All in all, that is not too much if you understand how PowerShell and SMO work for you. If you are not familiar with PowerShell, you can reference the recent series I published on the Pillars of PowerShell that should help you get started. When I was learning PowerShell, I always found I learned best by reading through other folks' scripts to find out how stuff was done. You can find the full script at the end of this post if you want to skip right to it; I won't be offended.

One thing I always find useful with SMO is remembering that MSDN documents everything for the namespace Microsoft.SqlServer.Management.Smo. If you spend the time to review it and at least get familiar with how the documentation is laid out, using SMO and finding answers becomes much easier.


The Bun

As always the first step is going to be to create the object for the instance or server:

$s = New-Object Microsoft.SqlServer.Management.Smo.Server $server

To verify the login exists, I utilized one of the common methods available on the string type, Contains(). You generally use the Get-Member cmdlet to find the various methods available for an object, but this particular one does not show up if you run: $s.Logins | Get-Member. There is a set of methods that follows each value type (e.g. string, integer, date), and Contains() is one that comes with the string type. There are two ways I have found to discover these types of methods:

  1. Pass the value type to Get-Member [e.g. “A string” | Get-Member]
  2. Use tab completion [e.g. Type out “$s.Logins.” with the period on the end, and then just start hitting the tab key]

If you want a bit of exercise, you can see if you can add code to actually create the login if it does not exist. I was only working with one server in this case, so I did not bother adding it this time around.

Being that I need to add these objects to each database I start out by getting the collection of databases on the instance:

$dbList = $s.Databases

From there I am simply going to iterate over each database that will be stored in the variable: $d.


The Meat

The first thing I want to do is verify the database is online and accessible; each database (e.g. $d) has a property called “isAccessible” that simply returns true or false. The T-SQL equivalent would be checking the value of the state column in sys.databases. One shortcut you will see in PowerShell at times is the use of an exclamation point ( ! ) before an object in an if statement; this basically tells it to check for false to be returned:

if (!$d.isAccessible) {…}
#equates to:
if ($d.isAccessible -eq $false) {…}

Now that I know the database is online I need to create and modify some objects in the database. When dealing with objects such as user accounts, roles, tables, etc. in a database, in PowerShell these are going to be classes under the SMO namespace. So in this script I am going to use the following classes for the user and database role:

Under the User and Database Role classes you will see the Constructors section that shows what is needed to create each object. For example, digging into the link for the database role constructor, I see it takes two parameters:

  1. Microsoft.SqlServer.Management.Smo.Database object
  2. a string value of what you want to call the role.

The $d variable is my database object, so that is covered and then I wrote the function to pass the database role name into the $roleName:

$r = New-Object Microsoft.SqlServer.Management.Smo.DatabaseRole($d,$roleName)

I continued through the documentation for the database role class, and in the Properties list I see that some have a description of “Gets the…” and some have “Gets or sets…”. This basically means “Gets the…” = read-only property, and “Gets or sets…” = the property can be read or modified. When you use CREATE ROLE via T-SQL, you provide the name of the role and, optionally, its owner. I passed the name of the role when creating the database role object ($r), so I just need to set the owner and then call the method to actually create it:

$r.Owner = 'dbo'
$r.Create()
The Ingredients

The only thing I needed to do in this situation was set INSERT and UPDATE permissions, at the schema level, to handle the client's requirements. Assigning permissions in SMO took me a while to figure out; it accounted for the majority of the time spent writing this script. There are two additional classes I need to handle setting permissions on a schema:

I create the object for the schema according to the documented constructor. Within each class that deals with a specific securable object in a database, you should find a Grant() method; in my case what I need is Grant(ObjectPermissionSet, String[ ]). The first parameter is an object that contains the permissions I want to assign to this role, which is where the second class above comes into play.

The properties for the ObjectPermissionSet class are the permissions I can assign via SMO to an object in a database, and simply setting them to true will assign that permission:

$dboSchema = New-Object Microsoft.SqlServer.Management.Smo.Schema($d,'dbo')
$perms = New-Object Microsoft.SqlServer.Management.Smo.ObjectPermissionSet
$perms.Insert = $true
$perms.Update = $true
$dboSchema.Grant($perms,$roleName)

Then, to finish it off, the last line in the script simply adds the user as a member of the database role created. You can find the full script below for your pleasure. Enjoy!


Full Script
Import-Module SQLPS -DisableNameChecking -NoClobber

function Create-RoleUserInAllDatabases {
<#
.SYNOPSIS
    Create a database role, assign permissions, create a user, and add the user to the role.
.DESCRIPTION
    Iterates through all databases that are online and creates the role and user
    (if the login exists). Assigns INSERT and UPDATE permissions to the role created.
.PARAMETER server
    String. Name of the instance, or server name for a default instance.
.PARAMETER loginToUse
    String. Current login on the instance; can be a Windows or SQL login.
.PARAMETER roleName
    String. Name of the role you want to create.
.NOTES
    You can find the other properties that can be set on the MSDN site.
.EXAMPLE
    Create the role AppRole and add "SQLLogin1" as a member of that role:
    Create-RoleUserInAllDatabases -server MyServer -loginToUse SQLLogin1 -roleName AppRole
#>
    [CmdletBinding()]
    param (
        [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
        [string]$server,
        [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
        [string]$loginToUse,
        [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
        [string]$roleName
    )

    $s = New-Object Microsoft.SqlServer.Management.Smo.Server $server

    # Make sure the login already exists
    if (!($s.Logins.Contains($loginToUse))) {
        Write-Warning "$loginToUse does not exist on $server"
        return
    }
    $dbList = $s.Databases

    foreach ($d in $dbList) {
        # Skip databases that are not accessible
        if (!$d.isAccessible) {
            Write-Verbose "$($d.Name) is offline"
        }
        else {
            Write-Verbose "******WORKING ON*****************$d******************"
            # Check if the user already exists in the database
            if (!($d.Users.Contains($loginToUse))) {
                Write-Verbose "$loginToUse does not exist, creating"
                $u = New-Object Microsoft.SqlServer.Management.Smo.User ($d,$loginToUse)
                $u.Login = $loginToUse
                $u.Create()
            }
            else {
                Write-Verbose "$loginToUse already exists, skipping step"
            }

            # Check if the role already exists in the database
            if (!($d.Roles.Contains($roleName))) {
                Write-Verbose "$roleName does not exist, creating"
                $r = New-Object Microsoft.SqlServer.Management.Smo.DatabaseRole($d,$roleName)
                $r.Owner = 'dbo'
                $r.Create()
            }
            else {
                Write-Verbose "$roleName already exists, skipping step"
            } #end check if role exists

            # Grant INSERT and UPDATE on the dbo schema to the role
            $dboSchema = New-Object Microsoft.SqlServer.Management.Smo.Schema($d,'dbo')
            $perms = New-Object Microsoft.SqlServer.Management.Smo.ObjectPermissionSet
            $perms.Insert = $true
            $perms.Update = $true
            $dboSchema.Grant($perms,$roleName)

            # Now add the user as a member of the role
            $d.Roles[$roleName].AddMember($loginToUse)
        } #end check database is online
    } #end foreach $dbList
} #end function


Discover more about our expertise in SQL Server.

Categories: DBA Blogs

Oracle Priority Support Infogram for 08-OCT-2015

Oracle Infogram - Thu, 2015-10-08 13:46

Oracle OpenWorld
The session schedules are rolling in. Here are a few:
Online Calendar for Data Warehousing and Big Data Sessions at #oow15 now available
Featuring Big Data Preparation Cloud Service and other Cloud Data Integration Sessions at Oracle OpenWorld 2015
Catch Up on Oracle GoldenGate at Oracle OpenWorld
Plan Your MySQL Central @ OpenWorld Sessions with My Schedule
OPN Central @ OpenWorld: What’s in it for you?
Last week on AskTom, from All Things SQL.
Database Insider - October 2015 issue now available, from Exadata Partner Community – EMEA.
New in Oracle 12c: _optimizer_gather_stats_on_load, from Upgrade your Database - NOW!
What Is SQL Injection and How to Stop It, from All Things SQL. It’s been a while since we ran a posting on SQL injection. It’s an easy trap to fall into in coding, so this may be a good time to review your apps and make sure you aren’t vulnerable.
GlassFish 4.1.1 is now available!, from The Aquarium.
Patch Set Update: Oracle Smart View
Patch Set Update: Hyperion Planning
EPM Patch Set Updates - September 2015, from Business Analytics - Proactive Support.
Oracle Utilities
Oracle Utilities Customer Care and Billing available, from The Shorten Spot (@theshortenspot).
Are you on There is a mandatory patch that requires application., from the Oracle Demantra blog.
User Defined Field Fact History, from the Oracle Primavera Analytics Blog.
A bit of product evangelism combined with some prophecy and analysis on ZFS storage: This Is Our Time, from The Wonders of ZFS Storage.
From the Oracle E-Business Suite Support blog:
OpenWorld sessions for Focus on Oracle EBS Support and Services!
Which Transactions Are Copied Via Fixed Assets Mass Copy?
EBS General Ledger - Accounting Hub Reporting Cloud Service
From the Oracle E-Business Suite Technology blog:
E-Business Suite Technology Sessions at OpenWorld 2015

WebCenter Portal Certified with E-Business Suite 12.1

PeopleSoft Streams from Oracle University

Jim Marion - Thu, 2015-10-08 13:45

In February of this year, Oracle University launched the PeopleSoft Learning Stream. Oracle's Learning Streams are short, educational vignettes. I was given the privilege of recording 6 streams:

  • Using JavaScript with Pagelet Wizard is a 21 minute video showing you how to use Pagelet Wizard to convert a PeopleSoft query into an interactive D3 chart, a navigation collection into a carousel, a navigation collection into an accordion, and RequireJS for JavaScript dependency management.
  • REST Query Access Service is a 15 minute session showing you how to craft a Query Access Service REST URL.
  • Working with JSON in PeopleSoft Document Technology is a 23 minute video demonstrating how to use the PeopleCode Document, Compound, and Collection objects to read and write JSON.
  • Basic Java API with PeopleCode is a 26 minute session showing you how to use the delivered Java API with PeopleCode. This session covers constructors, instance methods, properties, and static method invocation. Java objects demonstrated include String, Hashtable, Regular Expression Pattern and Matcher, arrays, and String.format.
  • Intermediate Java API with PeopleCode is a 38 minute video that shows you how to configure JDeveloper to write Java for the PeopleSoft Application and Process Scheduler servers and provides some examples of writing and deploying Java to a PeopleSoft application server. Note: in this session you get to watch me attempt to troubleshoot an App Engine ABEND.
  • Advanced Java API with PeopleCode is a 26 minute recording showing you how to use Java Reflection to remove PeopleCode ambiguity as well as how to use JavaScript to avoid reflection.

You can access all of my streams here. From this page you can preview the first 2 minutes of each video or subscribe for unlimited access to all of the videos in the Oracle PeopleSoft Learning Stream.

Issues with Plan Cache Reuse & Row Goal Optimization

Pythian Group - Thu, 2015-10-08 13:11


I am presenting here on behalf of my colleague Fabiano Amorim (he is busy resolving other exciting performance issues… :-D ).

Fabiano had an interesting case with one of our customers, one that is very common in SQL Server.

The case is about a performance issue caused by two optimizer decisions not working well together:


Problem Description

Let’s review the following query:

select top 1 col_date from tab1
where col1 = 10
and col2 = 1
and col3 = 1
order by col_date asc


Table tab1 has two indexes:

  1. ix1 (col1, col_date, col2) include(col3)
  2. ix2 (col1, col2, col3) include(col_date)


The Query optimizer (QO) has two query plan options:

  1. select -> top -> filter -> index seek (ix1): Read the ordered index ix1, seeking the b-tree on “col1 = 10” and applying the residual predicate “col2 = 1 and col3 = 1”. After reading just 1 row (TOP 1) the execution is finished, since the index is ordered by col1, col_date: the first col_date returned is already the TOP 1 ASC according to the index order.
  2. select -> top N sort -> index seek (ix2): Read the covering index ix2 b-tree (notice it has all the needed columns), seeking on “col1 = 10 and col2 = 1 and col3 = 1”, get col_date from the index leaf level (included column), use the “top N sort” algorithm to sort and keep only the TOP 1 row, and finish execution.

The problem is that if the QO chooses the first option, it is only good for highly selective predicates.
For instance, let's suppose that “col1 = 10” returns 5 rows; remember that index ix1 is ordered by col1, col_date, col2:


col1 | col2 | col3 | col_date
10   | 4    | 4    | 2015-12-01
10   | 3    | 3    | 2015-12-02
10   | 1    | 1    | 2015-12-03
10   | 5    | 5    | 2015-12-04
10   | 2    | 2    | 2015-12-05


After seeking the index, SQL will need to apply the residual predicate (“col2 = 1 and col3 = 1”) until it reaches the “row goal”: the TOP iterator is asking for just one row. In this case the third row matches the predicate, and SQL Server will return that first matching row.

So, in this case it has to read only 3 rows. So far so good…

Now, let’s suppose SQL created that plan and is now going to reuse it for a new value in the col1 filter:


select top 1 col_date from tab1
where col1 = 99
and col2 = 1
and col3 = 1
order by col_date asc


What if the seek (“col1 = 99”) returns 2 million rows? Now this plan is not so good, since it will need to apply the residual predicate to many rows before it finds a match:


col1 | col2 | col3 | col_date
99   | 2    | 2    | 2015-12-01
99   | 2    | 2    | 2015-12-02
…after a couple of million rows…
99   | 1    | 1    | 2015-12-03
99   | 2    | 2    | 2015-12-04
99   | 2    | 2    | 2015-12-05


In this case, using the second option is better: just seek the b-tree for all values (col1 = 99 and col2 = 1 and col3 = 1), which returns 1 row… TOP N SORT will do almost nothing and execution will finish quickly.

Here is the problem: most of the time, SQL knows whether to use option 1 or option 2 based on the parameter values. But if it is reusing a plan from cache, the optimization path may already be set improperly, resulting in the well-known issue called “parameter sniffing” (reusing a plan that is wrong for the specific set of rows)… That means the row goal optimization should not be used if there is a covering index.

Unfortunately, by default the QO “thinks” this is cheaper than “seek + top N sort”… Of course, it all depends on the distribution of the data… So in a nutshell, the QO sometimes chooses the row goal optimization where it should not be used, so we should pay extra attention to those kinds of plans…


Possible Solutions

There are many alternatives to fix it.

Some examples:

  1. Force the index (index=ix2)
  2. Option(recompile)
  3. Drop the index ix1, and define ix2 as unique (this tells the QO that only 1 row will be returned)

Each one of the above has advantages and disadvantages.

We also need to ensure that statistics are up to date!


Additional Resources


Discover our expertise in SQL Server. 

Categories: DBA Blogs

Presidents of USA and their Birth Signs – Sankey Visualization

Nilesh Jethwa - Thu, 2015-10-08 13:00

In this analysis, we will visualize the relation between the Age at Presidency, State of Birth and birth sign.

Read more at:

Simplifying Docker Interactions with BASH Aliases

Pythian Group - Thu, 2015-10-08 12:21

Docker has been consuming my life in the last few weeks. I have half a dozen projects in progress that use containers in some fashion, including my Visualizing MySQL’s Performance Schema project.

Since I prefer to work from a Mac laptop, I have to utilize an intermediary Linux Virtual Machine (VM) which runs the Docker daemon. Luckily, Docker Machine makes this a very simple process.

However, interacting both with Docker and Docker Machine does introduce some additional commands that I would rather simplify for the repeatable use-cases I’ve come across. With BASH aliases, this is not a problem.

Is My Docker Environment Setup?

When working with Docker through Docker Machine, you first have to set up your environment with various DOCKER_* variables, such as these:

$ env | grep DOCKER_

The first alias is an easy way to check that the Docker environment is setup.

alias de="env | grep DOCKER_"

Now, all I have to type is de, and I get the Docker environment output:

$ de
Setting up My Docker Environment

But how do you set up the environment with Docker Machine? The docker-machine command provides the details:

$ docker-machine env default
export DOCKER_HOST="tcp://"
export DOCKER_CERT_PATH="/Users/dtest/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"
# Run this command to configure your shell:
# eval "$(docker-machine env default)"

Notice that the comments indicate you have to run the command through eval to get the terminal set up correctly. I don’t want to type that out each time I open a new terminal.
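To see why eval matters, here is a minimal sketch. fake_machine_env below is a hypothetical stand-in for `docker-machine env default` that just prints export lines (the host address is made up), so the snippet runs even without Docker installed:

```shell
#!/bin/bash
# Hypothetical stand-in for `docker-machine env default`:
# it only prints export statements, it does not execute them.
fake_machine_env() {
  echo 'export DOCKER_HOST="tcp://192.168.99.100:2376"'
  echo 'export DOCKER_MACHINE_NAME="default"'
}

# Without eval, the exports are just text on stdout; eval runs them
# in the current shell so the variables actually get set.
eval "$(fake_machine_env)"
echo "$DOCKER_MACHINE_NAME"   # prints "default"
```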

The docker-machine command requires the name of the VM to set up as an argument, so I’ve created a function to accept the argument:

function dm-env() {
  eval $(docker-machine env $1)
}
alias dm-list="docker-machine ls"

Each time I open a terminal I can set up the environment:

$ dm-list
NAME      ACTIVE   DRIVER       STATE     URL                         SWARM
default            virtualbox   Running   tcp://
$ dm-env default
$ env | grep DOCKER

If you only use one Docker VM for local development, you can hardcode its name so the command runs automatically and sets up the Docker environment whenever a new terminal is created.
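For example, a guarded line in ~/.bash_profile could do that automatically. This is a sketch assuming a VM named "default", with a check so terminals on machines without docker-machine still open cleanly:

```shell
# Auto-configure Docker for the single VM "default" on shell startup.
# Guarded so this line is a no-op where docker-machine is not installed.
if command -v docker-machine >/dev/null 2>&1; then
  eval "$(docker-machine env default)"
fi
```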

Cleaning Out Docker Images

The last helpful alias I have comes from building and re-building containers that have left old images on my VM.

The docker-clean command cleans up all dangling images:

function docker-clean() {
  docker rmi -f $(docker images -q -a -f dangling=true)
}

And running the docker-clean command yields:

$ docker-clean
Deleted: 8355d13a293c722895df6326c8f7fdba168f6e27a960c57ea724e61ee4c93bfb
Deleted: fe6f71c15c7ed34903609480552cb4ec2e5d563cbf843760ade2b5f58289c933
Deleted: 6a1a4c39ea78961a9075fa19c6fdbc6ccae57dc92038e5db55561ab1a36e03ef
Deleted: 86428bbc2be17a9f71893245fe868f41581b5eae5c9498947083178d15403f09
Deleted: f20f186342ccd671153315f893271a4089614e8865291b28b0cd755852f0689d
Deleted: 5f55f0770bb514e665490f076032fe65b1e10e61e15d05616d7690279c435452
Deleted: a6936e6ce9b2cd5c183133a1c1abf8eea343d00f81e2b089c7ca21b68362e51e
Deleted: 34f51cd292120f522b932a086efa0dd75e497a66067024c6dcc128806c9a9967
Deleted: d8ea74b05971788f660a4f25b83b7da974bba90e6f523afdbc460de33a2fc1a8
Deleted: 153d44b1514739c9f1fbc6263d47c03e5e2a09bbfad78026755cd948005072fb
Deleted: 06c8e29ff0f54a393c52f9c32b8c671215e6ae64e7dcf67a1cfd4642e9dabb1e
Deleted: 76d06da4e00ec547b7d1c041d2b49d1cf8986fd899f25cee208139e76503555e
Deleted: 2ee318eee874b3622701be517224378d8230495cc6537443ff1e773202277302
Deleted: 32ce55598679c6cb24a50d281b9bfd61265d44d3170e49633c1d9348e2a49c48
Deleted: df90e6d34212292ca3f8895968855ae63140a8651a281615366ec9bff8733091
Deleted: 782ea5befe69f5d08d47e9353d0cd6c8ef5a40d03acf28496f81041c15c8a033
Deleted: c99cb8938e33b5cc0838620846af29f871dd2a2da3a05dedc13353dac516e7fe
Deleted: bde69fa0774949c77af72860020b85d77dcbdb6f85f971b8b5bf6636b81c8022
Deleted: cef795306a6f4358f60750568fb68ae63434b5c01457265fd5eb808963dfa98e
Deleted: ae426d330b8fa8da5245375342b6e1df89d8ce5e1dcf52a2094c236bab29f3a2
Deleted: d003bcaf0b1a2e24f214adcc4bb889b467c2ecf23a1ded223484b7c29f23d97c
Deleted: a586300c6e8c9232cacd3494884cd8fbcf543dc09b14bbe497094ff84435e54e
Deleted: e9fd77d1c325a0668dfbb114d7b5287f79ce4f921b4ff576f7d8ef7e29abbeb8
Deleted: bbe8e3da06412fc7d0c455d1acdbdfb800566190a3449cf0f51a2984f45dc93a
Deleted: 5a9aa4f34c01f5c2cc9c419a42c95de7a278216afb1bb5087a108ec9ab8217b6
Deleted: 0ee64687f4457214527d4b910cdcc2a719cf5558be4b36057dcf050b20bbed06
Deleted: 72b02495ad97da2b3ae77927917a741cac57450def8c01bd597fa8d62c64031b
Deleted: 6571797e10bc57ff8ebe186f208292503639ea234e6b384927d719c1e82372ed
Deleted: 47e0d9b1296522a545e6f32ddad6501f7c035fe52c82f956054bb900d1fc42e8
Deleted: 54090b45c638c49dce1ca9de8737ac3288b16a888e47b7e7ec8f9c279eead6be
Deleted: 163265e05a9c47ac7ef2c1e56b6836fad654ff5aad9611c5e0663f0df33c2386
Deleted: 5293c1079ff47e42d06eadd2d3533f22547e1c7434bd495ef68fa7d68bfd6aab
Deleted: 48815da633b891d5c7c49f3ee60d73ed553636ec5034f7a6bf1c169406cb1088
Deleted: c2ddaec48bef8f29e877f8b456a510b303518c06abf1670538e611261dc1204f
Deleted: 63a5ba3db43aa1b163c1a783c4d059a39b75d380aac60d298417ce4e2d3d0910
Deleted: 19261a084b950bb9dbc359a4ab974e817b2108b296463b158a9f3be0c18af0a0
Deleted: 69ff91a0593fbc77f41b91ad1e5124456ccd940f80fb9cccd279aa2b7ed410d1
Deleted: 113c5c605ea68e142bc5121848878d13dcc1e86c041cc161f4a63f29b6258af8
Deleted: 70b97f6a626ab4bdb1ebe3a14529ec6ed3a7f8e328d87af174fc93374ff3a152
Deleted: 9d0cd42d363a5d0dd527c38495fc104b0277fc90c84c92c0f5a0b16dc9cb05a2
Deleted: c9251e844223e3ecdf72ed1e87f9efa224cc5e02f79210e8978cee4cc34d39c8
Deleted: bf10746e4eeee9122c06d6ae3138b2e829f03cbbbc15f141b0d8b33e5dfadb43
Deleted: 08afd2679b95d531f621d07686ec60af9e4871bc9228d0b7e9d6cdf9184ab809
Deleted: bdac229bdd0aa6d8cf529b846c5d20f01a21ddbac4f8b9056a005d7111aabbe2
Deleted: 33b3fc61971e7af73f7150ec1fc67fce944a5b130cbd4b21fcd95e7a18bcb367
Deleted: ed505b4e13732aeead1a8e9e2c5d61d11c454579627e85caa78ae99949bd709c
Deleted: 2180127d131a23e1f27cf03b8d82ef58641b67bfee6cf2c6ca52a9eaee2ee1c0
Deleted: d27499cfa54e8c74669c226c0eded86785547d398dc2da5f4a02159a71f6d3ab
Deleted: 2af32da19834adc010306180eab5a59bdebd0c2ba21bf18c505cd0c5942bf192
Deleted: f4176250cc803325de34de8e08c354a2e4c6c4de181a7acda8123812bf1576de
Deleted: e914e7f057c17387f56adeca5e25ac7aa1e3760be4669688359a3e614162de1b
Deleted: 5ffd3ce218cff85582d6b586374733fe5ef31eb6b638667a9cd6796e39a5e34a
Deleted: e5552fc54204a44110a5b3f025acc8fee73402b6f1b25ac5bfc30825d194a1fc
Deleted: 25140906c74a79d4d3c620ad3fb19ded7b04103afb084317255dda9fcdc8f50a
Deleted: 13fb12aef12dd930817f7852db6a272b37be68163c8b572191cfac1e31cc8ca8
Deleted: 81826c55deedd1c2e97bc8ff34e9246f86405302ce96d522f822a2f90ce6b944
Deleted: 1d3599e5ca3278520b7f49576cdb0c2bbb787276c9c2171a0bf61fd90881fd33
Deleted: a4bfa20df9e3e24dc7d6aa20f0273dc931be7fa928cc67477dbe7057e1459b76
Deleted: e103df06ce590225825ce6bde9df96129b2e23254a9f530d65a5b7c63a87c03d
$ docker images
REPOSITORY       TAG                 IMAGE ID            CREATED             VIRTUAL SIZE
dtest/docker-dev latest              ccaa1413a6d4        2 hours ago         392.4 MB

I put all of these aliases and functions together in my ~/.bash_profile* script, which is executed anytime I open a terminal window:

# Docker
function dm-env() {
  eval $(docker-machine env $1)
}

function docker-clean() {
  docker rmi -f $(docker images -q -a -f dangling=true)
}

alias de="env | grep DOCKER_"
alias dm-list="docker-machine ls"

*Note: Instead of putting these aliases and functions in ~/.bash_profile, other distributions would look for them in ~/.bashrc or ~/.bash_aliases to ensure they are available for all types of interactive shells.
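One common pattern for that (a sketch of the stock Ubuntu ~/.bashrc convention, not something from this post) is to keep the aliases in ~/.bash_aliases and source that file from ~/.bashrc:

```shell
# In ~/.bashrc: pull in alias definitions from a separate file, if present.
if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi
```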

If you have any other commands to simplify Docker interactions, please share them in the comments!


Discover more about our expertise with DevOps.

Categories: DBA Blogs

OHS Installation failed with error OUI-10136 Oracle Home Already Exists

Online Apps DBA - Thu, 2015-10-08 12:10

Oracle HTTP Server (OHS) is part of Oracle Web Tier (the other components in Web Tier are Oracle Traffic Director (OTD) and Oracle WebCache). OHS is a recommended component as per the Oracle Reference Architecture (some of us call it the Oracle Enterprise Deployment).

OHS is mandatory if you wish to integrate any Oracle Fusion Middleware product (like WebCenter Portal, SOA, OBIEE, etc) with Oracle Access Manager (OAM) for Single Sign-On (SSO).

This post covers an Oracle HTTP Server (OHS) installation issue from our Oracle Fusion Middleware (FMW) Training (next batch starts on 17th Oct, 2015, and we cover WebLogic, SOA, OHS, OBIEE, WebCenter, SSL, Patching, and Cloning), where one of the trainees encountered an issue while installing OHS 11g. (As part of the hands-on, we provide step-by-step instructions to install Fusion Middleware components like OHS, WebLogic, SOA, and OBIEE on our server for practice.)



OUI-10136 : An Oracle Home with name OH562635761 already exists at location /u01/oracle/fmw_ohs/oracle_common. Please specify another  location for Oracle Home.


The error message is self-explanatory: an Oracle Home already exists with the name the trainee was using for the OHS installation. However, when he checked, there was no folder named fmw_ohs under the /u01/oracle/ directory.

Root Cause:

OHS had been installed under the same location (/u01/oracle) earlier, but the installation files were later removed. The old installation location was still registered in oraInventory, which caused this issue when installing OHS again under the same directory.



Fix:

Delete the entries for the old OHS installation path from the inventory.xml file present under the $ORAINV_LOC/oraInventory/ContentsXML/ directory.

Note: You can check the oraInventory location in /etc/oraInst.loc on Linux.

In our case, we deleted the entries below from the inventory.xml file under "/u01/app/oraInventory/ContentsXML/":

<HOME NAME="OH562635761" LOC="/u01/oracle/fmw_ohs/oracle_common" TYPE="0" IDX="2">
<REFHOME LOC="/u01/oracle/fmw_ohs/Oracle_WT1"/>
<HOME NAME="OH175026832" LOC="/u01/oracle/fmw_ohs/Oracle_WT1" TYPE="0" IDX="3">
<REFHOME LOC="/u01/oracle/fmw_ohs/oracle_common"/>
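If you prefer scripting over hand-editing, the same cleanup can be sketched in a few lines of Python. Note that this is only an illustration, not an Oracle-supported tool: the `remove_stale_homes` helper and the assumption that stale entries are identified by the prefix of their LOC attribute are ours, and you should always back up inventory.xml before modifying it:

```python
import xml.etree.ElementTree as ET

def remove_stale_homes(inventory_path, stale_prefix):
    """Remove <HOME> entries (and any <REFHOME> references) whose LOC
    attribute starts with the deleted installation path."""
    tree = ET.parse(inventory_path)
    root = tree.getroot()
    # Snapshot the parents first so removals don't disturb iteration.
    for parent in list(root.iter()):
        for child in list(parent):
            if child.tag in ("HOME", "REFHOME") and \
                    child.get("LOC", "").startswith(stale_prefix):
                parent.remove(child)
    tree.write(inventory_path, xml_declaration=True, encoding="UTF-8")
```

In our case the call would be remove_stale_homes("/u01/app/oraInventory/ContentsXML/inventory.xml", "/u01/oracle/fmw_ohs"). Run it on a copy first and diff the result against the backup before replacing the real inventory.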


After deleting the above entries, the OHS installation completed successfully.


If you want to learn about more issues like this, or wish to discuss challenges you are hitting in Oracle Fusion Middleware implementations, register for our Oracle Fusion Middleware Training.

We are so confident in the quality and value of our training that we provide a 100% money-back guarantee: in the unlikely case that you are not happy after two sessions, just drop us a mail before the third session and we'll refund your money in full.

Did you subscribe to our YouTube Channel (340 already subscribed)?

You can click the button below to download our free Interview Questions Guide on Oracle Fusion Middleware.

Click Here to download Oracle Fusion Middleware Interview Questions for FREE

Stay tuned for more updates!

The post OHS Installation failed with error OUI-10136 Oracle Home Already Exists appeared first on Oracle : Design, Implement & Maintain.

Categories: APPS Blogs

Amazon Quick Sight – BI on Cloud?

Dylan's BI Notes - Thu, 2015-10-08 08:08
In my post Data Warehouses on Cloud – Amazon Redshift, I mentioned that what would be really useful is providing BI on Cloud, not just Data Warehouse on Cloud. I felt that BICS makes more sense compared to Amazon Redshift. I discussed this with a couple of people last night at a meetup.  Some of them […]
Categories: BI & Warehousing

PDB Logging Clause… Again…

Tim Hall - Thu, 2015-10-08 03:29

About 14 months ago I spotted a problem with the PDB Logging Clause. I opened an SR and several months later I got a patch, which unfortunately didn’t fix the issue, just altered the symptom somewhat. I wrote about that patch here.

Yesterday I got a new patch, which actually does fix the problem, so now the PDB Logging Clause works as documented!

I’ve updated the PDB Logging Clause article to reflect the change.

I realise it’s a small issue, with an easy workaround, but 14 months seems a bit excessive. :)



PDB Logging Clause… Again… was first posted on October 8, 2015 at 10:29 am.
©2012 "The ORACLE-BASE Blog". Use of this feed is for personal non-commercial use only. If you are not reading this article in your feed reader, then the site is guilty of copyright infringement.

Oracle Utilities Customer Care and Billing available

Anthony Shorten - Wed, 2015-10-07 19:40

Oracle Utilities Customer Care and Billing is now available. This release includes the newest version of Oracle Utilities Application Framework (V4.

This version of the Oracle Utilities Application Framework has the following features:

  • New Login Screen - A simpler login screen has been introduced, in line with Oracle's Cloud offerings.
  • Session Timeout - In past releases when a session expired due to inactivity it would throw the user back to the login screen. This release now includes a popup message informing the user that the session is timed out prior to sending the user back to the login screen.
  • Required Fields indicator - In line with the new interface implementation introduced in, required fields can be indicated with a required fields indicator: *. This only applies to newer UI hint based screens. For backward compatibility this feature can be disabled.
  • Menu Behavior - Menu behavior has been tweaked to allow greater flexibility and support for a wider range of devices. Menus can be locked into position to cater for user behavior.
  • Toolbar History enhancements - In Oracle Utilities Application Framework, identifiers were recorded when using individual functions to enhance navigation. To complement this, these identifiers will be displayed in the History on the toolbar.
  • Component Installation - In the past, installation required additional effort to install individual components such as channels or tiers in an installation event. This enhancement greatly simplifies this by assigning installation roles to individual installations. Upon selecting appropriate roles, only those parts of the installation are exposed and configured for that installation. It is also possible to change roles after installation for maximum flexibility. This enhancement allows disparate installations to be linked into a virtual environment and easily managed for patching and management from Oracle Enterprise Manager.
  • New Help Engine - This release sees the product use the Oracle Help Engine for Web, which provides greater usability, flexibility and standardization for our help. This enhancement is only available to Oracle WebLogic customers. Customers on IBM WebSphere will continue to use the legacy help engine. This also means that Framework Help is now included in Product Help.
  • Enhanced OIM Integration - The Business object used for Oracle Identity Management (OIM) has been enhanced following feedback from the Oracle Cloud implementations and customers using OIM. The interface now defaults more attributes and uses the user templating functionality to reduce the amount of business rules within OIM.
  • User Security Changes - To support greater levels of security, the user object now has additional security levels to control different aspects of security related to its maintenance. For example, it is now possible to allow specific users to administrate user group membership.
  • Map XML Enhancements - A simpler approach to mapping fields has been implemented to speed up mapping XML data types including a new wizard to aid schema designers.
  • XQuery engine upgraded - The latest Oracle XQuery engine has been adopted to provide new functions and reduce memory footprint needed for scripting.
  • Batch Parameter Security - It is now possible to encrypt data used on individual sensitive batch parameters.
  • System Health Check - A new system-wide health check API has been introduced that can be extended to verify components within the architecture to provide status information. The initial release will cover architecture and specific level of service checks. A simple user interface and web service have been included for this facility. This will be incorporated into a future Application Management Pack for Oracle Utilities release.
  • Additional Sort keys on Manual To Dos - Manual To Do entries now populate the standard sort keys even if custom sort keys are populated.
  • Extendable Lookup Enhancements - We have added a few new features for Extendable lookups to make them easier to use. This includes characteristic support, a new CLOB field, overriding support, action support and extended validation support.
  • Support Manual Transition of Sync Requests - It is now possible to manually transition a Sync Request.
  • Currency Symbol Support - Number elements may define a currency reference to automatically render the currency symbol.
  • Extended Currency Support - The base classes within the framework have been extended to support a wider range of world currencies.
  • Page Based Web Services - Inbound Web Services has been extended to now support legacy page based services. This will reuse the definitions in the page services to expose them as Inbound Web Services without the need for further configuration. This will aid in moving from XAI to IWS for legacy customers.
  • Message Driven Bean in IWS - The MDB has now been moved to the IWS implementation to implement an Integration Server architecture. Existing MDB implementations will be supported in the short term but will need to move to this new IWS implementation long term.

More articles will be published over the next few weeks outlining more information about most of these enhancements. Refer to the product documentation and release notes for more details of each of these enhancements.

Notes on packaged applications (including SaaS)

DBMS2 - Wed, 2015-10-07 18:27

1. The rise of SAP (and later Siebel Systems) was greatly helped by Andersen Consulting, even before it was split off from the accounting firm and renamed as Accenture. My main contact in that group was Rob Kelley, but it’s possible that Brian Sommer was even more central to the industry-watching part of the operation. Brian is still around, and he just leveled a blast at the ERP* industry, which I encourage you to read. I agree with most of it.

*Enterprise Resource Planning

Brian’s argument, as I interpret it, boils down mainly to two points:

  • Big ERP companies selling big ERP systems are pathetically slow at adding new functionality. He’s right. My favorite example is the multi-decade slog to integrate useful analytics into operational apps.
  • The world of “Big Data” is fundamentally antithetical to the design of current-generation ERP systems. I think he’s right in that as well.

I’d add that SaaS (Software As A Service)/on-premises tensions aren’t helping incumbent vendors either.

But no article addresses all the subjects it ideally should, and I’d like to call out two omissions. First, what Brian said is in many cases applicable just to large and/or internet-first companies. Plenty of smaller, more traditional businesses could get by just fine with no more functionality than is in “Big ERP” today, if we stipulate that it should be:

  • Delivered via SaaS.
  • Much easier to adopt and use.

Second, even within the huge enterprise/huge app vendor world, it’s not entirely clear how integrated ERP supposedly is or isn’t with CRM (Customer Relationship Management). And a lot of what Brian talks about fits pretty cleanly into the CRM bucket.

2. In any case, there are many application areas that — again assuming that we’re in the large enterprise or large internet company world — fit well neither with classical ERP nor with its CRM sibling. For starters, investigative analytics doesn’t fit well into packaged application suites, for a myriad of reasons, the most basic of which are:

  • The whole point of investigative analytics is to discover things that are new. Therefore, business processes are inherently unpredictable.
  • So are data inputs.

If somebody does claim to be selling an app in investigative analytics, it is usually really an analytic application subsystem or else something very disconnected from other apps. Indeed, in almost all cases it’s both.

3. When it comes to customer-facing websites, I stand by my arguments three years ago in the post just linked above, which boil down to:

  • What I just said above about investigative analytics, plus the observation that …
  • … websites have a strong creative aspect that fits badly with soup-to-nuts packaged applications.

Also, complex websites are likely to rely on dynamic schemas, and packaged apps have trouble adapting to those.

4. This is actually an example of a more general point — packaged or SaaS apps generally assume rather fixed schemas. (The weasel word “rather” is included to allow for customization-through-configuration, but I think the overall point holds.) Indeed, database design is commonly the essence of packaged app technology.

5. However, those schemas do not have to be relational. It would be inaccurate to say that packaged apps always assume tabular data, because of examples such as:

  • SAP has built on top of quasi-objects for a long time, although the underpinnings are technically relational.
  • There are some cases of building entirely on an object-oriented or hierarchical data model, especially in health care.
  • Business has some inherent hierarchies that get reflected in data structures, e.g. in bills of materials or organization charts.

But even non-tabular data structures are, in the minds of app developers, usually assumed to have fixed schemas.

Related links

Categories: Other

My Delphix presentation at Oaktable World

Bobby Durrett's DBA Blog - Wed, 2015-10-07 17:52

It is official.  I will be doing my Delphix presentation at Oaktable World during the Oracle OpenWorld conference at the end of this month.  My talk is at 9 am on Tuesday, October 27.

I will describe our journey as a new Delphix customer with its ups and downs. I tried to have the spirit of a user group talk where you get a real person’s experience that you might not get from a more sales oriented vendor presentation.

Kyle Hailey, an Oaktable member and Delphix employee, will host my talk.  I have been very impressed by Kyle’s technical knowledge, and he will be with me to answer questions about Delphix that I could not answer.  I think it will be a good combination of my real-world user experience and his depth of technical background in Delphix and Oracle performance tuning.

If you are going to OpenWorld and if you want to know more about Delphix come check it out.  Also, feel free to email me or post comments here if you have any questions about what the talk will cover.


Categories: DBA Blogs

Response to Robert Talbert: Pedagogical change is difficult, many need support

Michael Feldstein - Wed, 2015-10-07 12:51

By Phil HillMore Posts (367)

On Monday Robert Talbert, associate professor at Grand Valley State University and author of the Casting Out Nines blog, wrote a provocative and important post titled “Active learning as an ethical issue”. Robert noted:

The recent Proceedings of the National Academy of Sciences study stands out among these recent studies. It is a meta-study of 225 prior studies on active learning, and the results are bracing: students in these studies who were in classes focused on lecture and direct instruction in the classroom were 55% more likely to fail their courses than their counterparts in active learning focused classes, and scored almost half a standard deviation lower than their active learning counterparts on exams.

This sentence from the PNAS study stopped me in my tracks when I first read it:

“If the experiments analyzed here had been conducted as randomized controlled trials of medical interventions, they may have been stopped for benefit—meaning that enrolling patients in the control condition might be discontinued because the treatment being tested was clearly more beneficial.”

Robert’s central point is that active learning should be thought of as an ethical issue, where it could be considered unethical to withhold treatment. He then asks why faculty might withhold active learning and listed four reasons: self-preservation, laziness, a weird and irrational superiority complex, and legitimate external forces (such as overly controlling school structure).

The argument is an interesting and compelling one based on the study, and it is worth reading the whole article and his follow-up post. I wish we treated teaching and learning more often as an ethical issue, but I would add one additional reason that the active learning treatment is not more prevalent. This one comes from our discussions with faculty and support staff as part of our e-Literate TV series on personalized learning, and Michael and I summarized the point in the introduction episode. In a nutshell, changing to active learning designs (described as personalized learning in the series, but this term overlaps with active learning in the context of this discussion) often or usually comes along with a fundamental change in the role of the faculty and TAs involved. This changing role is profound and not easy, especially if faculty try to make changes on their own without peer or staff support.

Michael Feldstein: And going along with that was a willingness for faculty and for students to really ask some hard questions about the roles that they needed to take in the classroom, right? This is no longer, “I go up as a faculty member, and I lecture. I tell you what you know. And you, as a student, dutifully write it down and regurgitate it on the test.”

Faculty have to be comfortable letting go of a certain amount of control. We heard that over and over again. And students need to be comfortable and confident taking a certain amount of control over their education.

Phil Hill: Just taking a step back, I can’t emphasize enough what we’re talking about, the fact that this personalized learning, where it’s done in an appropriate manner, absolutely doesn’t replace the faculty. So, we just need to move past a lot of that part of the discussion.

But that change in role is significant. So, when you say, “You’ve got to give up some control,” we’re not just talking a slight adjustment in your teaching; you’re talking about something deeply held, internal beliefs. And part of the implication there is not just that it’s significantly different, but it also means it takes a lot of work and a lot of time to make that transition in a role.

And then, to pick up on your other point, students are coming in, and they need to be much more part of an active learning experience. Well, they’ve gone through, likely, the K–12 system, where they’ve almost been taught to be passive learners, or that’s sort of their expectations.

But now they’re coming in, and they’re being asked to do a lot of active work—to really stay up to speed, not put off work and cram right before the exams, but come in prepared to the classes. And a lot of times, they’re teaching themselves. So, those two change in roles are very significant, and they take time for people to deal with.

Michael also noted this challenge of knowing how to change in his recent post on lectures.

Following the IHE piece on Essex County College’s struggles to get good outcomes from their personalized learning program in developmental math, and following my blog post on the topic, Phil and I had an interesting exchange about the topic in email with ECC’s Vice President for Planning, Research, and Assessment Doug Walercz. With his permission, I’d like to share some of his observations with you. One of the big takeaways from the conversation, for me, is that our cultural notion of the pedagogical work that happens in a good lecture is pretty impoverished relative to the reality. We don’t have a clear understanding of all the things that a good lecture accomplishes, and therefore we often lose valuable elements of student support when we try to replace it. This has pretty serious implications for MOOCs, flipped classrooms, personalized learning, and a wide array of pedagogical approaches that replace a traditional in-person lecture with something else.

Michael then shared Doug’s email describing his viewpoints on expert blindness and the nature of combining non-cognitive aspects of teaching with content delivery. Michael concludes:

We don’t fully understand what we are doing well now. Therefore, when we attempt to deconstruct it and then reconstruct it in a different environment, we don’t really know what we will miss or how we will need to retrain our instructors so that we won’t miss it. That’s why it is so important to undertake these sorts of experiments thoughtfully, self-critically, and iteratively.

The point is that changing to any new pedagogy – active learning, adaptive, personalized, etc. – changes the role of faculty and the methods of providing support to students in significant ways. I would add this difficulty with understanding and implementing change to Robert’s list of reasons why the active learning treatment has been withheld. Using Robert’s argument that this is an ethical issue, this reason should not be one to prevent such a change, but it is a reason why many faculty have not yet changed, and a reason that additional support for faculty might be needed in order to allow more extensive adoption of active learning.

Some faculty will be able to make these changes to active learning on their own – think of them as autodidacts in learning about learning – but if you want deeper changes, then we need to acknowledge that many, if not most, faculty will need support to do so.

In Robert’s follow-up post, he makes an important point about assessment and effectiveness:

But also do this: Gather formative assessment data on a regular basis and see what students are actually learning. Don’t try to base the effectiveness of your teaching on how much passion and verve you appear to bring to lectures; don’t base it on summative assessments where the data come too late for students to act on them; don’t base it on how many students talk in your discussions or how bright and bushy tailed they appear to be. Base it on data that you collect about student learning.

Then do this: Analyze your assessment data when you get it, and objectively decide whether your teaching is helping students learn. And if it isn’t, consider how you might change, and then make the change.

The post Response to Robert Talbert: Pedagogical change is difficult, many need support appeared first on e-Literate.

How to delete older emails from GMAIL

Arun Bavera - Wed, 2015-10-07 08:40




Other category:

category:social older_than:45d
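The same pattern extends to other categories and age windows. A few more examples built from Gmail's standard search operators (older_than accepts d, m, and y suffixes); type one into the search box, select all the results, and delete:

```
category:promotions older_than:30d
category:updates older_than:6m
before:2014/01/01 larger:5M
```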

Categories: Development

Why go to Oracle OpenWorld?

Duncan Davies - Wed, 2015-10-07 07:00

We’re a shade under a month away from the biggest event in the calendar for those that work in the Oracle marketplace – the Oracle OpenWorld Conference.

It runs every year in San Francisco and draws a massive 60,000 attendees from 145 countries (plus 2.1 million online attendees). That’s huge.

There are more than 2,500 sessions from ~3,600 speakers, approximately half of which are customers/partners and half are Oracle themselves. As well as the sessions there are the demo grounds and the exhibition hall, all great places for networking with people that you’ve either not met before or have only ever come across online. You get quality face-time with top developers and execs, who are normally hidden behind many levels of Oracle Support. These are the people who have designed and written the products and services that we’ll be using over the coming years, so meeting up with them is priceless.

If you register before the event, it’s $2,450 (about £1,600).

I’m lucky to have the chance to go again this year, and I know already that it’s going to have huge value for both me and Cedar. Both my colleague, Graham, and I were lucky enough to be selected to speak (his session is on Fluid, mine is on Selective Adoption – the two hottest topics in PeopleSoft right now).

Graham also produced this lively promo video:

This (above) is what we look like, it’d be great to say hello to you if you’re around. Likewise, if you’re coming to either of our sessions let us know and we’ll be sure to say hi.

As a nice bonus, we get to see Elton John and Beck at the Appreciation Event!

I’m really looking forward to seeing and hearing about the very latest from the PeopleSoft and Fusion/Taleo worlds. Look out for a Cedar event when we return where we can share everything with you.

Oracle OpenWorld, Customer Reception, Latest News, and More

WebCenter Team - Wed, 2015-10-07 06:45

The OpenWorld edition of Oracle WebCenter and Oracle Business Process Management (BPM) newsletter is now out. Take a look to get more information on the exclusive Customer Reception we are hosting in San Francisco on Monday, October 25th; the must-attend sessions for you to hear from Oracle executives, subject matter experts, customers and partners to know the latest strategy, vision and roadmaps for both our on-premises and cloud solutions; the latest news features, product collateral, industry insights, and learn more about upcoming events and activities.

If you are in the area and have not registered for Oracle OpenWorld, you can still attend our Customer Reception so do RSVP today. And don't forget to add your favorite Oracle WebCenter, Oracle BPM and Oracle Cloud Content and Collaboration solutions to your My Schedule that is now live on Oracle OpenWorld website.

Not attending Oracle OpenWorld this year? No worries, this newsletter still has plenty of information for you to know the very latest happening in Oracle WebCenter, BPM and Cloud Collaboration world. Happy reading!