Feed aggregator

EXECUTE IMMEDIATE with multiple bind variables and SELECT INTO returning a single record but multiple columns

Tom Kyte - Thu, 2017-06-01 07:46
Hello, I'm building a function that always receives 3 input values. These incoming values are used as bind variables in assembling a SQL query string that is later executed using an EXECUTE IMMEDIATE statement. The SQL query string contains a S...
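For reference, the pattern the question describes typically looks something like this (a sketch only; the table, column and variable names are invented):

declare
  v_name  varchar2(30);
  v_hired date;
begin
  -- several bind variables plus a multi-column SELECT ... INTO
  execute immediate
    'select ename, hiredate from emp where deptno = :1 and job = :2'
    into v_name, v_hired
    using 10, 'CLERK';
end;
/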
Categories: DBA Blogs

Is it possible to declare/create a UDT inside a package?

Tom Kyte - Thu, 2017-06-01 07:46
Hello everyone. I came here with one simple question. It is possible create an UDT inside of a package? For example, I create a new package CREATE OR REPLACE PACKAGE MyPackage AS TYPE myType AS OBJECT( NAME VARCHAR(30), SURNAME VARCHAR(...
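The short answer is that schema-level object types (CREATE TYPE ... AS OBJECT) cannot be declared inside a package, but a package can declare its own record and collection types, for example (illustrative names):

create or replace package my_package as
  -- records and collections may live in a package; AS OBJECT types may not
  type person_rt is record (
    name    varchar2(30),
    surname varchar2(30)
  );
  type person_tt is table of person_rt index by pls_integer;
end my_package;
/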
Categories: DBA Blogs

Bounded Context Model Without Violating RDBMS Principles

Tom Kyte - Thu, 2017-06-01 07:46
Hello Team, I work on system that is a web portal containing over a hundred different APIs that customers use to interact with our products. This system is legacy, and was developed with little to no database governance. As a result, all these API...
Categories: DBA Blogs

First Steps with Oracle Analytics Cloud

Rittman Mead Consulting - Thu, 2017-06-01 07:43
Preface

Not long ago Oracle added a new offer to their Cloud - an OBIEE in a Cloud with full access. Francesco Tisiot made an overview of it and now it's time to go a bit deeper and see how you can poke it with a sharp stick by yourself. In this blog, I'll show how to get your own OAC instance as fast and easy as possible.

Before you start

The very first step is to register a cloud account. Oracle gives a trial which allows testing of all the features. I won't show the process here as it is a more or less standard registration, but I just want to highlight a few things:

  • You will need to verify your phone number by receiving an SMS. It seems that this mechanism may be a bit overloaded, and I had to make more than one attempt: I pressed the Request code button but nothing happened; I waited and pressed it again, and again, and eventually got the code. I can't say for sure, and possibly it was just my bad luck, but if you face the same problem just keep trying (within reason: requesting a code every second won't help you).
  • Even for the trial you'll be asked for credit card details. I haven't found good diagnostics on how much has already been spent, and the documentation is not really helpful here.
Architecture

OAC instances are not self-contained and require some additional services. The absolute minimum configuration is the following:

  • Oracle Cloud Storage (OCS) - is used for backups, log files, etc.
  • Oracle Cloud Database Instance (DBC) - is used for RCU schemas.
  • Oracle Analytics Cloud Instance (OAC) - is our ultimate target.

From the Cloud services point of view, the architecture is the following. This picture doesn't show the virtual disks mounted to instances; these disks consume Cloud Storage quota, but they aren't created separately as services.

Architecture

We need at least one Oracle Database Cloud instance to store RCU schemas. This database may or may not have a separate Cloud Storage area for backups. Every OAC instance requires a Cloud Storage area for logs. Multiple OAC instances may share one Cloud Storage area, but I can't see any advantage of this approach over a separate area for every instance.

Create Resources

We create these resources in the order they are listed above: start with Storage, then the DB, and OAC last. Actually, we don't have to create Cloud Storage containers separately, as they may be created automatically, but I show it here to make things clearer, without too much "it works by itself" magic.

Create Cloud Storage

The easiest part of all is the Oracle Cloud Storage container. We don't need to specify its size or lots of parameters; the only parameters are a name, a storage class (Standard/Archive) and encryption.

20-create_ocs.gif

I spent some time here trying to figure out how to reference this storage later. There is a hint saying "Use the format: <storage service>-<identity domain>/<container>. For example: mystorage1-myid999/mybackupcontainer." While identity domain and container are pretty obvious, storage service puzzled me for some time. The answer is "storage service=Storage". You can see this at the top of the page.

30-OCS_naming.png

It seems that Storage is a fixed keyword, rurittmanm is the domain name created during the registration process and demo is the actual container name. So in this example, when I need to reference my demo OCS I should write Storage-rurittmanm/demo.

Create Cloud DB

Now that we are somewhat experienced with Oracle Cloud, we may move on to a more complicated task and create a Cloud DB instance. It is harder than a Cloud Storage container, but not by much. If you have ever created an on-premise database using DBCA, a cloud DB should be a piece of cake.

At the first step, we set the name of the instance and select the most general options. These options are:

  • Service Level. Specifies how this instance will be managed. Options are:

    • Oracle Database Cloud Service: Oracle Database software pre-installed on Oracle Cloud Virtual Machine. Database instances are created for you using configuration options provided in this wizard. Additional cloud tooling is available for backup, recovery and patching.
    • Oracle Database Cloud Service - Virtual Image: Oracle Database software pre-installed on an Oracle Cloud Virtual Machine. Database instances are created by you manually or using DBCA. No additional cloud tooling is available.
  • Metering Frequency - defines how this instance is billed: by month or by hour.

  • Software Release - if the Service Level is Oracle Database Cloud Service, we may choose 11.2, 12.1 or 12.2; for Virtual Image only 11.2 and 12.1 are available. Note that even the cloud works no magic: with DB 12.2 you may expect the same problems as on-premise.

  • Software Edition - Values are:

    • Standard Edition
    • Enterprise Edition
    • Enterprise Edition - High Performance
    • Enterprise Edition - Extreme Performance
  • Database Type - defines High Availability and Disaster Recovery options:

    • Single Instance
    • Database Clustering with RAC
    • Single Instance with Data Guard Standby
    • Database Clustering with RAC and Data Guard Standby

The Database Clustering with RAC and Database Clustering with RAC and Data Guard Standby types are available only for Enterprise Edition - Extreme Performance.

40-create_obdc-1.gif

The second step is also quite intuitive. It has a lot of options but they should be pretty simple and well-known for anyone working with Oracle Database.

60-create-odbc-dc.png

The first block of parameters is about basic database configuration. Parameters like DB name (sid) or Administration Password are obvious.

Usable DataFile Storage (GB) is less obvious. Actually, in the beginning it puzzled me completely. In this example, I ask for 25 GB of space, but this doesn't mean that my instance will take 25 GB of my disk quota; in fact, this particular instance took 150 GB of disk space. Here we specify only the guaranteed user disk space, but an instance also needs space for the OS, DB software, temp, swap, and so on.

65-db-disk.png

A trial account is limited to a 500 GB quota, which means we can create at most three Oracle DB Cloud instances. Every instance uses around 125 GB of, let's say, "technical" disk space that we can't reduce. From a practical point of view, it may therefore be preferable to have one "big" instance (in terms of disk space) rather than multiple "small" ones.

  • Compute shape specifies how powerful our VM should be. Options are the following:
    • OC3 - 1.0 OCPU, 7.5 GB RAM
    • OC4 - 2.0 OCPU, 15.0 GB RAM
    • OC5 - 4.0 OCPU, 30.0 GB RAM
    • OC6 - 8.0 OCPU, 60.0 GB RAM
    • OC7 - 16.0 OCPU, 120.0 GB RAM
    • OC1m - 1.0 OCPU, 15.0 GB RAM
    • OC2m - 2.0 OCPU, 30.0 GB RAM
    • OC3m - 4.0 OCPU, 60.0 GB RAM
    • OC4m - 8.0 OCPU, 120.0 GB RAM
    • OC5m - 16.0 OCPU, 240.0 GB RAM

We may increase or decrease this value later.

  • SSH Public Key - Oracle gives us the ability to connect directly to the instance, with authentication by a user + private key pair. Here we specify a public key which will be added to the instance; obviously, we must hold the matching private key. We can either provide a key we generated ourselves or let Oracle create the pair for us. The least obvious thing here is the SSH username: you can't change it and it isn't shown anywhere in the interface (at least I haven't found it), but you can find it in the documentation, and it is opc.

The second block of parameters is about backup and restore. The meaning of these options is obvious, but exact values aren't (at least in the beginning).

70-create-odbc-brc.png

  • Cloud Storage Container - that's the Cloud Storage container I described earlier. The value for this field will be something like Storage-rurittmanm/demo. In fact, I don't have to create this container in advance: it's possible to specify a non-existent container here (still in the form Storage-<domain>/<name>) and tick the Create Cloud Storage Container check-box. This will create a new container for us.

  • Username and Password are credentials of a user who can access this container.

The last block is Advanced settings and I believe it's quite simple and obvious. Most of the time we don't need to change anything in this block.

80-create-odbc-ac.png

When we have filled in all the parameters and pressed the Next button, we get a Summary screen and the actual process starts. It takes about 25-30 minutes to finish.

When I first started my experiments, I was constantly getting a message saying that no sites were available and my request could not be completed.

It is possible that this was the same "luck" as with the phone number verification; the problem resolved itself a few hours later.

Create OAC Instance

At last, we have all we need for our very first OAC instance. The process of an OAC instance setup is almost the same as for an Oracle DB Cloud Instance. We start the process, define some parameters and wait for the result.

At the first step, we give a name to our instance, provide an SSH public key, and select an edition for our instance. We have two options here, Standard Edition or Enterprise Edition, and we will select further options later. Standard Edition allows us to specify either Data Visualisation or Essbase instances; Enterprise Edition adds the classical Business Intelligence feature to this list. The rest of the parameters here are exactly the same as for a Database Instance.

90-oacs-1st-step.png

At the second step, we have four blocks of parameters.

100-oacs-2nd-step.png

  • Service Administrator - the most obvious one: here we specify the user who will be the system administrator.

  • Database - select a database for RCU schemas. That's why we needed a database.

  • Options - specify which options our instance will have.

    • Self-Service Data Visualisation, Preparation and Smart Discovery - this option means Oracle Data Visualisation and is available for both Standard and Enterprise Editions.
    • Enterprise Data Models - this option gives us classical BI and is available only for Enterprise Edition. It may be combined with the first one, giving us both classical BI and modern data discovery on one instance.
    • Collaborative Data Collection, Scenarios and What-if Analysis - this one stands for Essbase and is available for both Standard and Enterprise Editions. It can't be combined with the other options.
  • Size is the same thing that is called Compute Shape for the Database. The options are exactly the same.
  • Usable Storage Size on Disk GB also has the same meaning as for the DB. The minimum size we may specify here is 25 GB, which gives us a total of 170 GB of used disk space.

Here is a picture showing all possible combinations of services:

110-oacs-editions.png

And here is the virtual disk configuration; data disk is the one we specify.
130-oacs-storage.png

The last block, Cloud Storage Configuration, was the hardest one, especially the first field, Cloud Storage Base URL. The documentation says "Use the format: https://example.storage.oraclecloud.com/v1" and nothing more. When you know the answer it may seem easy, but the first time I saw it I was lost. Should I place any unique URL here, just like an identifier? Should it end with v1? And what is the value for the second instance, v2? Maybe I should use the URL of my current datacenter (https://dbcs.emea.oraclecloud.com)? The answer is https://<domain>.storage.oraclecloud.com/v1; in my case it is https://rurittmanm.storage.oraclecloud.com/v1, and it stays the same for all instances.

All other parameters are the same as they were for DBCS instance. We either specify an existing Cloud Storage container or create it here.

120-oacs-cloud-storage.png

The rest of the process is obvious. We get a Summary and then wait. It takes about 40 minutes to create a new instance.

Note: the diagnostics here are a bit poor, and when it says that the instance start process is completed, that may not be true. Sometimes it makes sense to wait a while before starting to panic.

Now we may access our instance as usual. The only difference is that the port is 80, not 9502 (or 443 for SSL). For Data Visualisation the link is http(s)://<ip address>/va, for BIEE http(s)://<ip address>/analytics, and for Essbase http(s)://<ip address>/essbase. Enterprise Manager and the WebLogic Server Console are available on port 7001, which is blocked by default.

The bad news is that HTTPS uses a self-signed certificate; depending on browser settings this may give a warning or even prevent access over HTTPS.

The options here are either to use HTTP rather than HTTPS or to add this certificate to your local computer. But these aren't options for a production server. Luckily, Oracle provides a way to use your own SSL certificates.

Typical Management Tasks

SSH to Instances

During the setup process, we provide Oracle with a public key which is used to get SSH access to the instances. The cloud does nothing special here: in the case of Windows, we may use PuTTY. Just add the private key to Pageant and connect to the instance as user opc.

140-pageant.png

150-putty.gif

Opening Ports

By default, only the absolute minimum of ports is open, so we can't connect to the OAC instance with the BI Admin tool or to the DB with SQL Developer. In order to do this, we should create an access rule which allows access to these particular ports.

In order to get to the Access Rules interface, we use the instance menu and select the Access Rules option.

150-access-menu.png

This will open the Access Rules list. What I don't like about it is that it shows the full list of all rules, yet we can create rules only for this particular instance.

160-access-rules-list.png

The new rule creation form is simple and should cause no issues, but be careful here not to open too much to the wild Internet.

170-new-rule.png

Add More Users

The user who registered a Cloud Account becomes its administrator and can invite more users and manage privileges.

180-access-users.png

Here we can add and modify users.

190-users.png

When we add a user we specify a name, email and login, and set the user's roles. The user will get an email with these details and a link to register.

Obviously, the new user won't be asked for a credit card; they just start working and that's all.

Summary

My first steps with Oracle Analytics Cloud were not very easy, but I think it was worth it. Now I can create a new OBIEE instance in just a few minutes, and one hour later it will be up and running. That's pretty fast compared to the normal process of provisioning a new server in a typical organisation: we don't need to think about OS installation, or licenses, or anything else. Just try it.

Categories: BI & Warehousing

New OA Framework 12.2.6 Update 3 Now Available

Steven Chan - Thu, 2017-06-01 02:00

Web-based content in Oracle E-Business Suite Release 12 runs on the Oracle Application Framework (also known as OA Framework, OAF, or FWK) user interface libraries and infrastructure. Since the initial release of Oracle E-Business Suite Release 12.2 in 2013, we have released a number of cumulative updates to Oracle Application Framework to fix performance, security, and stability issues.

These updates are provided in cumulative Release Update Packs, and cumulative Bundle Patches that can be applied on top of the Release Update Packs. In this context, cumulative means that the latest RUP or Bundle Patch contains everything released earlier.

The latest OAF update for Oracle E-Business Suite Release 12.2.6 is now available:

Where is this update documented?

Instructions for installing this OAF Release Update Pack are in the following My Oracle Support knowledge document:

Who should apply this patch?

All Oracle E-Business Suite Release 12.2.6 users should apply this patch.  Future OAF patches for EBS Release 12.2.6 will require this patch as a prerequisite. 

What's new in this update?

This bundle patch is cumulative: it includes all fixes released in previous EBS Release 12.2.6 bundle patches.

This latest bundle patch includes fixes for the following issues:

  • Cosmetic issue: Framework email notifications have boxes around headers
  • Launching Forms application in Accessibility mode fails in the second iteration in IE11 browser.
  • In attachment image style, adding an attachment fails when a primary key of an entity map has the value as null.
  • In Japanese enabled environment, the text in the advanced table Total cell is getting split into multiple lines when the text has spaces.


Categories: APPS Blogs

Avoiding Coincidental Cohesion

Andrew Clarke - Wed, 2017-05-31 17:10
Given that Coincidental Cohesion is bad for our code base, we obviously want to avoid writing utilities packages. Fortunately it is mostly quite easy to do so, but it requires vigilance on our part. Utilities packages are rarely planned. More often we are writing a piece of business functionality when we find ourselves in need of some low-level functionality. It doesn't fit in the application package we're working on, and perhaps we suspect that it might be more generally useful, so we need somewhere to put it.


The important thing is to recognise and resist the temptation of the Utilities package. The name itself (and similarly vague synonyms like helper or utils) should be a red flag. When we find ourselves about to type create or replace package utilities we need to stop and think: what would be a better name for this package? Consider whether there are related functions we might end up needing. Suppose we're about to write a function to convert a date into a Unix epoch string. It doesn't take much imagination to think we might need a similar function to convert a Unix timestamp into a date. We don't need to write that function now, but let's start a package dedicated to Time functions instead of a miscellaneous utils package.
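For instance, such a dedicated Time package might start out like this (a sketch only; the package and function names are my own invention, and the conversion assumes the input date is already UTC):

create or replace package time_utils as
  -- seconds between a date and the Unix epoch, 1970-01-01 00:00:00 UTC
  function date_to_epoch (p_date in date) return number;
  -- the inverse conversion: Unix timestamp to date
  function epoch_to_date (p_epoch in number) return date;
end time_utils;
/

create or replace package body time_utils as
  c_epoch constant date := to_date('1970-01-01', 'YYYY-MM-DD');

  function date_to_epoch (p_date in date) return number is
  begin
    -- date subtraction yields days; 86400 seconds per day
    return round((p_date - c_epoch) * 86400);
  end date_to_epoch;

  function epoch_to_date (p_epoch in number) return date is
  begin
    return c_epoch + p_epoch / 86400;
  end epoch_to_date;
end time_utils;
/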


Looking closely at the programs which comprise the DBMS_UTILITY package, it is obviously unfair to describe them as a random selection. In fact there are seven or eight groups of related procedures.


DB Info

  • INSTANCE_RECORD Record Type
  • DBLINK_ARRAY Table Type
  • INSTANCE_TABLE Table Type
  • ACTIVE_INSTANCES Procedure
  • CURRENT_INSTANCE Function
  • DATA_BLOCK_ADDRESS_BLOCK Function
  • DATA_BLOCK_ADDRESS_FILE Function
  • DB_VERSION Procedure
  • GET_ENDIANNESS Function
  • GET_PARAMETER_VALUE Function
  • IS_CLUSTER_DATABASE Function
  • MAKE_DATA_BLOCK_ADDRESS Function
  • PORT_STRING Function
Runtime Messages
  • FORMAT_CALL_STACK Function
  • FORMAT_ERROR_BACKTRACE Function
  • FORMAT_ERROR_STACK Function
Object Management
  • COMMA_TO_TABLE Procedures
  • COMPILE_SCHEMA Procedure
  • CREATE_ALTER_TYPE_ERROR_TABLE Procedure
  • INVALIDATE Procedure
  • TABLE_TO_COMMA Procedures
  • VALIDATE Procedure
Object Info (Object Management?)
  • INDEX_TABLE_TYPE Table Type
  • LNAME_ARRAY Table Type
  • NAME_ARRAY Table Type
  • NUMBER_ARRAY Table Type
  • UNCL_ARRAY Table Type
  • CANONICALIZE Procedure
  • GET_DEPENDENCY Procedure
  • NAME_RESOLVE Procedure
  • NAME_TOKENIZE Procedure
Session Info
  • OLD_CURRENT_SCHEMA Function
  • OLD_CURRENT_USER Function
SQL Manipulation
  • EXPAND_SQL_TEXT Procedure
  • GET_SQL_HASH Function
  • SQLID_TO_SQLHASH Function
Statistics (deprecated)
  • ANALYZE_DATABASE Procedure
  • ANALYZE_PART_OBJECT Procedure
  • ANALYZE_SCHEMA Procedure
Time
  • GET_CPU_TIME Function
  • GET_TIME Function
  • GET_TZ_TRANSITIONS Procedure
Unclassified
  • WAIT_ON_PENDING_DML Function
  • EXEC_DDL_STATEMENT Procedure
  • GET_HASH_VALUE Function
  • IS_BIT_SET Function


We can see an alternative PL/SQL code suite with several highly cohesive packages. But there will be some procedures which are genuinely unrelated to anything else; the four procedures in the Unclassified section above are examples. Even so, writing a miscellaneous utils package for these programs is still wrong. There are better options.

  1. Find a home. It's worth considering whether we already have a package which would fit the new function. Perhaps WAIT_ON_PENDING_DML() should have gone in DBMS_TRANSACTION; perhaps IS_BIT_SET() properly belongs in UTL_RAW.
  2. A package of their own. Why not? It may seem extravagant to have a package with a single procedure but consider DBMS_DG with its lone procedure INITIATE_FS_FAILOVER(). The package delivers the usual architectural benefits plus it provides a natural home for related procedures we might discover a need for in the future.
  3. Standalone procedure. Again, why not? We are so conditioned to think of a PL/SQL program as a package that we forget it can be just a Procedure or Function. Some programs are suited to standalone implementation.
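To illustrate the third option, a one-off helper can live quite happily as a standalone function; the name and logic below are purely hypothetical:

create or replace function csv_element_count
  ( p_csv in varchar2 )
  return pls_integer
is
begin
  -- count the commas and add one; an empty string has zero elements
  return case
           when p_csv is null then 0
           else regexp_count(p_csv, ',') + 1
         end;
end csv_element_count;
/

No package wrapper is needed, and the function is just as discoverable in the data dictionary as a packaged one.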


So avoiding the Utilities package requires vigilance, and code reviews can help here. Preventing the Utilities package from becoming entrenched is crucial: once we have a number of packages dependent on a Utilities package, it is pretty hard to get rid of it. And once it becomes a fixture in the code base, developers will consider it more acceptable to add procedures to it.


Part of the Designing PL/SQL Programs series

Utilities - the Coincidental Cohesion anti-pattern

Andrew Clarke - Wed, 2017-05-31 15:46
One way to understand the importance of cohesion is to examine an example of a non-cohesive package, one exhibiting a random level of cohesion. The poster child for Coincidental Cohesion is the utility or helper package. Most applications have one or more of these, and Oracle's PL/SQL library is no exception. DBMS_UTILITY has 37 distinct procedures and functions (i.e. not counting overloaded signatures) in 11gR2 and 38 in 12cR1 (and R2). Does DBMS_UTILITY deliver any of the benefits the PL/SQL Reference says packages deliver?

Easier Application Design?

One of the characteristics of utilities packages is that they aren't designed in advance. They are the place where functionality ends up because there is no apparently better place for it. Utilities occur when we are working on some other piece of application code and discover a gap in the available functionality, such as hashing a string. When this happens we generally need the functionality now: there's little benefit to deferring the implementation until later. So we write a GET_HASH_VALUE() function, stick it in our utilities package and proceed with the task at hand.

The benefit of this approach is that we keep our focus on the main job, delivering business functionality. The problem is that we never go back and re-evaluate the utilities. Indeed, now there is business functionality which depends on them: refactoring utilities introduces risk. Thus the size of the utilities package slowly increases, one tactical implementation at a time.

Hidden Implementation Details?

Another characteristic of utility functions is that they tend not to share concrete implementations. Often a utilities package beyond a certain size will have groups of procedures with related functionality. It seems probable that DBMS_UTILITY.ANALYZE_DATABASE(), DBMS_UTILITY.ANALYZE_PART_OBJECT() and DBMS_UTILITY.ANALYZE_SCHEMA() share some code, so there are benefits to co-locating them in the same package. But it is unlikely that CANONICALIZE(), CREATE_ALTER_TYPE_ERROR_TABLE() and GET_CPU_TIME() have much code in common.

Added Functionality?

Utility functions are rarely part of a specific business process. They are usually called on a one-off basis rather than being chained together, so there is no state to be maintained across different function calls.

Better Performance?

For the same reason there is no performance benefit from a utilities package. Quite the opposite: when there is no relationship between the functions we cannot make predictions about usage. We are not likely to call EXPAND_SQL_TEXT() right after calling PORT_STRING(), so there is no benefit in loading the former into memory when we call the latter. In fact the performance of EXPAND_SQL_TEXT() is impaired because we have to load the whole DBMS_UTILITY package into the shared pool, and it uses up a larger chunk of memory until it gets aged out. To be fair, in these days of abundant RAM some unused code in the library cache need not be our greatest concern. But whichever way we bounce it, it's not a boon.

Grants?

Privileges on utility packages are a neutral concern. Often utilities won't be used outside the owning schema. In cases where we do need to make them more widely available, we're probably granting access to some procedures that the grantee will never use.

Modularity?

From an architectural perspective, modularity is the prime benefit of cohesion. A well-designed library should be frictionless and painless to navigate. The problem with random assemblages like DBMS_UTILITY is that it's not obvious what functions they may contain; sometimes we write a piece of code we didn't need to.

The costs of utility packages

Perhaps your PL/SQL code base has a procedure like this:


create or replace procedure run_ddl
  ( p_stmt in varchar2 )
is
  pragma autonomous_transaction;
  v_cursor number := dbms_sql.open_cursor;
  n pls_integer;
begin
  dbms_sql.parse(v_cursor, p_stmt, dbms_sql.native);
  n := dbms_sql.execute(v_cursor);
  dbms_sql.close_cursor(v_cursor);
exception
  when others then
    if dbms_sql.is_open(v_cursor) then
      dbms_sql.close_cursor(v_cursor);
    end if;
    raise;
end run_ddl;
/

It is a nice piece of code for executing DDL statements. The autonomous_transaction pragma prevents the execution of arbitrary DML statements (by throwing ORA-06519), so it's quite safe. The only problem is that it re-implements DBMS_UTILITY.EXEC_DDL_STATEMENT().
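For comparison, calling the built-in is a one-liner (the table name here is just for illustration):

begin
  -- DBMS_UTILITY.EXEC_DDL_STATEMENT handles the parse-and-execute dance for us
  dbms_utility.exec_ddl_statement('create table demo_t (x number)');
end;
/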


Code duplication like this is a common side effect of utility packages. Discovery is hard because their program units are clumped together accidentally. Nobody sets out to deliberately re-write DBMS_UTILITY.EXEC_DDL_STATEMENT(); it happens because not enough people know to look in that package before they start coding a helper function. Redundant code is a nasty cost of Coincidental Cohesion. Besides the initial wasted effort of writing an unnecessary program, there are the ongoing costs of maintaining and testing it, and the risk of introducing bugs or security holes. Plus each additional duplicated program makes our code base a little harder to navigate.


Fortunately there are tactics for avoiding or dealing with this. Find out more.


Part of the Designing PL/SQL Programs series

New workshop "Exadata For Developers"

Randolf Geist - Wed, 2017-05-31 15:00
Just a short note that I've developed a new two-day course that covers all the relevant features a database application developer should know when dealing with the Oracle Exadata Database Machine platform.

It covers in detail Smart Scans, the Exadata Flash Cache, Hybrid Columnar Compression and all surrounding features like Storage Indexes, (serial) direct path reads and so on. Of course it also includes features that were added in 12c, like Attribute Clustering and Zone Maps.

All features are presented with live demo scripts, and there will be enough time to discuss your specific questions and analyse existing applications if desired.

For more information and details, check the corresponding pages:

German: Exadata für Anwendungsentwickler

English: Exadata For Developers

OHUG 2017 - What Looks Good To Me

Floyd Teter - Wed, 2017-05-31 14:01
So I’m headed to the OHUG 2017 conference next week.  As it is one of the few conferences I still attend, I’m pretty excited about going.  I’m particularly interested in information related to the implementation of Oracle HCM Cloud.  So, in preparation for the conference, I thought I’d share some events and sessions that look good to me.

First, a few caveats about the following list.   I’m presenting twice myself, so I’m breaking my sessions out as a blatant plug rather than including them on the list:
  • Oracle Human Capital Management Cloud User Experience Extensibility: From Mystery To Magic, Wednesday, 1:45 pm, Salon 1
  • An Update: The Oracle HCM Cloud Solution Library, Friday, 10:15 am, Salon 8
It's worth stating here that, although I'm an Oracle employee, the opinions here are mine and mine alone; this should not be interpreted in any way as official Oracle information or direction.  Nor am I speaking for OHUG in any way.  In addition, keep in mind that I’m focused on Oracle HCM Cloud applications.  OHUG tracks cover HCM Cloud, PeopleSoft, and E-Business Suite.  As I make my living these days working on HCM Cloud applications, I can’t render any opinions on PeopleSoft or EBS sessions, so you will only see HCM Cloud entries on my list.  Also keep in mind that I won’t list SIG sessions: you know better than I which SIGs you should attend.  But I will say that the SIG sessions are often the best part of a user conference, so be sure you don’t miss out.  And finally, some of the sessions I list will have schedule conflicts with other sessions I list.  But that’s an indicator of a great conference: having to make difficult choices about where to burn your calories.
 
So, with all that said, let’s get to the list.  Sessions are listed in no particular order.  And, for the presenters that did not make this list: it’s not that your presentation did not look interesting (almost all of the sessions have something that I’d love to check out), it’s just that I’m digging into other subject areas this year and my own focus has definitely driven this list.

  • Digital HR Transformation First and Foremost Managing The Challenge of Change (Tuesday, 4:45 pm, Salon 12)
  • Product Vision and Roadmap: Oracle HCM Cloud (Tuesday, 3:30 pm, Ballroom)
  • The Current State of Human Capital Management (Tuesday, 1:30 pm, Executive Ballroom)
  • User Experience: Best Practices to Ensure a Positive Experience for Candidates and Users (Wednesday, 9:00 am, Salon 15)
  • Oracle HCM Cloud Leading Practice - Cloud Rapid Start Implementation with Tools (Wednesday, 10:30 am, Salon 12)
  • Meet The Experts: HCM Cloud Center of Excellence (Wednesday, 3:15 pm, Salon 19)
  • How To Get The Best Out of OTBI (Thursday, 9:00 am, Salon 12)
  • Cloud Global Deployments - Leading Practices and Things That Make You Go Hmmm (Thursday, 11:00 am, Salon 3)
  • Cloud Payroll: What You Don’t Know Could Hurt You or At Least Your Project (Thursday, 2:30 pm, Salon 3)
  • Adopting A Testing Strategy Fit For The Latest Oracle HCM Cloud (Friday, 11:30 am, Salon 3)
  • HCM Cloud Extracts Made Easy! (Wednesday, 3:15 pm, Salon 9)
  • Decision Is Made: Going To HCM Cloud. Now, What’s The Best Strategy For Data Conversion? (Friday, 10:15 am, Salon 13)
  • Patching - Where’s The Beef? (Thursday, 9:00 am, Salon 4)
  • Best Practices in Attracting Top Talent Through Oracle Taleo Cloud Suite (Thursday, 9:00 am, Salon 2)
  • Cloud Portal: Self-Service Management For The HCM Cloud (Wednesday, 9:00 am, Salon 3)

Note that this information is current as of May 31st.  Dates, times and locations could change at anytime leading up to or during the conference.  Your best bet for staying current is the OHUG application for your phone.

In addition to the sessions, there are a few conference events you may want to consider:
  • Meet The Experts:  a place to meet and chat with the people who build and deliver Oracle HCM products.  Watch the schedule for the times and locations of those you’d care to meet.  I have listed one Meet The Experts session in the above list - the session for the HCM Cloud Center of Excellence.  Oracle’s John Cafolla will be the expert at this particular session.  John has his fingers on the pulse of HCM Cloud implementations around the globe.  If you want to talk implementation, John is your guy.  You can find the full schedule for Meet The Experts here.
  • Report Jam:  Oracle’s Julian Challenger conducts a series of pretty cool workshops focused on building reports using OTBI.  If you have reports you’d like to build or report-related questions, you’ll get a hands-on opportunity to collaborate with Oracle’s best OTBI gurus here.  Sign up for a time slot in advance at the Empire foyer.
  • Usability Lab: you’ll have the opportunity to see and provide feedback on HCM-related features and applications still in development.  Great for getting a sneak peek and influencing future direction of Oracle products.  The Lab only happens on Wednesday and requires signing up in advance by sending an email to jeannette.chadwick@oracle.com.
  • Oracle Support Central:  Bring your toughest HCM-related questions here. Whether it’s HR, Payroll, Benefits, Compensation, Taleo or whatever, this is the place to get answers.

Let’s wrap this up with an invitation from me to you:  if you’d like to get together with me at the conference to talk or ask questions about anything related to HCM Cloud (or if you’d just like to hang out), hit me up on Twitter @fteter.  I'm happy to talk to customers, partners, old friends and potential new friends.  Hope to see you there! 

DevOps and the Agile DBA

Gerger Consulting - Wed, 2017-05-31 13:38
The esteemed ACE Director alumna Kellyn Pot'Vin Gorman is hosting our next webinar!

Attend the free webinar by Kellyn on June 13th and learn how agile best practices and virtualization can help you become better at DevOps.


About the Webinar:
This session discusses common database challenges and how they can be solved with newer DevOps practices without impacting the DBA's “safe zone”.

Development, testing, provisioning, patching, upgrading and managing will become easier with agile tools, virtualization and collaboration.

The take-aways from this session are to embrace DevOps basics with a “DBA twist”, such as:
  • Develop, Rinse, Repeat
  • Test Until Your Five Year Old Can Do It
  • Human Intervention Creates Human Error - Automate EVERYTHING.
  • How a shorter development cycle builds business success.
  • How the DBA can do more with less.
Discover the power of virtualization and agile development practices, providing more value to even the most senior database expert.

We’ll demo different scenarios surrounding DevOps, virtualization and how to make the DBA pivotal to DevOps success.

Sign up for the free webinar.

About the Presenter:



Kellyn Pot’Vin-Gorman is a member of the Oak Table Network and an Oracle ACE Director alumna. She is the Technical Intelligence Manager for the Office of CTO at Delphix, a company recognized for its impressive virtualization and data masking capabilities. Kellyn is known for her extensive work with multi-database platforms, cloud migrations, virtualization, the command line interface, environment optimization and tuning, automation and architecture design. Her blog, http://dbakevlar.com, and her social media activity under the handle DBAKevlar are well respected for their insight and content.

Registration is free but space is limited.
Sign up now.
Categories: Development

Packages, roles, compilation and invoker rights and package privileges.

Tom Kyte - Wed, 2017-05-31 13:26
Hi I have need to select from tables in another schema (B). I am connected to schema (A) by using proxy account. Privileges have been given to me through roles and I can select tables from schema B with sqlplus. There is package in schema...
Categories: DBA Blogs

PL/SQL update set row command and virtual columns

Tom Kyte - Wed, 2017-05-31 13:26
Hi I have found PL/SQL 'update tablex set row' command very useful in some cases. (And special thanks to asktom.com to teaching me it !). However I now would like to add virtual columns to table to be updated. My trial caused error of 'trying...
Categories: DBA Blogs

Find number and date which break the rule

Tom Kyte - Wed, 2017-05-31 13:26
Dear Chris, I have table like this <code>Account_number Opening_date 1234 16/05/2017 1235 16/05/2017 1236 23/04/2017 1237 25/04/2017 1238 25/04/2...
Categories: DBA Blogs

Need to load the data into pl/sql tables from excel file

Tom Kyte - Wed, 2017-05-31 13:26
Hello Experts, I have a below code from where I want to convert .xls file to .csv file and load the data into pl/sql tables but I am not able to perform it. Could you please help to get this done? Code: CREATE OR REPLACE FUNCTION LOAD_CSV...
Categories: DBA Blogs

DB Link: ALTER SESSION ENABLE PARALLEL DML

Tom Kyte - Wed, 2017-05-31 13:26
Hi Chris/Connonr, We have two databases namely Primary & Secondary. On Primary, There is scheduled Job which select data from Primary and Insert it into Secondary then inserted records gets delete from Primary. here are the steps written in ...
Categories: DBA Blogs

Session Maximum Time Exceeded

Tom Kyte - Wed, 2017-05-31 13:26
Hello I have a problem with Session Maximum Time Exceeded. Users are usually 13 hours straight in the app. But sessions are cleared in 12 hours. How can I increase or disable it at all? Best regards, Kristjan
Categories: DBA Blogs

Enabling an EBS Global Diagnostics Trace

Steven Chan - Wed, 2017-05-31 12:23

Oracle E-Business Suite provides several types of logs.  One of the tools you can use is the Logging Framework, which allows you to collect and view log messages in Oracle Applications Manager.  The Logging Framework collects debug messages, errors, and alerts.

A Global Diagnostics Trace can help you diagnose issues if errors occur when users click a button or select an option. It's a targeted trace that can be run per user, thereby minimizing the impact on the overall performance of your environment. This kind of trace is also known as an FND Diagnostics Trace or Oracle Applications Logging.

You can find a quick summary of steps to enable a Global Diagnostics trace here:

There's lots more information about more-powerful logging options in the following manuals

Related Articles

Categories: APPS Blogs

DIY Home Center Sets Foundation for Growth and Customer Experience Innovation with NetSuite Unified Cloud Commerce Platform

Oracle Press Releases - Wed, 2017-05-31 10:55
Press Release
DIY Home Center Sets Foundation for Growth and Customer Experience Innovation with NetSuite Unified Cloud Commerce Platform Online home improvement distributor sees 16% increase in conversion rate and 6% increase in average order value with new commerce platform

SAN MATEO, Calif.—May 31, 2017

Oracle NetSuite Global Business Unit, one of the world’s leading providers of cloud-based financials / ERP, HR, Professional Services Automation (PSA) and omnichannel commerce software suites, today announced that DIY Home Center, an online distributor of decking products and outdoor furniture, has implemented NetSuite’s unified cloud commerce platform to power its B2C and B2B ecommerce, inventory and order management, CRM and ERP.

Servicing homeowners and professional builders, DIY Home Center has experienced double-digit year over year growth since it launched online in 2004. However, its siloed, legacy systems and applications, including Dynacomp’s MOM, QuickBooks and a home-grown product information management (PIM) system and ecommerce website, could no longer support its growth. The company was wasting resources making the technology work together, instead of focusing on innovating the customer experience and growing the business.

“We had a lightbulb moment when we realized we could have everything on one unified suite without the hairball of disconnected systems to manage,” said Michael Anderson, President at DIY Home Center. “Now, with a modern ecommerce platform natively part of the solution, we can provide a more visual, intuitive shopping experience.”

DIY Home Center first focused on getting its technology foundation in place. It went live on NetSuite in June 2016 to power its back-end applications, including order and inventory management, CRM and ERP. By consolidating fragmented data into a single source of customer, order and product information, the company was able to make informed, timely business decisions and provide more engaging, relevant customer experiences. The launch of its new ecommerce site powered by SuiteCommerce Advanced followed in November and quickly led to the following results:

  • 6 percent increase in average order value.
  • 16 percent increase in conversion rates.
  • 15 percent increase in average time spent on site.

The new webstore, designed by NetSuite Commerce Agency Partner, Intente, provides a modern and engaging experience. Prior to the new site, only a desktop experience was supported. Now, with a responsive design website, 50 percent of the company’s site visitors are using mobile devices.  

To further enrich the online experience and drive engagement, DIY Home Center showcases a wealth of educational product content and improvement tips on its product pages, including videos, blogs and tutorials.

To support its B2B buyers, DIY Home Center developed a Preferred Builder Program. Builders get the same rich online experience that individual shoppers get, but in addition, receive product discounts and enhanced account management capabilities to view invoices, make payments, review past orders and easily reorder products. With these account management activities now online, the company sales reps can focus on offering value-add services to their accounts and building brand loyalty.

DIY Home Center has also gained these other benefits with NetSuite’s unified cloud commerce platform:

  • Single source of product information. As an omnichannel retailer selling on marketplaces including Amazon, eBay and Walmart.com, DIY Home Center uses its product data from NetSuite to publish to these marketplaces, delivering consistent, high quality data.
  • Real-time inventory management. DIY Home Center is benefiting from just-in-time ordering. NetSuite SuiteCloud Developer Network partner PaceJet has helped streamline the pick, pack and ship process.
  • Optimized order management. With centralized order management for orders from all channels, DIY Home Center is able to quickly and efficiently process orders. What used to take sales reps four hours to process now takes less than 30 minutes.
  • Unified cloud platform. Moving to the cloud has saved on IT costs and removed the hassles of having to manage systems and software. DIY Home Center also benefits from a platform that allows for easy customizations and integrations. The cloud platform provides the scalability and adaptability needed to keep pace with business, especially during the peak summer months.

“NetSuite is the heartbeat of our company,” said Anderson. “Everything starts and ends with it. If we’re looking to add another system our first requirement is that it must connect with NetSuite.”

Experience NetSuite at IRCE 2017

NetSuite is empowering DIY Home Center to transform its business. Merchants attending IRCE 2017 from June 6-9 at McCormick Place in Chicago will have the opportunity to see firsthand how NetSuite’s unified cloud commerce platform is enabling B2C and B2B businesses to provide customers with a seamless shopping experience as well as optimize business operations. To learn more and schedule a personal demo at NetSuite’s booth #701 please visit www.netsuite-irce.com.

About Oracle NetSuite Global Business Unit

Oracle NetSuite Global Business Unit pioneered the Cloud Computing revolution in 1998, establishing the world's first company dedicated to delivering business applications over the internet. Today, Oracle NetSuite Global Business Unit provides a suite of cloud-based financials / Enterprise Resource Planning (ERP), HR and omnichannel commerce software that runs the business of companies in more than 100 countries. For more information, please visit http://www.netsuite.com.

Follow Oracle NetSuite Global Business Unit’s Cloud blog, Facebook page and @NetSuite Twitter handle for real-time updates.

Contact Info
Christine Allen
Public Relations, Oracle NetSuite Global Business Unit
603-743-4534
PR@netsuite.com
About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.


Soccer Express Scales Omnichannel Business with NetSuite Unified Cloud Commerce Platform

Oracle Press Releases - Wed, 2017-05-31 10:55
Press Release
Soccer Express Scales Omnichannel Business with NetSuite Unified Cloud Commerce Platform Canada’s largest soccer store delivers personalized, digital-first shopping experiences

SAN MATEO, Calif.—May 31, 2017

Oracle NetSuite Global Business Unit, one of the world’s leading providers of cloud-based financials / ERP, HR, Professional Services Automation (PSA) and omnichannel commerce software suites, today announced that Soccer Express, Canada’s largest distributor and retailer of soccer equipment and apparel, has implemented NetSuite’s unified cloud commerce platform to run its omnichannel sales spanning ecommerce and in-store point of sale (POS) across four stores, inventory management, order management and ERP. As a result of the implementation, Soccer Express has increased revenue by 25 percent.

Founded in Coquitlam, British Columbia in 1998, Soccer Express offers merchandise from global brands including Nike, Adidas and Puma. As the company experienced 30 percent year over year growth rates, Soccer Express realized its siloed, legacy Sage and Adobe systems could no longer support its business. The company required a scalable, flexible solution to manage the business and support its ecommerce expansion.

“Adopting a single solution that seamlessly unifies ecommerce with back-end business systems has given us the ability to scale and improve processes,” said Andrew Veer, Chief Operating Officer of Soccer Express. “By removing the burden of manual tasks, we’ve been able to take two employees previously dedicated to order entry and invoicing and move them to roles focused on better supporting our customers.”

Since Soccer Express went live with NetSuite in May 2016, the customer experience has dramatically evolved. All customer information is captured in a single repository, providing a 360-degree view of all customer activity, including purchases, returns, exchanges, store visits and marketing campaigns. This allows customer service representatives to provide better service and information to customers. Similarly, when shoppers are in a Soccer Express retail store, sales associates can access the same information with NetSuite POS, which provides shoppers with a seamless experience across all sales channels.

The new website, running on SuiteCommerce Advanced, provides a fresh and engaging shopping experience, including the ability to customize merchandise with names and numbers as well as display tier quantity pricing based on volume. With its target audience of 13- to 24-year-olds, it was also critical for Soccer Express to support and optimize the mobile experience. Now, with a responsive design website, the company has seen a 24 percent growth in mobile users and 30 percent increase in sales.

The website caters to individual soccer enthusiasts as well as soccer clubs by offering customized, private club stores. Each club store has its own unique merchandise, pricing, and real-time inventory availability. Because club purchasing happens at the start of the season, it was critical for Soccer Express to have a platform that could scale to meet the traffic and order management spikes.

“We support more than 150 soccer clubs across Canada and we wanted our customers to feel like they were shopping on a local online store,” said Veer. “We now have a platform that can meet the demands of our business.”

With NetSuite’s unified cloud commerce platform, Soccer Express has gained the following benefits:

  • Optimized inventory management. With two distribution centers and four retail stores, Soccer Express has real-time insight into inventory across its entire enterprise. It can utilize inventory from across its stores to fulfill ecommerce orders as well as ship direct from suppliers.
  • Efficient order management. With a centralized order management system for orders from all channels, Soccer Express can process orders faster, satisfy customers and lower operational costs. Orders that used to take six days to ship now go out in 24 hours or less.
  • Improved communication. Customers receive updates on every stage of their order from processing to shipping and delivery. Customer service is now standardized and employees across the company have access to all order information to help improve the customer experience.
  • Unified cloud platform. Replacing Sage for its ERP and Adobe for ecommerce with a single cloud solution, Soccer Express has removed the hassles of having to manage upgrades, servers and dealing with version lock issues. The company gets automatic product updates and a platform that allows for easy customizations and integrations.

Experience NetSuite at IRCE 2017

NetSuite is empowering Soccer Express to transform its business. Merchants attending IRCE 2017 from June 6-9 at McCormick Place in Chicago will have the opportunity to see firsthand how NetSuite’s unified cloud commerce platform is enabling B2C and B2B businesses to provide customers with a seamless shopping experience as well as optimize business operations. To learn more and schedule a personal demo at NetSuite’s booth #701 please visit www.netsuite-irce.com.

About Oracle NetSuite Global Business Unit

Oracle NetSuite Global Business Unit pioneered the Cloud Computing revolution in 1998, establishing the world's first company dedicated to delivering business applications over the internet. Today, Oracle NetSuite Global Business Unit provides a suite of cloud-based financials / Enterprise Resource Planning (ERP), HR and omnichannel commerce software that runs the business of companies in more than 100 countries. For more information, please visit http://www.netsuite.com.

Follow Oracle NetSuite Global Business Unit’s Cloud blog, Facebook page and @NetSuite Twitter handle for real-time updates.

Contact Info
Christine Allen
Public Relations, Oracle NetSuite Global Business Unit
603-743-4534
PR@netsuite.com
About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.


ORA-12154 in Data Guard environment

Amardeep Sidhu - Wed, 2017-05-31 10:54

Hit this silly issue in one of the data guard environments today. Primary is a 2 node RAC running 11.2.0.4 and standby is also a 2 node RAC. Archive logs from node2 aren’t shipping and the error being reported is

ORA-12154: TNS:could not resolve the connect identifier specified

We tried the usual things like going to $TNS_ADMIN, checking the entry in tnsnames.ora and also trying to connect using sqlplus sys@target as sysdba. Everything seemed fine but logs were not shipping and the same problem was being reported repeatedly. Since everything on node1 was working fine, it looked even more weird.

From the error it is clear that the issue is with the tnsnames entry. Finally found the issue after some 30 mins. It was an Oracle EBS environment, so TNS_ADMIN was set to the standard $ORACLE_HOME/network/admin/*hostname* path (on both nodes). On node1 there was no tnsnames.ora file in $ORACLE_HOME/network/admin, so the connection to the standby was made using the Apps tnsnames.ora, which had the correct entry for the standby. On node2 there was a file called tnsnames.ora in $ORACLE_HOME/network/admin, but it did not have any entry for the standby. The connection was being attempted using that file (the default TNS path) and failing with ORA-12154. Once we removed that file, it started using the Apps tnsnames.ora and logs started shipping.
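The gotcha here is the client's lookup order: a tnsnames.ora found in one location shadows copies elsewhere, so two nodes with identical TNS_ADMIN settings can still resolve aliases from different files. A minimal sketch of that lookup, handy for comparing RAC nodes, is below. The helper function is hypothetical and deliberately simplified: real Oracle name resolution can also involve sqlnet.ora's NAMES.DIRECTORY_PATH, ldap.ora and a user-level tnsnames.ora, so treat this as an approximation rather than the full client behaviour.

```shell
# find_tnsnames: print the tnsnames.ora a client process would pick up,
# checking TNS_ADMIN first and then the default location under ORACLE_HOME.
# Simplified illustration only; see sqlnet.ora/ldap.ora for the full story.
find_tnsnames() {
  for dir in "$TNS_ADMIN" "$ORACLE_HOME/network/admin"; do
    # Skip empty/unset variables; report the first directory that has the file.
    if [ -n "$dir" ] && [ -f "$dir/tnsnames.ora" ]; then
      echo "$dir/tnsnames.ora"
      return 0
    fi
  done
  echo "no tnsnames.ora found via TNS_ADMIN or ORACLE_HOME" >&2
  return 1
}

# Example usage: check whether the resolved file actually knows the
# standby alias (replace STANDBY with your real alias):
#   grep -i STANDBY "$(find_tnsnames)"
```

Running something like this on each node, and grepping the reported file for the standby alias, would have flagged the stray node2 file immediately.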

Categories: BI & Warehousing
