Anthony Shorten

Oracle Blogs

Oracle Utilities Customer To Meter/Customer Care And Billing is available

Tue, 2018-09-18 19:03

Oracle Utilities Customer To Meter and Oracle Utilities Customer Care And Billing are now available for download from Oracle eDelivery Cloud. These new releases are based upon Oracle Utilities Application Framework with new and updated functionality.

For details of the release, refer to the release notes and documentation available from Oracle eDelivery Cloud and Oracle Utilities Help Center.

Inbound Web Services - REST Services

Tue, 2018-09-18 14:43

In Oracle Utilities Application Framework V4., the Inbound Web Services object has been extended to support both SOAP and REST based services. This offers several advantages:

  • Centralized web services registration. Interface Application Programming Interfaces (APIs) are now centralized in the Inbound Web Services object. This means you can manage all your programmatic interfaces from a single object, which helps when using the Web Service Catalog with Oracle Integration Cloud Service as well as with any API management capabilities.
  • Isolation from change. One of the major features of the REST capability within Inbound Web Services is that the URI is no longer fixed and can differ from the underlying service. This means you can isolate your interface clients from changes.
  • Standardization. The Inbound Web Services object has inherent standards that can be reused across both SOAP and REST based services. For example, the ConfigTools object model can be directly wired into the service, reducing development time.
  • Reduced cost of maintenance. One of the features of the new capability is to group all your interfaces into a minimal number of registrations. This reduces maintenance and allows you to control groups of interfaces easily.

The Inbound Web Services object now supports two Web Service Classes:

  • SOAP - Traditional XAI and IWS based services based around the SOAP protocol. These services will be deployed to the Oracle WebLogic Server.
  • REST - RESTful services that are registered for use. These services are NOT deployed as they are invoked directly via the REST execution engine.
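To make the distinction concrete, an active REST class service is simply invoked over HTTP; no deployment step is involved. The sketch below builds such a request in Python. The host, URI pattern and service name are invented for illustration and will differ per installation and IWS registration.

```python
import urllib.request

# Hypothetical endpoint for a REST class Inbound Web Service; the actual
# URI pattern and service name depend on your installation and registration.
base = "https://host:port/ouaf/rest/apis"
req = urllib.request.Request(
    url=f"{base}/cm/customer/CM-GetAccount?accountId=1234567890",
    headers={"Accept": "application/json"},  # ask for a JSON payload
    method="GET",
)

# urllib.request.urlopen(req) would execute the call against a live server,
# typically after adding authentication headers.
print(req.get_method(), req.full_url)
```

A SOAP class service, by contrast, is deployed to the Oracle WebLogic server and invoked with a SOAP envelope against its WSDL-defined endpoint.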

Inbound Web Service Business Object

For REST Services, a new optimized maintenance function is now available. This facility has the following capabilities:

  • Multiple Services in one definition. It is now possible to define multiple REST services in one registration. This reduces maintenance effort and the interfaces can be enabled and disabled at the Inbound Web Service level. Each REST Service is regarded as an operation on the Inbound Web Service.
  • Customizable URI for service. The URL used for the REST Service can be the same as, or different from, the operation.
  • Business Object Support. In past releases, Business Objects were not supported. In this release, there is limited support for Business Objects. Refer to the Release Notes and online documentation for clarification of the level of support.
  • Open API Support.  This release introduces Open API support for documenting the REST API.

For example, the new Inbound Web Services maintenance function for REST is as follows:

Example REST Inbound Web Service definition

Active REST Services are available to the REST execution engine.

Open API (OAS3) Support has been introduced which provides the following:

  • Documentation of the API in various formats. The REST based API is documented based upon the metadata stored in the product.
  • Ability to authorize Inbound Web Services directly in Open API. It is possible to authorize the API directly from the Open API documentation. Developers can check the API prior to making it active.
  • Multiple formats supported. Developers can view payloads in various formats including Model format.
  • Ability to download the API. You can download the API directly from the documentation in Open API format. This allows the API to be imported into development IDEs.
  • Ability to test inline. Active APIs can be tested directly from the documentation.
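Since an OAS3 document is plain JSON, a downloaded API can be inspected or fed into tooling with a few lines of code. The fragment below is a hand-written, minimal stand-in for a generated specification (the title, path and operation are invented, not actual product output):

```python
import json

# Minimal OAS3-shaped document; title, path and operation are invented.
spec = json.loads("""
{
  "openapi": "3.0.1",
  "info": {"title": "CM-GetAccount", "version": "1.0"},
  "paths": {
    "/cm/customer/CM-GetAccount": {
      "get": {"summary": "Read an account",
              "responses": {"200": {"description": "OK"}}}
    }
  }
}
""")

# Enumerate every documented operation (HTTP method + path).
operations = [
    (method.upper(), path)
    for path, item in spec["paths"].items()
    for method in item
]
print(operations)  # [('GET', '/cm/customer/CM-GetAccount')]
```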

The following are examples of the documentation:

API Header including Authorization (Note: Server URL is generic as this server is NOT active).

Open API Support - Authorization

Operation/API List:

Open API Support - URL List

Request API with inbuilt testing facility:

Open API Support - API Request

Response API with response codes:

Open API Support - API Response

Model Format:

Open API Support - Model List

For more information about REST support, refer to the online documentation or Web Services Best Practices (Doc Id: 2214375.1) from My Oracle Support.

Oracle Utilities Application Framework V4. Release

Mon, 2018-09-17 18:05

Oracle Utilities Application Framework V4. based products will be released over the coming months. As with past releases, the Oracle Utilities Application Framework has been enhanced with new and updated features for on-premise, hybrid and cloud implementations of Oracle Utilities products.

The Oracle Utilities Application Framework continues to provide a flexible and wide ranging set of common services and technology to allow implementations to meet the needs of their customers. The latest release provides a wide range of new and updated capabilities to reduce costs and introduce exciting new functionality. The products ship with a complete listing of the changes and new functionality, but here are some highlights:

  • Improved REST Support - The REST support for the product has been enhanced in this release. It is now possible to register REST Services in Inbound Web Services as REST. Inbound Web Services definitions have been enhanced to support both SOAP and REST Services. This has the advantage that the registration of integration is now centralized and the server URL for the services can be customized to suit individual requirements. It is now possible to register multiple REST Services within a single Inbound Web Services to reduce costs in management and operations. Execution of the REST Services has been enhanced to use the Registry as the first reference for a service. No additional deployment effort is necessary for this capability. A separate article on this topic will provide additional information.
  • Improved Web Registry Support for Integration Cloud Service - With the changes in REST and other integration changes, such as Categories and support for other adapters, the Web Service Catalog has been expanded to support registering REST and other services directly for use in the Oracle Integration Cloud.
  • File Access Adapter - In this release, a File Adapter has been introduced to allow implementations to parameterize all file integration, reducing the costs of managing file paths and easing the path to the Oracle Cloud. In Cloud implementations, an additional adapter is available to allow additional storage on the Oracle Object Storage Cloud to supplement cloud storage for Oracle Utilities SaaS solutions. The File Access Adapter includes an Extendable Lookup to define alias and physical location attributes. That lookup can then be used as an alias for file paths in Batch Controls, etc. A separate article on this topic will provide additional information.
  • Batch Start/End Date Time now part of Batch Instance Object - In past releases, the Batch Start and End Dates and times were located as data elements within the thread attributes. This made analysis harder to perform. In this release, these fields have been promoted as reportable fields directly on the Batch Instance Object for each thread. This will improve capabilities for reporting performance of batch jobs. For backward compatibility, these fields are only populated for new executions. The internal Business Service F1-GetBatchRunStartEnd has been extended to support the new columns and also detect old executions to return the correct values regardless.
  • New Level of Service Algorithms - In past releases, Batch Level Of Service required the building of custom algorithms for checking batch levels. In this release additional base algorithms for common scenarios like Total Run Time, Throughput and Error Rate are now provided for use. Additionally, it is now possible to define multiple Batch Level Of Service algorithms to model complex requirements. The Health Check API has been enhanced to return the Batch Level Of Service as well as other health parameters. A separate article on this topic will provide additional information.
  • Job Scope in DBMS_SCHEDULER interface - The DBMS_SCHEDULER Interface allowed for the specification of parameters at the Batch Control and Global levels as well as at runtime. In this release, it is possible to pre-define parameters within the interface at the Job level, allowing control of individual instances of Batch Controls that are used more than once across chains.
  • Ad-hoc Recalculation of To Do Priority - In a past release of the Oracle Utilities Application Framework, an algorithm to dynamically reassess and recalculate a To Do Priority was introduced. In this release, it is possible to invoke this algorithm in bulk using the newly provided F1-TDCLP Batch Control. This can be used with the algorithm to reassess To Do's to improve manual processing.
  • Introduction of a To Do Monitor Process and Algorithm - One of the issues with To Do's in the field has been that users can forget to manually close the To Do when the issue that caused the condition has been resolved. In this release, a new batch control F1-TDMON and a new Monitor algorithm on the To Do Type have been added so that logic can be introduced to detect that the underlying issue has been resolved and have the product automatically close the To Do.
  • New Schema Editor - Based upon feedback from partners and customers, the usability and capabilities of the Schema Editor have been improved to provide more information as part of the basic views to reduce rework and support cross browser development.
  • Process Flow Editor - A new capability has been added to the Oracle Utilities Application Framework to allow for complex workflows to be modeled and fully capable workflow introduced. This includes train support (including advanced navigation), saving incomplete work support, branching and object integration. This process flow editor was introduced internally successfully to use for our cloud automation in the Oracle Utilities Cloud Services Foundation and has now been introduced, in a new format, for use across the Oracle Utilities Application Framework based products. A separate article on this topic will provide additional information.
  • Improved Google Chrome Support - This release introduces extensive Google Chrome for Business support. Check the availability with each of the individual Oracle Utilities Application Framework based products.
  • New Cube Viewer - In the Oracle Utilities Market Settlements product, we introduced a new Cube Viewer to embed advanced analytics into our products. That capability has been made generic and is now included in the Oracle Utilities Application Framework so that products and implementations can build their own cube analytical capabilities. In this release, a series of new objects and ConfigTools objects have been introduced to build Cube Viewer based solutions. Note: The Cube Viewer has been built to operate independently of Oracle In-Memory Database support but would greatly benefit from use with it. A separate article on this topic will provide additional information.
  • Object Erasure Support - To support various data privacy regulations introduced across the world, a new Object Erasure capability has been introduced to manage the erasure or obfuscation of master objects within the Oracle Utilities Application Framework based products. This capability is complementary to the Information Lifecycle Management (ILM) capability introduced to manage transaction objects within the product. A number of objects and ConfigTools objects have been introduced to allow implementations to add Object Erasure to their implementations. A separate article on this topic will provide additional information.
  • Proactive Update ILM Switch Support - In past releases, ILM eligibility and the ILM switch were processed in bulk exclusively by the ILM batch processes or using the Automatic Data Optimization (ADO) feature of the Oracle Database. To work more efficiently, it is now possible to use the new BO Enter Status plug-in and BO Exit Status plug-in to proactively assess the eligibility and set the ILM switch as part of processing, thus reducing ILM workloads.
  • Mobile Framework Auto Deploy Support - This release includes a new optional parameter to automatically deploy mobile content when a deployment is saved. This can avoid the extra manual deployment step, if desired.
  • Required Indicator on Legacy Screens - In past releases, the required indicator, based upon metadata, was introduced for ConfigTools based objects. In this release, it has been extended to legacy screens built using the Oracle Utilities SDK or custom JSPs (that conform to the standards required by the Oracle Utilities Application Framework). Note: Some custom JSPs may contain logic that prevents the correct display of the required indicator.
  • Oracle Identity Manager integration improved - In this release the integration with Oracle Identity Manager has been improved with multiple adapters supported and the parameters are now located as a Feature Configuration rather than properties settings. This allows the integration setup to be migrated using Configuration Migration Assistant.
  • Outbound Message Mediator Improvements - In previous releases, implementations were required to use the Outbound Message Dispatcher (F1-OutmsgDispatcher) business service to send an outbound message without instantiating it in cases where the outbound message Business Object pre-processing algorithms needed to be executed. This business service orchestrated a creation and deletion of the outbound message, which is not desirable for performance reasons. The alternate business service Outbound Message Mediator (F1-OutmsgMediator) routes a message without instantiating anything, so it is preferred when the outbound message should not be instantiated. However, the Mediator did not execute the Business Object pre-processing algorithms. In this release, the Mediator business service has been enhanced to also execute the Business Object pre-processing algorithms.
  • Deprecations - In this release a few technologies and capabilities will be removed as they were announced in previous releases. These include:
    • XAI Servlet/MPL - After announcing the deprecation of XAI and MPL in 2012, the servlet and MPL software are no longer available in this release. XAI Objects are retained for backward compatibility and last minute migrations to IWS and OSB respectively.
    • Batch On WebLogic - In the Oracle Cloud, batch threadpools were managed under Oracle WebLogic. Given changes to the architecture over the last few releases, threadpools on Oracle WebLogic are no longer supported. As this functionality was never released for on-premise use, this change has no impact on on-premise customers.
    • WebLogic Templates - With the adoption of Oracle WebLogic 12.2+, custom WebLogic templates are no longer necessary. It is now possible to use the standard Fusion Middleware templates supplied with Oracle WebLogic with a few manual steps. These additional manual steps are documented in the new version of the Installation Guide supplied with the product. Customers may continue to use the Domain Builder supplied with Oracle WebLogic to build custom templates post Oracle Utilities Application Framework product installation. Customers should stop using the Native Installation or Clustering whitepaper documentation for Oracle Utilities Application Framework V4. and above, as this information is now inside the Installation Guide directly or the Oracle WebLogic 12.2.1.x Configuration Guide (Doc Id: 2413918.1) available from My Oracle Support.
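To illustrate the Batch Level Of Service idea listed above, the sketch below rates a batch execution against Total Run Time, Throughput and Error Rate thresholds. The thresholds, parameter names and the worst-of rule are all invented for illustration; the product's base algorithms and their parameters are defined in its own metadata.

```python
from datetime import datetime, timedelta

def level_of_service(start, end, records, errors,
                     max_run_time=timedelta(hours=1),  # illustrative thresholds
                     min_throughput=1000,              # records per hour
                     max_error_rate=0.01):
    """Rate one batch execution across three simple checks."""
    hours = (end - start).total_seconds() / 3600
    throughput = records / hours if hours else float("inf")
    error_rate = errors / records if records else 0.0
    healthy = (
        end - start <= max_run_time
        and throughput >= min_throughput
        and error_rate <= max_error_rate
    )
    return "Normal" if healthy else "Warning"

start = datetime(2018, 9, 17, 1, 0)
print(level_of_service(start, start + timedelta(minutes=30),
                       records=100_000, errors=20))  # → Normal
```

The promotion of the start and end date/times onto the Batch Instance Object is what makes metrics like run time and throughput directly reportable per thread.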

A number of additional articles going over some of these topics, as well as updates to key whitepapers, will be published over the next few weeks.

Oracle Utilities Testing Accelerator Whitepaper Updates

Wed, 2018-09-12 18:07

The Oracle Utilities Testing Accelerator Whitepaper has been updated with the latest information about the Testing capability optimized for Oracle Utilities.

The documentation is available at Oracle Utilities Testing Accelerator for Oracle Utilities (Doc Id: 2014163.1) from My Oracle Support.

The article includes the following updated documents:

  • Oracle Utilities Testing Accelerator Overview - Overview of the testing solution and how it optimizes the test experience.
  • Oracle Utilities Testing Accelerator Frequently Asked Questions - Set of common questions and answers about the Oracle Utilities Testing Accelerator including migration from the previous Oracle Application Testing Suite based solution.
  • Oracle Utilities Testing Accelerator Data Sheet (New) -  A brochure about the Oracle Utilities Testing Accelerator.

Oracle Utilities Testing Accelerator training is now available via Oracle University training on-demand.

Patches available for Internet Explorer 11 performance

Tue, 2018-08-07 21:45

A number of Oracle Utilities Customer Care And Billing customers have reported performance issues with Internet Explorer 11 in particular situations. After analysis, it was ascertained that the issue was within Internet Explorer itself. An article is available as Known UI Performance Issues on Internet Explorer 11 (Doc Id: 2430962.1) from My Oracle Support with an explanation of the issues and advice on the patches recommended to minimize the issue for affected versions.

It is highly recommended to read the article and install the patches to minimize any issues with Internet Explorer 11.

Keep up to Date With Critical Patches

Wed, 2018-08-01 20:39

One of the most important recommendations I give to customers is to keep up to date with the latest patches, especially all the security patches, to improve performance and reduce risk.

For more information, refer to the Critical Patch Updates and Security Alerts page on Oracle.com.

Note that Oracle WebLogic, Oracle Linux, Oracle Solaris and Oracle Database patches apply to Oracle Utilities products.

Using Groovy Whitepaper available

Sun, 2018-07-29 18:33

Groovy is an alternative language for building extensions for Oracle Utilities Application Framework based products for on-premise and cloud implementations. For Cloud implementations, it is the preferred language, replacing Java based extensions typically available for on-premise implementations. The implementation of Groovy in the Oracle Utilities Application Framework extends the scripting object to allow Groovy scripts, Groovy includes and Groovy libraries to be implemented. This is all controlled using a whitelist to ensure that the code is appropriate for cloud implementations.

A new whitepaper is available outlining the Groovy capability as well as some guidelines on how to use Groovy to extend Oracle Utilities products. It is available as Using Groovy Script in Oracle Utilities Applications (Doc Id: 2427512.1) from My Oracle Support.

New Oracle Utilities Testing Accelerator

Fri, 2018-06-29 14:02

I am pleased to announce the next chapter in automated testing solutions for Oracle Utilities products. In the past, some Oracle Utilities products have used Oracle Application Testing Suite with pre-built content to provide an amazing functional and regression testing solution. Building upon that success, a new solution named the Oracle Utilities Testing Accelerator has been introduced: a new, optimized and focused solution for Oracle Utilities products.

The new solution has the following benefits:

  • Component Based. As with Oracle's other testing solutions, this new solution is based upon testing components and flows with flow generation and databank support. Those capabilities were popular with our existing testing solution customers and exist in expanded forms in the new solution.
  • Comprehensive Content for Oracle Utilities. As with Oracle's other testing solutions, supported products provide pre-built content to significantly reduce the costs of adopting automation. In this solution, the number of products within the Oracle Utilities portfolio providing content has greatly expanded. This now includes both on-premise products as well as our growing portfolio of cloud based solutions.
  • Self Contained Solution.  The Oracle Utilities Testing Accelerator architecture has been simplified to allow customers to quickly deploy the product with the minimum of fuss and prerequisites.
  • Used by Product QA. The Oracle Utilities Product QA teams use this product on a daily basis to verify the Oracle Utilities products. This means that the content provided has been certified for use on supported Oracle Utilities products and reduces risk of adoption of automation.
  • Behavior-Driven Development Support. One of the most exciting capabilities introduced in this new solution is the support for Behavior-Driven Development (BDD), which is popular with newer Agile based implementation approaches. One of the major goals of the new testing capability is to reduce rework from the Agile process in the building of test assets. This new capability introduces Machine Learning into the testing arena for generating test flows from Gherkin syntax documentation produced by Agile approaches. A developer can reuse their Gherkin specifications to generate a flow quickly without the need for rework. As the capability uses Machine Learning, it can be corrected if the assumptions it makes are incorrect for the flow, and those corrections will be reused for any future flow generations. An example of this approach is shown below:

  • Selenium Based. The Oracle Utilities Testing Accelerator uses a Selenium based scripting language for greater flexibility across the different channels supported by the Oracle Utilities products. The script is generated automatically and does not need any alteration to be executed correctly.
  • Data Independence. As with Oracle's other testing products, data is supported independently of the flows and components. This translates into greater flexibility and greater levels of reuse in automated testing. It is possible to change data at any time during the process to explore greater possibilities in testing.
  • Support for Flexible Deployments. Whilst the focus of the Oracle Utilities Testing Accelerator is functional and/or regression testing, the solution can be deployed flexibly to suit a range of implementation scenarios.
  • Beyond Functional Testing. The Oracle Utilities Testing Accelerator is designed to be used for testing beyond just functional testing. It can be used to perform testing in flexible scenarios including:
    • Patch Testing. The Oracle Utilities Testing Accelerator can be used to assess the impact of product patches on business processes using the flows as a regression test.
    • Extension Release Testing. The Oracle Utilities Testing Accelerator can be used to assess the impact of releases of extensions from the Oracle Utilities SDK (via the migration tools in the SDK) or after a Configuration Migration Assistant (CMA) migration.
    • Sanity Testing. In the Oracle Cloud, the Oracle Utilities Testing Accelerator is being used to assess the state of a new instance of the product, including its availability and that the necessary data is set up, ensuring the instance is ready for use.
    • Cross Oracle Utilities Product Testing. The Oracle Utilities Testing Accelerator supports flows that cross Oracle Utilities product boundaries to model end to end processes when multiple Oracle Utilities products are involved.
    • Blue/Green Testing. In the Oracle Cloud, zero outage upgrades are a key part of the solution offering. The Oracle Utilities Testing Accelerator supports the concept of blue/green deployment testing to allow multiple versions to be able to be tested to facilitate smooth upgrade transitions.
  • Lower Skills Required. The Oracle Utilities Testing Accelerator has been designed with testing users in mind. Traditional automation involves recording with a scripting language that embeds the data and logic into a script, which a programmer can then alter to make it more flexible. The Oracle Utilities Testing Accelerator uses an orchestration metaphor to allow a lower skilled person, not a programmer, to build test flows and generate no-touch scripts to be executed.
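As a rough illustration of the Behavior-Driven Development capability described above, the toy code below extracts Given/When/Then steps from a Gherkin scenario so they could be matched to flow components. The scenario text and the simple regular expression are invented; the product uses Machine Learning to perform the matching, not a fixed parse like this.

```python
import re

# An invented Gherkin-style scenario.
feature = """
Scenario: Add a bill to an account
  Given an active account exists
  When the user adds a bill to the account
  Then the bill is displayed on the account
"""

# Pull out the step keyword and text from each step line.
step_pattern = re.compile(r"^\s*(Given|When|Then|And)\s+(.*)$")
steps = [m.groups() for line in feature.splitlines()
         if (m := step_pattern.match(line))]

for keyword, text in steps:
    print(f"{keyword:5s} -> candidate flow component for: {text}")
```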

An example of the Oracle Utilities Testing Accelerator Workbench:

New Architecture

The Oracle Utilities Testing Accelerator has been re-architected to be optimized for use with Oracle Utilities products:

  • Self Contained Solution. The new design is centered around simplicity. As much as possible, the solution is designed to run with minimal configuration.
  • Minimal Prerequisites. The Oracle Utilities Testing Accelerator only requires Java to execute and a database schema to store its data. Non-production allocations under existing Oracle Utilities product licenses are sufficient for this solution; no additional database licenses are required by default.
  • Runs on same platforms as Oracle Utilities applications. The solution is designed to run on the same operating system and database combinations supported with the Oracle Utilities products.

The architecture is simple:

UTA Architecture

  • Product Components. A library of components from the Product QA teams ready to use with the Oracle Utilities Testing Accelerator. You decide which libraries you want to enable.
  • Oracle Utilities Testing Accelerator Workbench. A web based design toolset to manage and orchestrate your test assets. Includes the following components:
    • Embedded Web Application Server. A preset simple configuration and runtime to house the workbench.
    • Testing Dashboard. A new home page outlining the state of the components and flows installed as well as notifications for any approvals and assets ready for use.
    • Component Manager. A Component Manager to allow you to add custom components and manage the components available for use in flows.
    • Flow Manager. A Flow Manager allowing testers to orchestrate flows and manage their lifecycle, including generation of Selenium assets for execution.
    • Script Management. A script manager used to generate scripts and databanks for flows.
    • Security. A role based model to support administration, development of components/flows and approvals of components/flows.
  • Oracle Utilities Testing Accelerator Schema. A set of database objects that can be stored in any edition of Oracle Database (PDB or non-PDB is supported) for storing assets and configuration.
  • Oracle Utilities Testing Accelerator Eclipse based Plug-in. An Oxygen compatible Eclipse plugin that executes the tests, including recording of performance and payloads for detailed test analysis.

New Content

The Oracle Utilities Testing Accelerator has expanded the number of products supported in this release and now includes Oracle Utilities Application Framework based products and Cloud Services products. New content will be released on a regular basis to provide additional coverage for components and a set of prebuilt flows that can be used across products.

Note: Refer to the release notes for supported Oracle Utilities products and assets provided.


The Oracle Utilities Testing Accelerator provides a comprehensive testing solution, optimized for Oracle Utilities products, with content provided by Oracle to allow implementations to realize lower cost and lower risk adoption of automated testing.

For more information about this solution, refer to the Oracle Utilities Testing Accelerator Overview and Frequently Asked Questions (Doc Id: 2014163.1) available from My Oracle Support.

Note: The Oracle Utilities Testing Accelerator is a replacement for the older Oracle Functional Testing Advanced Pack for Oracle Utilities. Customers on that product should migrate to this new platform. Utilities to convert any custom components from the Oracle Application Testing Suite platform are provided with this tool.

Updated Technical Best Practices

Tue, 2018-06-26 04:13

The Oracle Utilities Application Framework Technical Best Practices have been revamped and updated to reflect new advice, new versions and the cloud implementations of the Oracle Utilities Application Framework based products. The following is a summary of the changes:

  • Formatting change. The whitepaper uses a new template for the content which is being rolled out across Oracle products.
  • Removed out of date advice. Advice that applied to older versions and is no longer appropriate has been removed from the document. This is ongoing to keep the whitepaper current and optimal.
  • Added Configuration Migration Assistant advice. With the increased emphasis of the use of CMA we have added a section outlining some techniques on how to optimize the use of CMA in any implementation.
  • Added Optimization Techniques advice. With the implementation of the cloud, there are various techniques we use to reduce our costs and risks on that platform. We added a section outlining some common techniques that can be reused for on-premise implementations. This is based upon a series of talks given at customer forums over the last year or so.
  • Added Preparing Your Implementation for the Cloud advice. This is a new section outlining the various techniques that can be used to prepare an on-premise implementation for moving to the Oracle Utilities Cloud SaaS Services. This is based upon a series of talks given at customer forums over the last year or so.

The new version of the whitepaper is available at Technical Best Practices (Doc Id: 560367.1) from My Oracle Support.

Oracle WebLogic 12.2.1.x Configuration Guide for Oracle Utilities available

Thu, 2018-06-21 19:06

A new whitepaper is now available for use with Oracle Utilities Application Framework based products that support Oracle WebLogic 12.2.1.x and above. The whitepaper walks through the setup of the domain using the Fusion Domain Templates instead of the templates supplied with the product. In future releases of the Oracle Utilities Application Framework, the product specific domain templates will not be supplied, as the Fusion Domain Templates take a more prominent role in deploying Oracle Utilities products.

The whitepaper covers the following topics:

  • Setting up the Domain for Oracle Utilities products
  • Additional Web Services configuration
  • Configuration of Global Flush functionality in Oracle WebLogic 12.2.1.x
  • Frequently asked installation questions

The whitepaper is available as Oracle WebLogic 12.2.1.x Configuration Guide (Doc Id: 2413918.1) from My Oracle Support.

Oracle Utilities and the Oracle Database In-Memory Option

Tue, 2018-05-29 20:09

A few years ago, Oracle introduced an In-Memory option for the database to optimize analytical style applications. In Oracle Database 12c and above, the In-Memory option has been enhanced to support other types of workloads. All Oracle Utilities products are now certified to use the Oracle In-Memory option, on Oracle Database 12c and above, to allow customers to optimize the operational and analytical aspects of the products.

The Oracle In-Memory option is a memory based column store that co-exists with existing caching schemes used within Oracle to deliver faster access speeds for complex queries across the products. It is transparent to the product code and can be easily implemented with a few simple changes to the database to nominate the objects to store in memory. Once configured, the Oracle Cost Based Optimizer becomes aware of the data loaded into memory and adjusts the execution plan directly, delivering much better performance in almost all cases.

There are just a few configuration changes to make:

  • Enable the In-Memory Option. The In-Memory capability is already present in the database software (no relinking necessary) but it is disabled by default. After licensing the option, you enable it by setting the amount of the SGA you want to dedicate to the In-Memory store. Remember to ensure that the SGA is large enough to cover the existing memory areas as well as the In-Memory store. This is done by setting a few database initialization parameters.
  • Enable Adaptive Plans. To tell the optimizer to take the In-Memory store into account, enable Adaptive Plans. This is flexible: you can turn off In-Memory support later without changing any In-Memory settings.
  • Decide the Objects to Load into Memory. Once the In-Memory Option is enabled, the next step is to decide what is actually loaded into memory. Oracle provides an In-Memory Advisor that analyzes workloads and makes suggestions.
  • Alter Objects to Load into Memory. Issue SQL DDL statements to instruct the database to load the chosen objects into memory, including priority and compression options to maximize the flexibility of the option. The In-Memory Advisor can be configured to generate these statements from its analysis.
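
The statements involved are straightforward. As a sketch only (the SGA size, table name, and compression level below are illustrative, and exact parameter names can vary by database release):

```sql
-- Illustrative sketch only: sizes, names and options are examples.

-- 1. Reserve part of the SGA for the In-Memory column store (restart required).
ALTER SYSTEM SET inmemory_size = 4G SCOPE = SPFILE;

-- 2. Let the optimizer adapt execution plans to In-Memory data.
ALTER SYSTEM SET optimizer_adaptive_plans = TRUE;

-- 3. Load a chosen table into the In-Memory store, with a priority and a
--    compression level (the table name here is hypothetical).
ALTER TABLE cisadm.sample_table INMEMORY PRIORITY HIGH MEMCOMPRESS FOR QUERY LOW;

-- A table can later be excluded again without disabling the option:
ALTER TABLE cisadm.sample_table NO INMEMORY;
```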

No changes to the product code are necessary to use the option to speed up common queries and analytical queries in the products.

A new whitepaper, Implementing Oracle In-Memory Option (Doc Id: 2404696.1), is available from My Oracle Support. It outlines this process as well as specific guidelines for implementing the option.

PS. The Oracle In-Memory Option has been significantly enhanced in Oracle Database 18c.


Data Management with Oracle Utilities products

Mon, 2018-05-28 19:46

One of the most common questions I receive is how to manage data volumes in the Oracle Utilities products. The Oracle Utilities products are designed to scale no matter how much data is present in the database, but the storage costs and the effort of managing large amounts of data are obviously not optimal.

A few years ago we adopted the Information Lifecycle Management (ILM) capabilities of the Oracle Database as well as developed a unique spin on the management of data. Like biological life, data has a lifecycle. It is born when it is created, it has an active life while the business uses or manipulates it, it goes into retirement but is still accessible and eventually it dies when it is physically removed from the database. The length of that lifecycle will vary from data type to data type, implementation to implementation. The length of the life is dictated by its relevance to the business, company policies and even legal or government legislation.

The data management (ILM) capabilities of Oracle Utilities take this into account:

  • Data Retention Configuration. The business configures how long the active life of each individual data type is for their business. This defines what is called the Active Period. This is when the data needs to be in the database and accessible to the business for update and active use.
  • ILM Eligibility Rules. Once the data retention period is reached, before the data can enter retirement, the system needs to know that anything outstanding, from a business perspective, has been completed. This is the major difference from most data management approaches. I hear DBAs saying that they would rather the data was simply deleted after a specific period. Whilst that would cover most situations, it would not cover the situation where the business is not finished with the data. Let's explain with an example. In CCB, customers are billed, and you can also record complaints against a bill if there is a dispute. Depending on the business rules and legal processes, an old bill may still be in dispute, and you should not remove anything related to that bill until the complaint is resolved, regardless of its age. Legal issues can be drawn out for lots of reasons. If you used a retention rule only, the data involved in the complaint could be lost. In the same situation, the base ILM Eligibility rules would detect something outstanding and bypass the applicable records. Remember, these rules protect the business and ensure that the ILM solution adheres to its complex rules.
  • ILM Features in the Database. Oracle, like a lot of vendors, introduced ILM features into the database to help, what I like to call, storage-manage the data. This provides a flexible set of options and features giving database administrators a full range of possibilities for their data management needs. Here are the capabilities (refer to the Database Administration Guide for details of each):
    • Partitioning. One of the most common capabilities is the Partitioning option. This allows a large table to be split up, storage-wise, into parts or partitions using a partitioned tablespace. This breaks the table into manageable pieces and allows the database administrator to optimize the storage using hardware and/or software options. Some hardware vendors have inbuilt ILM facilities, and this option allows you to target specific data partitions to different hardware capabilities or simply to split the data into tranches (for example, to separate the retirement stages of data). Partitioning is also a valid option if you want to use tiered hardware storage solutions to save money: in this scenario you would put the less-used data on cheaper storage (if you have it) to save costs. For partitioning advice, refer to the product DBA Guides, which outline the most common partitioning schemes used by customers.
    • Advanced Compression. One of the popular capabilities is the Advanced Compression option. This allows administrators to set compression rules against the database based upon data usage. The compression is transparent to the product, and compressed data can be co-located with uncompressed data with no special processing needed by the code. The compression covers a wide range of techniques, including CLOB compression as well as data compression. Customers using Oracle Exadata can also use Hybrid Columnar Compression (HCC) for hardware-assisted compression for greater flexibility.
    • Heat Map. One of the features added in Oracle Database 12c and above to help DBAs is the Heat Map. This is a facility where the database tracks the usage patterns of the data and gives you feedback on the activity of individual rows. This is an important tool, as it helps the DBA identify which data is actually being used by the business and determine what is worth optimizing. It is even useful in the active period for determining which data can be safely compressed because it has little update activity against it. Heat Map is also part of the autonomous capabilities of the database.
    • Automatic Data Optimization. Automatic Data Optimization (ADO) is a feature of the database that allows database administrators to implement rules to manage storage based upon various metrics, including the Heat Map. For example, the DBA can put in a rule that says if data in a specific table is not touched for X months, then it should be compressed. The rules cover compression, partition movement, storage features, etc., and can be triggered by the Heat Map or any other valid metric (even SQL procedure code can be used).
    • Transportable Tablespaces. One of the most expensive things you can do in a database is issue a DELETE statement. To avoid this in bulk in any ILM-based solution, Oracle offers the ability to use the Partitioning option to create a virtual trash bin via a transportable tablespace. Using ADO or other capabilities you can move data into this tablespace and then, using basic commands, detach the tablespace to perform bulk removal quickly. An added advantage is that you can archive that tablespace and reconnect it later if needed.
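
To illustrate how the Heat Map and ADO capabilities above fit together, a database administrator might define policies like the following (the table name, tablespace, and durations are hypothetical, and the exact policy syntax depends on the database release):

```sql
-- Illustrative sketch only: names and durations are examples.

-- Heat Map must be enabled for usage-based policies to fire.
ALTER SYSTEM SET heat_map = ON;

-- Compress rows once the Heat Map shows no modification for 12 months.
ALTER TABLE cisadm.sample_table ILM ADD POLICY
  ROW STORE COMPRESS ADVANCED
  SEGMENT AFTER 12 MONTHS OF NO MODIFICATION;

-- Move cold segments to a cheaper "trash bin" tablespace after 5 years of
-- no access; that tablespace can later be detached as a transportable
-- tablespace for bulk removal or archival.
ALTER TABLE cisadm.sample_table ILM ADD POLICY
  TIER TO ilm_archive_ts
  SEGMENT AFTER 60 MONTHS OF NO ACCESS;
```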

The Oracle Utilities ILM solution is comprehensive and flexible, combining a business aspect, where the business defines its retention and eligibility rules, with the various ILM capabilities of the database, where the database administrator factors in the individual site's hardware and support policies. It is rarely as simple as just removing data, and the Oracle Utilities ILM solution reduces the risk of managing your data, taking into account both your business and storage needs.

For more information about the Oracle Utilities ILM solution, refer to the ILM Planning Guide (Doc Id: 1682436.1) available from My Oracle Support and read the product DBA Guides for product specific advice.

EMEA Edge Conference 2018

Wed, 2018-05-23 19:41

I will be attending the EMEA Oracle Utilities Edge Conference on 26-27 June 2018 in the Oracle London office. This year we are running an extended set of technical sessions covering both on-premise implementations and the Oracle Utilities Cloud Services. This forum is open to Oracle Utilities customers and Oracle Utilities partners.

The sessions mirror the technical sessions for the conference in the USA held earlier this year with the following topics:

  • Reducing Your Storage Costs Using Information Life-cycle Management. Maintaining storage while satisfying business data retention rules can be increasingly costly and challenging. The Oracle Information Life-cycle Management solution can help simplify your storage solution and harness the power of the hardware and software to reduce storage costs.
  • Integration using Inbound Web Services and REST with Oracle Utilities. Integration is a critical part of any implementation. The Oracle Utilities Application Framework has a range of facilities for integrating from and to other applications. This session will highlight all the facilities and where each is best suited to be used.
  • Optimizing Your Implementation. Implementations have a wide range of techniques available to implement successfully. This session will highlight a group of techniques that have been used by partners and our cloud implementations to reduce Total Cost Of Ownership.
  • Testing Your On-Premise and Cloud Implementations. Our Oracle Testing solution is popular with on-premise implementations. This session will outline the current testing solution as well as our future plans for both on premise and in the cloud.
  • Securing Your Implementations. With the increase in cybersecurity and privacy concerns in the industry, a number of key security enhancements have been made available in the product to support simple or complex security setups for on-premise and cloud implementations.
  • Turbocharge Your Oracle Utilities Product Using the Oracle In-Memory Database Option. The Oracle Database In-Memory option allows both OLTP and analytics to run much faster using advanced techniques. This session will outline the capability and how it can be used in existing on-premise implementations to provide superior performance.
  • Developing Extensions using Groovy. Groovy has been added as a supported language for on-premise and cloud implementations. This session outlines the ways Groovy can be used in building extensions. Note: This session will be very technical in nature.
  • Ask Us Anything Session. Interaction with the customer and partner community is key to the Oracle Utilities product lines. This interactive session allows you (the customers and partners) to ask technical resources within Oracle Utilities the questions you would like answered. The session will also allow Oracle Utilities to discuss directions and poll the audience on key initiatives to help plan road maps.

Note: These sessions are not recorded or materials distributed outside this forum.

This year we have decided to not only discuss capabilities but also give an idea of how we use those facilities in our own cloud implementations to reduce our operating costs, for you to use as a template for on-premise and hybrid implementations.

See you there if you are attending.

If you wish to attend, contact your Oracle Utilities local sales representative for details of the forum and the registration process.

Reflecting Changes in Business Objects in UI Tables with Visual Builder

Mon, 2018-05-21 13:14

While the quick start wizards in Visual Builder Cloud Service (VBCS) make it very easy to create tables and other UI components and bind them to business objects, it is good to understand what is going on behind the scenes, and what the wizards actually do. Knowing this will help you achieve things that we still don't have wizards for.

For example - let's suppose you created a business object and then created a UI table that shows the fields from that business object in your page, probably using the "Add Data" quick start wizard. But then you remember that you need one more column in your business object; however, after you add it to the BO, you'll notice it is not automatically shown in the UI. That makes sense, since we don't want to automatically show all the fields of a BO in the UI.

But how do you add this new column to the UI?

The table's Add Data wizard will be disabled at this point - so is your only option to drop and recreate the UI table? Of course not!


If you look into the table properties you'll see it is based on a page-level ServiceDataProvider (SDP for short) variable. This is a special type of object that the wizards create to represent collections. If you look at the variable, you'll see that it returns data using a specific type. Note that the type is defined at the flow level; if you look at the type definition you'll see where the fields that make up the object are defined.

Type Definition

It is very easy to add a new field here and modify the type to include the new column you added to the BO. Just make sure you are using the column's id, and not its title, when you define the new field in the items array.

Now back in the UI you can easily modify the code of the table to add one more column that will be hooked up to this new field in the SDP that is based on the type.

Sounds complex? It really isn't - here is a 3 minute video showing the whole thing end to end:

As you see - a little understanding of the way VBCS works, makes it easy to go beyond the wizards and achieve anything.

European Privacy Requirements: Considerations for Retailers

Mon, 2018-05-21 11:52

When retailers throughout Europe adopt a new set of privacy and security regulations this week, it will be the first major revision of data protection guidelines in more than 20 years. The 2018 regulations address personal as well as financial data, and require that retailers use systems already designed to fulfill these protections by default.

In 1995, the European Commission adopted a Data Protection Directive that regulates the processing of personal data within the European Union. This gave rise to 27 different national data regulations, all of which remain intact today. In 2012, the EC announced that it would supersede these national regulations and unify data protection law across the EU by adopting a new set of requirements called the General Data Protection Regulation (GDPR).

The rules apply to any retailer selling to European consumers. The GDPR, which takes effect May 25, 2018, pertains to any company doing business in, or with citizens of, the European Union, and to both new and existing products and services. Organizations found to be in violation of the GDPR will face a steep penalty of 20 million euros or four percent of their gross annual revenue, whichever is greater.

Retailers Must Protect Consumers While Personalizing Offers

GDPR regulations will encompass personal as well as financial data, including much of the data found in a robust customer engagement system, CRM, or loyalty program. It also includes information not historically considered to be personal data: device IDs, IP addresses, log data, geolocation data, and, very likely, cookies.

For the majority of retailers relying on customer data to personalize offers, it is critically important to understand how to fulfill GDPR requirements and execute core retail, customer, and marketing operations. Developing an intimate relationship with consumers and delivering personalized offers means tapping into myriad data sources.

This can be done, but systems must be GDPR-compliant by design and by default. A key concept underlying the GDPR is Privacy by Design (PBD), which essentially stipulates that systems be designed to minimize the amount of personal data they collect. Beginning this week, Privacy by Design features will become a regulatory requirement for both Oracle and our customers and GDPR stipulates that these protections are, by default, turned on.

Implementing Security Control Features

While the GDPR requires “appropriate security and confidentiality,” exact security controls are not specified. However, a number of security control features are discussed in the text and will likely be required for certain types of data or processing. Among them are multi-factor authentication for cloud services, customer-configurable IP whitelisting, granular access controls (by record, data element, data type, or logs), encryption, anonymization, and tokenization.

Other security controls likely to be required are “separation of duties” (a customer option requiring two people to perform certain administrative tasks); customer options for marking some fields as sensitive and restricted; limited access on the part of the data controller (i.e. Oracle) to customer information; displaying only a portion of a data field; and the permanent removal of portions of a data element.

Summary of Critical GDPR Requirements

The GDPR includes a number of recommendations and requirements governing users’ overall approach to data gathering and use. Among the more important are:

  • Minimization. Users are required to minimize the amount of data used, length of time it is stored, the number of people who have access to it, and the extent of that access.
  • Retention and purging. Data may be retained for only as long as reasonably necessary. This applies in particular to personal data, which should be processed only if the purpose of processing cannot reasonably be fulfilled by other means. Services must delete customer data on completion of the services.
  • Exports and portability. End users must be provided with copies of their data in a structured, commonly used digital format. Customers will be required to allow end users to send data directly to a competing service provider for some services.
  • Access, correction, and deletion. End users may request access to, correction of, and deletion of data they store in any service. Users may have a "right to be forgotten": a right to have all their data erased.
  • Notice and consent. When information is collected, end-user notice and consent for data processing is generally required.
  • Backup and disaster recovery. Timely availability of end-user data must be ensured.

Are you prepared?

Oracle is prepared for the EU General Data Protection Regulation (GDPR) that was adopted by the European Parliament in April 2016 and will become effective on May 25, 2018. We welcome the positive changes it is expected to bring to our service offerings by providing a consistent and unified data protection regime for businesses across Europe. Oracle is committed to helping its customers address the GDPR’s new requirements that are relevant to our service offerings, including any applicable processor accountability requirements.

Our customers can rest assured that Oracle Retail’s omnichannel suite will empower them to continue delivering personalized customer experiences that meet complex global data privacy regulations. Contact Oracle Retail to learn more about Oracle systems, services and GDPR compliance.





New Oracle E-Business Suite Person Data Removal Tool Now Available

Mon, 2018-05-21 10:27

Oracle is pleased to announce the availability of the Oracle E-Business Suite Person Data Removal Tool, designed to remove (obfuscate) data associated with people in E-Business Suite systems. Customers can apply the tool to select information in their E-Business Suite production systems to help address internal operational and external regulatory requirements, such as the EU General Data Protection Regulation (GDPR).

For more details, see:

DP World Extends Strategic Collaboration with Oracle to Accelerate Global Digital ...

Mon, 2018-05-21 09:56

Global trade enabler DP World has extended its partnership with Oracle to implement its digital transformation programme that supports its strategy to develop complementary sectors in the global supply chain such as industrial parks, free zones and logistics. 


Suhail Al Banna, Senior Vice President, DP World, Middle East and Africa Region; Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle; Mohammed Al Muallem, CEO and Managing Director, DP World, UAE Region and CEO, JAFZA.



The move follows an announcement by DP World earlier this year that it would use the Oracle Cloud Suite of Applications to drive business transformation. Oracle Consulting will now implement the full suite of Fusion Enterprise Resource Planning (ERP), Human Capital Management (HCM) and Enterprise Performance Management (EPM) Cloud solutions using its True Cloud methodology. The technology roll-out across the Group has already started, with the Group’s UAE Region and Middle East and Africa Region the first to sign up.

Teo Chin Seng, Senior Vice President IT, DP World Group, said: “Our focus on building our digital capability follows our vision to become a digitised global trade enabler, and we are working to achieve a new level of operational efficiency while creating value for our stakeholders.”

Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle, said: “Following the recent announcement of our strategic partnership to help DP World drive its global digital transformation with our best-in-class Cloud Suite of Applications (SaaS), we are proud to extend our collaboration by leveraging the deep expertise of Oracle Consulting to drive this large scale project. We are confident that this strategic cloud deployment will help them deliver the next level of innovation and differentiation.”

The Oracle Consulting team is focused exclusively on Oracle Cloud solutions and is staffed with more than 7,000 experts in 175 countries serving more than 20 million users, helping organizations implement Oracle Cloud in an efficient and cost-effective manner.


Further press releases Oracle Middle East Newsroom 

If You Are Struggling With GDPR, Then You Are Not Alone

Mon, 2018-05-21 08:00

Well, it's only 5 days to go until the infamous GDPR deadline of 25th May 2018 and you can certainly see the activity accelerating.

You would have thought that with the deadline so close, most organisations would be sat back, relaxing, safe in the knowledge that they have had 2 years to prepare for GDPR, and therefore, are completely ready for it. It's true, some organisations are prepared and have spent the last 24 months working hard to meet the regulations. Sadly, there are also a significant proportion of companies who aren't quite ready. Some, because they have left it too late. Others, by choice.

Earlier this week I had the pleasure of being invited to sit on a panel discussing GDPR at Equinix's Innovation through Interconnection conference in London.

As with most panels, we had a very interesting discussion, talking about all aspects of GDPR including readiness, data sovereignty, healthcare, the role of Cloud, and the dreaded Brexit!

I have written before about GDPR, but this time I thought I would take a bit of time to summarise three of the more interesting discussion topics from the panel, particularly areas where I feel companies are struggling.

Are you including all of the right personal data?

There is a clear recognition that an organisation's customer data is in scope for GDPR. Indeed, my own personal email account has been inundated with opt-in consent emails from loads of companies, many of whom I had forgotten even had my data. Clearly, companies are making sure that they are addressing GDPR for their customers. However, I think there is a general concern that some organisations are missing some of the data, especially internal data such as that of their employees. HR data is just as important when it comes to GDPR, and I see some companies paying far less attention to this area than to their customers' data.

Does Cloud help or hinder GDPR compliance?

A lot was discussed on the panel around the use of cloud. Personally, I think that cloud can be a great enabler, taking away some of the responsibility and overhead of implementing security controls, processes, and procedures and allowing the Data Processor (the Cloud Service Provider) to bring all of their experience, skill and resources into delivering you a secure environment. Of course, the use of Cloud also changes the dynamic. As the Data Controller, an organisation still has plenty of its own responsibility, including for the data itself. Therefore, putting your systems and data into the Cloud doesn't allow you to wash your hands of the responsibility. However, it does allow you to focus on your smaller, more focused areas of responsibility. You can read more about shared responsibility from Oracle's CISO, Gail Coury, in this article. Of course, you need to make sure you pick the right cloud service provider to partner with. I'm sure I must have mentioned before that Oracle does Cloud and does it extremely well.

What are the real challenges customers are facing with GDPR?

I talk to lots of customers about GDPR, and my observations were acknowledged during the panel discussion. Subject access rights are causing lots of headaches. To put it simply, I think we can break GDPR down into two main areas: Information Security and Subject Access Rights. Organisations have been implementing Information Security for many years (to varying degrees), especially if they have been subject to other legislation like PCI, HIPAA, SOX, etc. However, whilst the UK Data Protection Act has always had principles around data subjects, GDPR really brings them front and centre. Implementing many of the principles associated with data subjects, i.e. me and you, can mean changes to applications, implementing new processes, identifying sources of data across an organisation, etc. None of this is proving simple.

On a similar theme, responding to subject access rights due to this spread of data across an organisation is worrying many company service desks, concerned that come 25th May, they will be inundated with requests they cannot fulfil in a timely manner.

Oh and of course, that's before you even get to paper-based and unstructured data, which is proving to be a whole new level of challenge.

I could continue, but the above 3 areas are some of the main topics I am hearing over and over again with the customers I talk to. Hopefully, everyone has realised that there is no silver bullet for achieving GDPR compliance, and, for those companies who won't be ready in 5 days time, I hope you at least have a strong plan in place.

Experience, Not Conversion, is the Key to the Switching Economy

Mon, 2018-05-21 08:00

In a world increasingly defined by instant-gratification, the demand for positive and direct shopping experiences has risen exponentially. Today’s always-on customers are drawn to the most convenient products and services available. As a result, we are witnessing higher customer switching rates, with consumers focusing more on convenience than on branding, reputation, or even on price.  

In this switching economy – where information and services are always just a click away –  we tend to reach for what suits our needs in the shortest amount of time. This shift in decision making has made it harder than ever for businesses to build loyalty among their customers and to guarantee repeat purchases. According to recent research, only 1 in 5 consumers now consider it a hassle to switch between brands, while a third would rather shop for better deals than stay loyal to a single organization. 

What's Changed? 

The consumer mindset, for one. And the switching tools available to customers have also changed. Customers now have the ability to research extensively before they purchase, with access to reviews and price comparison sites often meaning that consumers don’t even make it to your website before being captured by a competitor. 

This poses a serious concern for those brands that have devoted their time – and marketing budgets – to building great customer experiences across their websites. 

Clearly this is not to say that on-site experiences aren’t important, but rather that they are only one part of the wider customer journey. In an environment as complex and fast moving as the switching economy, you must look to take a more omnichannel approach to experience, examining how your websites, mobile apps, customer service teams, external reviews and in-store experiences are all shaping the customers’ perceptions of your brand. 

What Still Needs to Change?

Only by getting to know your customers across all of these different channels can you future-proof your brand in the switching economy. To achieve this, you must establish a new set of metrics that go beyond website conversion. The days of conversion optimization being viewed as the secret sauce for competitive differentiation are over; now brands must recognize that high conversion rates are not necessarily synonymous with a great customer experience – or lifetime loyalty. 

Today, the real measure of success does not come from conversion, but from building a true understanding of your customers – across every touchpoint in the omnichannel journey. Through the rise of experience analytics, you finally have the tools and technologies needed to understand customers in this way, and to tailor all aspects of your brand to maximize convenience, encourage positive mindsets and pre-empt when your customers are planning to switch to a different brand. 

It is only through this additional layer of insight that businesses and brands will rebuild the notion of customer loyalty, and ultimately, overcome the challenges of the switching economy. 

Want to learn more about simplifying and improving the customer experience? Read Customer Experience Simplified: Deliver The Experience Your Customers Want to discover how to provide customer experiences that are managed as carefully as the product, the price, and the promotion of the marketing mix.


See What Your Guests Think with Data Visualization

Mon, 2018-05-21 06:00

As we approach the end of May, thoughts of summer and vacations begin. Naturally, a key component is finding the best place to stay and often that means considering the hotel options at your chosen destination. But what’s the best way to decide? That’s where reading reviews is so important.   

And that brings us to the latest blog in the series of taking datasets from ‘less typical’ sources and analyzing them with Oracle Data Visualization. Here, we’ve pulled hotel reviews into a dataset and visualized it to see how we – the general public – rate the hotels we stay in.

Working with Ismail Syed, pre-sales intern, and Harry Snart, pre-sales consultant, both from Oracle UK, we ran the analysis and created visualizations. We decided to look at the most common words used in both positive and negative reviews, see how long each of them is – and work out which countries are the most discerning when they give their feedback. 

So, what are the main irritations when we go away? Conversely - what's making a good impression?

Words of discontent

First, we wanted to combine the most commonly used words in a positive review with those most likely used in a negative review. You can see these in the stacked bar chart below. Interestingly, 'room' and 'staff' both appear in the positive and negative comments list. However, there are far more positive reviews around staff than negative ones, and likewise a lot more negative reviews around the room than positive reviews.
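The analysis above was built in Oracle Data Visualization, but the underlying word-frequency comparison is simple to sketch. Here is a minimal, hypothetical illustration in Python – the review texts are invented stand-ins, not the blog's actual dataset:

```python
from collections import Counter

# Hypothetical sample reviews standing in for the scraped dataset.
positive_reviews = [
    "friendly staff and great breakfast",
    "staff were helpful and the room was lovely",
]
negative_reviews = [
    "tiny room and a poor breakfast",
    "room was small and the bathroom dated",
]

def word_counts(reviews):
    """Count word occurrences across a list of reviews."""
    counts = Counter()
    for review in reviews:
        counts.update(review.lower().split())
    return counts

pos = word_counts(positive_reviews)
neg = word_counts(negative_reviews)

# Words that appear in both lists - like 'room' and 'staff' in the
# blog's data - can then be compared side by side.
shared = set(pos) & set(neg)
for word in sorted(shared):
    print(word, pos[word], neg[word])
```

In practice you would also strip punctuation and filter out stop words ('and', 'the', etc.) before charting the counts in a stacked bar.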

It seems, then, that across the board guests rate customer service more highly than the standard of the rooms they receive – implying that an effective way to boost client retention would be to start by improving rooms. The small size of the rooms drew particular complaints; that's a tough fix. But guests were more upset about the standard of the beds, bathrooms and toilets, which can be updated rather more easily.

You’ll also notice 'breakfast' appears prominently in both the positive and negative word clouds – so a more achievable fix could be to start there. A bad breakfast can leave a bad taste, but a good one is obviously remembered. 

Who’ll give a good review?

Next, we wanted to see who the most complimentary reviewers were, by nationality. While North Americans, Australians and Kyrgyz (highlighted in green) tend to leave the most favorable reviews, hotels have a harder time impressing those from Madagascar, Nepal and Mali (in red). Europeans sit somewhere in the middle – except for reviewers from Bosnia and Herzegovina, who like to leave an upbeat review.

Next, we wanted to see who is the most verbose in their feedback – the negative reviewers or the positive reviewers – and which countries leave the longest posts.

Are shorter reviews sweeter?

Overall, negative reviews were slightly longer, but only by a small amount – contrary to the popular belief that we tend to ‘rant’ more when we’re perturbed about something. People from Trinidad and Tobago left the longest good reviews, at an average of 29 words. Those from Belarus, the USA and Canada followed as the wordiest positive reviewers. On the flip side, the Romanians, Swedish, Russians and Germans had a lot to say about their bad experiences – leaving an average of 22 words showing their displeasure.
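The length comparison above amounts to averaging word counts per nationality and sentiment. A minimal sketch, again using invented sample rows rather than the blog's real data:

```python
# Hypothetical (nationality, review text, sentiment) rows standing in
# for the scraped review dataset.
reviews = [
    ("Trinidad and Tobago",
     "absolutely wonderful stay with superb staff and a great pool area",
     "positive"),
    ("Germany",
     "the room was cold and the shower barely worked at all",
     "negative"),
    ("Germany",
     "breakfast ran out early and reception was slow",
     "negative"),
]

def average_length(rows, sentiment):
    """Mean word count per nationality for reviews of one sentiment."""
    totals = {}
    for country, text, s in rows:
        if s != sentiment:
            continue
        words, n = totals.get(country, (0, 0))
        totals[country] = (words + len(text.split()), n + 1)
    return {country: words / n for country, (words, n) in totals.items()}

print(average_length(reviews, "negative"))
```

Sorting the resulting averages per sentiment is what surfaces the wordiest reviewers on each side.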

It's business, but also personal...

Clearly data visualization doesn't just need to be a tool for the workplace; you can deploy it to gain insight into other aspects of life as well – including helping you prepare for some valuable time off.

If you’re an IT leader in your organization and need to enable insights for everyone across the business, you should consider a complete, connected and collaborative analytics platform like Oracle Analytics Cloud. Why not find out a bit more and get started for free?

Simply interested in visual analysis of your own data? Why not see what you can find out by taking a look at our short demo and signing up for an Oracle Data Visualization trial?

Either way, make sure you and your business take a vacation from spreadsheets and discover far more from your data through visualization.