Feed aggregator

Internal Audit Policies

Dominic Brooks - Tue, 2017-06-27 13:23

The Oracle RDBMS market has long been a-shrinking.

A future as an Oracle specialist increasingly looks to be past its shelf-life.

Of course, having spent many years as such you will have many transferable skills anyway, transferable to other products and languages within administration, build, performance, troubleshooting or design and development. And if you’re lucky, to a number of these. It’s highly likely that you’ve not been a monogamous technologist anyway.

However, what frustrates me, I mean really, really frustrates me, is internal audit policies.

Oracle is filled with plenty of fabulous features.

But when I can’t use a significant number of those then it makes me mad… to the extent that I see this all coming to an end prematurely. Unnecessarily.

I can’t use DBMS_PARALLEL_EXECUTE.

Fabulous feature.

From http://docs.oracle.com/database/122/ARPLS/DBMS_PARALLEL_EXECUTE.htm#ARPLS67331 :

To execute chunks in parallel, the user must have CREATE JOB system privilege.
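For anyone who hasn’t used it, here is a minimal sketch of the sort of chunked update it enables (the schema, table and update statement are all hypothetical):

DECLARE
  l_task VARCHAR2(30) := 'bulk_update_demo';
  l_sql  VARCHAR2(1000);
BEGIN
  DBMS_PARALLEL_EXECUTE.CREATE_TASK(task_name => l_task);

  -- Carve the target table into rowid ranges of roughly 10,000 rows
  DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
    task_name   => l_task,
    table_owner => 'APP_SCHEMA',   -- hypothetical
    table_name  => 'BIG_TABLE',    -- hypothetical
    by_row      => TRUE,
    chunk_size  => 10000);

  -- Each chunk runs as a scheduler job, hence the CREATE JOB requirement
  l_sql := 'UPDATE app_schema.big_table
               SET some_col = some_col * 2
             WHERE rowid BETWEEN :start_id AND :end_id';

  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name      => l_task,
    sql_stmt       => l_sql,
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 8);

  DBMS_PARALLEL_EXECUTE.DROP_TASK(task_name => l_task);
END;
/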

Well, the application schema isn’t allowed CREATE JOB (or access to DBMS_SCHEDULER) because it’s against the internal audit policy.

(And I also hate it when I’m told that “anyway DBMS_JOB is deprecated” when it’s not, at least not until 12.2, and even then the deprecation is dubious because of transactionality, or rather the continued lack thereof in the replacement.)

If I’m granted CREATE JOB then apparently I’m, or rather my application is, likely to suddenly go bonkers creating jobs left, right and centre when I should be using standard job infrastructure like Autosys or Control-M or whatever you use. (Even though I wouldn’t, because I’m responsible for what I and my application do. And if I do irresponsible things, in general I expect sanctions and consequences.)

I can make a very reasonable and well-reasoned argument as to why I should be using DBMS_PARALLEL_EXECUTE but ultimately I’m arguing with a checklist, even if there is a Jobsworth holding it.

Bottom line is that if I want to do multi-threaded work like this via DBMS_PARALLEL_EXECUTE, I can’t. But if I want to achieve the same thing in one thousand Java threads, that is perfectly fine. This is just not a level playing field!!!

My reasonable response is: OK, but can’t the DBAs provide a controlled developer toolkit of wrappers around such policed functionality?

To which the answer is also no. Because then the DBAs would potentially be responsible for any issues arising out of that usage.

The internal audit policy is largely driven by external policy coming from generic regulators and auditors (so I am told regularly although I don’t have direct access to the external policy).

The bank is being subjected to severe regulation and those regulations cannot be put on hold whilst alternative solutions are sought. Regulators and auditors have no concern when it comes to the lower level problems that are caused when they see high level gaps that need to be closed.

Yeah… so the application has to deal with the consequences even when the policy is wrong. Sorry. It’s wrong. Or it’s poorly implemented.

The list of what I can and can’t do (not me personally, remember; this is the main application schema, not me or a representation of me) grows all the time.

This DBMS_PARALLEL_EXECUTE and CREATE JOB issue is actually an old wound re-opened by the refusal to allow use of an application context / SYS_CONTEXT.

The application needs CREATE ANY CONTEXT.

No, this is an elevated privilege. Not allowed.

Why is it an elevated privilege? Oh you mean the ANY word? That’s sort of misleading.

Checklist says no.

It’s not the same as CREATE ANY TABLE which allows you to create a table in ANY schema. Contexts are global. Security for actually setting the context value is via a named schema.package.
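To illustrate why the ANY is misleading (all names hypothetical): the context is tied at creation time to one named package, and only that package can set its values; everything else can merely read them.

CREATE OR REPLACE CONTEXT app_ctx USING app_schema.app_ctx_pkg;

CREATE OR REPLACE PACKAGE app_ctx_pkg AS
  PROCEDURE set_dept(p_dept_id IN NUMBER);
END app_ctx_pkg;
/

CREATE OR REPLACE PACKAGE BODY app_ctx_pkg AS
  PROCEDURE set_dept(p_dept_id IN NUMBER) IS
  BEGIN
    -- DBMS_SESSION.SET_CONTEXT raises ORA-01031 if called from
    -- anywhere other than the package named in the context definition
    DBMS_SESSION.SET_CONTEXT('APP_CTX', 'DEPT_ID', p_dept_id);
  END set_dept;
END app_ctx_pkg;
/

-- Elsewhere the value is read-only:
SELECT * FROM app_schema.orders
 WHERE dept_id = SYS_CONTEXT('APP_CTX', 'DEPT_ID');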

Checklist says no.

So the application can’t use VPD or any other worthy use case for application contexts.

Checklist says no.

So, can the DBA create the context for the application? Or provide some wrapper or toolkit to create it.

Checklist says no.

Um… ok. So that’s it then.

It’s just plain bonkers.

These are just two examples of many.

I’ve had many a battle over the “can’t grant a privilege directly to a schema, has to be via a role” conversation. Which then leads to explaining to the DBA how PL/SQL works: definer’s-rights stored code doesn’t see privileges granted via roles, so direct grants are required. Oh so tedious.
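For the uninitiated, a minimal sketch of the problem (table and role names hypothetical):

-- Works for ad hoc SQL and anonymous blocks:
GRANT SELECT ON hr.employees TO app_role;
GRANT app_role TO app_schema;

-- But this definer's-rights procedure in APP_SCHEMA fails to
-- compile (PL/SQL: ORA-00942), because the role is not in effect:
CREATE OR REPLACE PROCEDURE emp_count_proc IS
  l_cnt PLS_INTEGER;
BEGIN
  SELECT COUNT(*) INTO l_cnt FROM hr.employees;
END;
/

-- The grant has to be direct:
GRANT SELECT ON hr.employees TO app_schema;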

I’m not 100% sure to what extent this is just local law enforcement or whether this is more widespread.

I’m pretty sure the latter but there are bound to be pockets of sanity and common sense out there somewhere.

Bottom line is that this will accelerate the shrinking Oracle RDBMS market.

People whose “speciality” are Oracle checklists are contributing to the demise and they will hopefully be able to transfer their checklist enforcement skills to other markets.

And shouldn’t product owners within Oracle be aware of this and be working with regulators and auditors to address these issues, and in turn work with their customers to make sure that what they have discussed and agreed is carried down to the lower level? Also asking too much.

Bonkers. Bonkers. Bonkers. And been getting worse for years.

Working with Oracle is becoming unworkable.

My first article in nearly a year… and not a technical article… just a moan… and a reflection that I’m doing less and less Oracle work.
(But I’m not writing regularly on my other learnings either: http://thereplmc.wordpress.com)

It’s just a shame.


Oracle JET Modular Architecture Example

Andrejus Baranovski - Tue, 2017-06-27 13:01
One of my favourite parts of Oracle JET is its support for modular code structuring. This allows you to split application functionality into modules and reusable functions. In this post I will show how you can leverage the Oracle JET modular architecture not only by implementing common code functions, but also by managing data maintained in common modules.

Let's jump to the example (download or browse through the sample code in the GitHub repository - JETModularArchitecture). The sample application is based on the JET template. I have created two common modules - dashboardChartHelper and dashboardTableHelper. These modules define the data structures to be displayed in the dashboard and provide an API to manage this data from the consuming module:


Both helper modules are imported into the consuming module - dashboard:


The dashboard module defines variables (chart and table data), which are initialized from the variables holding the data structures in the helper modules:


There are two wrapper functions which call the API functions from the helper modules. The API provides the data manipulation logic that changes the chart data structure. The wrapper functions are invoked from the dashboard UI:


Here is the implementation of the API functions - the data array changes:


What is great about such structuring is that data and data-change logic can be encapsulated in a common helper module. As soon as we have an observable variable defined in the consuming module (dashboard) which points to the helper method (where the data is changed), changes are automatically visible in the UI.

Here is the data coming from the helper modules, rendered in the dashboard module:


Press the Add Group E button; this will call the helper module API function to update the data array cached in that module. In turn, the observable variable in the dashboard module will be updated and the data displayed in the UI will be refreshed:


Press the Remove Group E button - the chart data will be changed again:

OUD 11.1.2.3 – ODSM Session Timeout

Yann Neuhaus - Tue, 2017-06-27 12:52

The ODSM is a quite powerful tool for managing the Oracle Unified Directory, and quite a lot of the default settings are very reasonable. But there is one that disturbed me from the beginning: the ODSM session timeout. You might see a warning message saying that the page will expire unless a response is received within 2 minutes. Doing some complex searches or similar stuff often takes more than 2 minutes.

So, how can we increase the limit to, let’s say, 2 hours? A value which is more reasonable from my point of view.

In the early days, you had to edit a file called web.xml in a very intuitive directory ;-) and change the session-timeout to whatever value you like. Note that web.xml specifies the session-timeout in minutes, so for my example of 2 hours the value is 120.

$ vi /u01/app/oracle/product/Middleware/11.1.2.3/user_projects/domains/base_domain/servers/AdminServer/tmp/_WL_user/odsm_11.1.1.5.0/d89dm9/war/WEB-INF/web.xml

...
...

  <session-config>
    <session-timeout>120</session-timeout>
  </session-config>

However, changing the value here has the disadvantage that it might not be permanent. This directory is a temporary cache directory for the WebLogic server. So, if you apply a patch that overwrites the …/odsm/odsm.ear file, the changes you made to web.xml in the temporary cache directory are also overwritten. It is therefore not a good long-term solution.

Way better is to do it via the WebLogic Console.

Log in to your WebLogic Console with the WebLogic user and navigate to “Deployments”. From there select the “/odsm” module.


Now move to the “Configuration” tab and change the Session Timeout (in seconds) to 7200, in case you want a 2h session timeout.


Now save the changes, and click OK once more to save the Deployment Plan.


That’s it.

Conclusion

The default session timeout is way too short from my point of view. But no worries. Changing it via the WebLogic Console is quite easy and it might save you a lot of headaches. :-)

 

The post OUD 11.1.2.3 – ODSM Session Timeout appeared first on Blog dbi services.

Important Metrics for Food and Beverage Industry

Nilesh Jethwa - Tue, 2017-06-27 09:54

The food and beverage industry can be extremely demanding, not to mention highly competitive. Hence, it is important for managers and staff to remove guesswork from their vocabularies and instead utilize a dashboard app or two that will be able to assist them in running the business effectively, efficiently, and profitably.

Aside from standard Excel dashboards, there are other reporting tools that focus on various metrics and KPIs which ensure that restaurants know what their strong and weak points are, where they are doing great service- and earnings-wise, and where they need to take more action and implement better controls.

Especially now that we live in a data-driven environment where access to real-time information is easier than ever, there is no reason for restaurants not to use Excel dashboards to monitor their daily operations. There are distinct areas that need to be focused on and here they are:

  1. Kitchen
  • Food Cost

By comparing weekly food sales with purchases, your dashboard app will tell you whether you’re profiting from daily kitchen operations or not. Even spoiled food and other wastage are computed to determine the overall cost.

In the food and beverage industry, there are certain percentages for food costs that are acceptable and those which are simply unjustifiable.
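A common formulation, as a rough sketch (the exact treatment of inventory varies by operation):

Food Cost % = (Beginning Inventory + Purchases - Ending Inventory) / Food Sales x 100

For example, $3,000 of consumed product against $10,000 of food sales gives a 30% food cost.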

  • Kitchen Labor

It is also important for managers to know whether or not they have too many staff working in the kitchen at given hours of the day.

Read more at http://www.infocaptor.com/dashboard/important-metrics-and-kpis-for-food-beverage

Exploring the Rittman Mead Insights Lab

Rittman Mead Consulting - Tue, 2017-06-27 08:58
What is our Insights Lab?

The Insights Lab offers on-demand access to an experienced data science team, using a mature methodology to deliver one-off analyses and production-ready predictive models.

Our Data Science team includes physicists, mathematicians, industry veterans and data engineers ready to help you take analytics to the next level while providing expert guidance in the process.

Why use it?

Data is cheaper to collect and easier to store than ever before. But collecting the data is not synonymous with getting value from it. Businesses need to do more with the same budget and are starting to look into machine learning to achieve this.

These processes can take off some of the workload, freeing up people's time to work on more demanding tasks. However, many businesses don't know how to get started down this route, or even if they have the data necessary for a predictive model.

R

Our data science team primarily works using the R programming language. R is an open source language which is supported by a large community.

The functionality of R is extended by many community-written packages which implement a wide variety of statistical and graphical techniques, including linear and nonlinear modeling, statistical tests, time-series analysis, classification and clustering, as well as packages for data access, cleaning, tidying, analysing and building reports.

All of these packages can be found on the Comprehensive R Archive Network (CRAN), making it easy to get access to new techniques or functionalities without needing to develop them yourself (all the community-written packages work together).

R is not only free and extendable, it works well with other technologies, which makes it an ideal choice for businesses who want to start looking into advanced analytics. Python is an obvious alternative, and several of our data scientists prefer it. We're happy to use whatever our client's teams are most familiar with.

Experienced programmers will find R syntax easy enough to pick up and will soon be able to implement some form of machine learning. However, for a detailed introduction to R and a closer look at implementing some of the concepts mentioned below we do offer a training course in R.

Our Methodology

Define

Define a Question

Analytics, for all intents and purposes, is a scientific discipline and as such requires a hypothesis to test. That means having a specific question to answer using the data.

Starting this process without a question can lead to biases in the produced result. This is called data dredging - testing huge numbers of hypotheses about a single data set until the desired outcome is found. Many other forms of bias can be introduced accidentally; the most commonly occurring will be outlined in a future blog post.

Once a question is defined, it is also important to understand which aspects of the question you are most interested in. Associated with this is the level of uncertainty or error that can be tolerated if the result is to be applied in a business context.

Questions can be grouped into a number of types. Some examples will be outlined in a future blog post.

Define a dataset

The data you expect to be relevant to your question needs to be collated. Maybe supplementary data is needed, or can be added from different databases or web scraping.

This data set then needs to be cleaned and tidied. This involves merging and reshaping the data as well as possibly summarising some variables. For example, removing spaces and non-printing characters from text and converting data types.

The data may be in a raw format, there may be errors in the data collection, or corrupt or missing values that need to be managed. These records can either be removed completely or replaced with reasonable default values, determined by which makes the most sense in this specific situation. If records are removed you need to ensure that no selection biases are being introduced.

All the data should be relevant to the question at hand, anything that isn't can be removed. There may also be external drivers for altering the data, such as privacy issues that require data to be anonymised.

Natural language processing could be implemented for text fields. This takes bodies of text in human readable format such as emails, documents and web page content and processes it into a form that is easier to analyse.

Any changes to the dataset need to be recorded and justified.

Model

Exploratory Analysis

Exploratory data analysis involves summarising the data, investigating the structure, detecting outliers / anomalies as well as identifying patterns and trends. It can be considered as an early part of the model production process or as a preparatory step immediately prior. Exploratory analysis is driven by the data scientist, enabling them to fully understand the data set and make educated decisions; for example the best statistical methods to employ when developing a model.

The relationships between different variables can be understood and correlations found. As the data is explored, different hypotheses could be found that may define future projects.

Visualisations are a fundamental aspect of exploring the relationships in large datasets, allowing the identification of structure in the underlying dataset.

This is also a good time to look at the distribution of your dataset with respect to what you want to predict. This often provides an indication of the types of models or sampling techniques that will work well and lead to accurate predictions.

Variables with very few instances (or those with small variance) may not be beneficial, and in some cases could even be detrimental, increasing computation time and noise. Worse still, if these instances represent an outlier, significant (and unwarranted) value may be placed on these leading to bias and skewed results.

Statistical Modelling/Prediction

The data set is split into two sub-groups, "Training" and "Test". The training set is used only in developing or "training" a model, ensuring that the data it is tested on (the test set) is unseen. This means the model is tested in a more realistic context and will help to determine whether the model has overfitted to the training set, i.e. is fitting random noise in addition to any meaningful features.
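As a concrete illustration of the split, here is a sketch for a table-resident dataset (shown in SQL rather than R, with hypothetical table and column names), hashing the row key so each row always lands in the same group:

-- Roughly 80% of rows for training...
CREATE TABLE training_set AS
  SELECT * FROM observations WHERE ORA_HASH(obs_id, 9) < 8;

-- ...and the remaining ~20% held out for testing
CREATE TABLE test_set AS
  SELECT * FROM observations WHERE ORA_HASH(obs_id, 9) >= 8;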

Taking what was learned from the exploratory analysis phase, an initial model can be developed based on an appropriate application of statistical methods and modeling tools. There are many different types of model that can be applied to the data; the best tends to depend on the complexity of your data and any relationships that were found in the exploratory analysis phase. During training, the models are evaluated in accordance with an appropriate metric, the improvement of which is the "goal" of the development process. The predictions produced from the trained models when run on the test set will determine the accuracy of the model (i.e. how closely its predictions align with the unseen real data).

A particular type of modelling method, "machine learning", can streamline and improve upon this somewhat laborious process by defining models in such a way that they are able to self-optimise, "learning" from past iterations to develop a superior version. Broadly, there are two types: supervised and unsupervised. A supervised machine learning model is given some direction from the data scientist as to the types of methods that it should use and what it is expecting. Unsupervised machine learning, on the other hand, as the name suggests, involves giving the model less information to start with and letting it decide for itself what to value and how to approach the problem. This can help to remove bias and reduce the number of assumptions made but will be more computationally intensive, as the model has a broader scope to investigate. Usually supervised machine learning is employed in a case where the problem and data set are reasonably well understood, and unsupervised machine learning where this is not the case.

Complex predictive modelling algorithms perform feature importance and selection internally while constructing models. These models can also report on the variable importance determined during the model preparation process.

Peer Review

This is an important part of any scientific process, and effectively utilises our broad expertise in modelling at Rittman Mead. This enables us to be sure no biases were introduced that could lead to a misleading prediction and that the accuracy of the models is what could be expected if the model was run on new unseen data. Additional expert views can also lead to alternative potential avenues of investigation being identified as part of an expanded or subsequent study.

Deploy

Report

For a scientific investigation to be credible the results must be reproducible. The reports we produce are written in R markdown and contain all the code required to reproduce the results presented. This also means it can be re-run with new data as long as it is of the same format. A clear and concise description of the investigation from start to finish will be provided to ensure that justification and context is given for all decisions and actions.

Delivery

If the result is of the required accuracy we will deploy a model API enabling customers to start utilising it immediately.
There is always a risk however that the data does not contain the required variables to create predictions with sufficient confidence for use. In these cases, and after the exploratory analysis phase there may be other questions that would be beneficial to investigate. This is also a useful result, enabling us to suggest additional data to collect that may allow a more accurate result should the process be repeated later.

Support

Following delivery we are able to provide a number of support services to ensure that maximum value is extracted from the model on an on-going basis. These include:
- Monitoring performance and accuracy against the observed, actual values over a period of time. Should discrepancies between these values arise, they can be used to identify the need for alterations to the model.
- Exploring specific exceptions to the model. There may be cases in which the model consistently performs poorly. Instances like these may not have existed in the training set and the model could be re-trained accordingly. If they were in the training set these could be weighted differently to ensure a better accuracy, or could be represented by a separate model.
- Updates to the model to reflect discrepancies identified through monitoring, changes of circumstance, or the availability of new data.
- Many problems are time dependent and so model performance is expected to degrade, requiring retraining on more up to date data.

Summary

In conclusion, our Insights Lab has a clearly defined and proven process for data science projects that can be adapted to fit a range of problems.

Contact us to learn how Insights Lab can help your organization get the most from its data, and schedule your consultation today.
Contact us at info@rittmanmead.com

Categories: BI & Warehousing

Notes on datapatch

Michael Dinh - Tue, 2017-06-27 08:47

Mike Dietrich is the godfather of all Oracle upgrades.

Here are just a few references you should read:

DBUA 12c and “datapatch.pl” – things to know

DBUA misses the post-upgrade datapatch execution in Oracle 12.1.0.2. The solution is to apply the SQL changes manually after DBUA has completed the database upgrade to Oracle Database 12c:

cd $ORACLE_HOME/OPatch
./datapatch -verbose

Does DBCA execute “datapatch” in Oracle 12.2?

My notes: when in doubt, check it out.

$ cd $ORACLE_HOME/OPatch
$ ./datapatch -verbose

SQL Patching tool version 12.1.0.2.0 on Thu Jan 19 10:35:27 2017
Copyright (c) 2016, Oracle.  All rights reserved.

Log file for this invocation: /u01/app/oracle/cfgtoollogs/sqlpatch/sqlpatch_93223_2017_01_19_10_35_27/sqlpatch_invocation.log

Connecting to database...OK
Bootstrapping registry and package to current versions...done
Determining current state...done

Current state of SQL patches:
Patch 24315824 (Database PSU 12.1.0.2.161018, Oracle JavaVM Component (OCT2016)):
  Installed in the binary registry only
Bundle series DBBP:
  ID 161018 in the binary registry and not installed in the SQL registry

Adding patches to installation queue and performing prereq checks...
Installation queue:
  Nothing to roll back
  The following patches will be applied:
    24315824 (Database PSU 12.1.0.2.161018, Oracle JavaVM Component (OCT2016))
    24340679 (DATABASE BUNDLE PATCH: 12.1.0.2.161018 (24340679))

Installing patches...
Patch installation complete.  Total patches installed: 2

Validating logfiles...
Patch 24315824 apply: SUCCESS
  logfile: /u01/app/oracle/cfgtoollogs/sqlpatch/24315824/20676808/24315824_apply_DBFS_2017Jan19_10_35_43.log (no errors)
Patch 24340679 apply: SUCCESS
  logfile: /u01/app/oracle/cfgtoollogs/sqlpatch/24340679/20646358/24340679_apply_DBFS_2017Jan19_10_35_47.log (no errors)
SQL Patching tool complete on Thu Jan 19 10:37:16 2017

SQL> set serveroutput on
SQL> exec dbms_qopatch.get_sqlpatch_status;

-- Logfile is provided. How convenient is that?
Patch Id : 24315824
        Action : APPLY
        Action Time : 19-JAN-2017 10:37:14
        Description : Database PSU 12.1.0.2.161018, Oracle JavaVM Component (OCT2016)
        Logfile : /u01/app/oracle/cfgtoollogs/sqlpatch/24315824/20676808/24315824_apply_DBFS_2017Jan19_10_35_43.log
        Status : SUCCESS

Patch Id : 24340679
        Action : APPLY
        Action Time : 19-JAN-2017 10:37:16
        Description : DATABASE BUNDLE PATCH: 12.1.0.2.161018 (24340679)
        Logfile : /u01/app/oracle/cfgtoollogs/sqlpatch/24340679/20646358/24340679_apply_DBFS_2017Jan19_10_35_47.log
        Status : SUCCESS

PL/SQL procedure successfully completed.

SQL>


$ cd $ORACLE_HOME/OPatch
$ ./datapatch -verbose


SQL> set serveroutput on
SQL> exec dbms_qopatch.get_sqlpatch_status;


Patch Id : 24315824
        Action : APPLY
        Action Time : 19-JAN-2017 10:37:14
        Description : Database PSU 12.1.0.2.161018, Oracle JavaVM Component (OCT2016)
        Logfile : /u01/app/oracle/cfgtoollogs/sqlpatch/24315824/20676808/24315824_apply_DBFS_2017Jan19_10_35_43.log
        Status : SUCCESS

Patch Id : 24340679
        Action : APPLY
        Action Time : 19-JAN-2017 10:37:16
        Description : DATABASE BUNDLE PATCH: 12.1.0.2.161018 (24340679)
        Logfile : /u01/app/oracle/cfgtoollogs/sqlpatch/24340679/20646358/24340679_apply_DBFS_2017Jan19_10_35_47.log
        Status : SUCCESS

PL/SQL procedure successfully completed.

SQL>
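One more cross-check worth knowing (assuming 12.1 or later): the same patch history is exposed in the DBA_REGISTRY_SQLPATCH view.

SQL> SELECT patch_id, action, status, description
  2    FROM dba_registry_sqlpatch
  3   ORDER BY action_time;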

Italian Consumer Electronics Retailer Unieuro SpA Drives Efficiency Through Supply Chain

Oracle Press Releases - Tue, 2017-06-27 08:01
Press Release
Italian Consumer Electronics Retailer Unieuro SpA Drives Efficiency Through Supply Chain
Anticipate Demand and Simplify Operations with a Unified View of Inventory

Redwood Shores, Calif.—Jun 27, 2017

Today, Oracle announced that consumer electronics retailer Unieuro SpA purchased Oracle Retail Demand Forecasting and Oracle Retail Allocation & Replenishment to optimize stock levels, increase profitability and improve cash flow.

Unieuro aims to improve its centralized supply chain organization in order to support effective omnichannel replenishment processes across DOS, affiliate and online channels. With this initiative, Unieuro is in a stronger position to optimize the cross-channel opportunity to minimize stock, reduce obsolescence and streamline organizational costs. The supply chain organization will have much-needed visibility into demand coming from all touch points, enabling it to orchestrate supply, reduce lost sales and increase customer satisfaction.

“Oracle Retail provides a distinctive replenishment approach for short lifecycle products which includes attribute-based forecasting,” said Luigi Fusco, COO, Unieuro SpA. “We believe the optimized orchestration of the stock across channels will help improve our fulfillment rate to improve customer satisfaction and reduce obsolescence to eliminate costs.”

“After conducting a proof of concept with our data, Oracle Retail gained our confidence to move forward with the project. We validated our choice with customer references in the consumer electronics and hardlines businesses,” said Luca Girotti, IT Director, Unieuro. “We are thankful to the Sopra Steria team who helped us evaluate the market offerings and ultimately decide to move forward with the Oracle Retail solution.”

“Retailers like Unieuro can proactively position inventory in the right place in the right quantity by using analytic algorithms to drive accuracy and visibility. The visibility of this new supply chain organization will help Unieuro inspire consumer loyalty with a better in-stock position wherever they are inspired to shop,” said Ray Carlin, Senior Vice President and General Manager, Oracle Retail.

About Oracle Retail

Oracle provides retailers with a complete, open, and integrated suite of best-of-breed business applications, cloud services, and hardware that are engineered to work together and empower commerce. Leading fashion, grocery, and specialty retailers use Oracle solutions to anticipate market changes, simplify operations and inspire authentic brand interactions. For more information, visit our website at www.oracle.com/retail.

About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

NetSuite Announces New Channel Partners Seeking to Drive Growth with Cloud ERP

Oracle Press Releases - Tue, 2017-06-27 08:01
Press Release
NetSuite Announces New Channel Partners Seeking to Drive Growth with Cloud ERP
Cherry Bekaert, Terillium, Revolution Group and Conexus SG Join NetSuite Solution Provider Program

San Mateo, Calif.—Jun 27, 2017

Oracle NetSuite Global Business Unit, one of the world’s leading providers of cloud-based financials / ERP, HR, Professional Services Automation (PSA) and omnichannel commerce software suites, today announced the addition of new partners to the NetSuite Solution Provider Program including Cherry Bekaert, Terillium, Revolution Group and Conexus SG. All four partners have launched new cloud ERP practices with NetSuite to help clients capitalize on the transformative power of cloud ERP. These new partners are responding to high demand for cloud ERP that can deliver greater agility and scalability for growth without the costly overhead and limitations of on-premise systems. At the same time, the partners are taking advantage of the opportunity to diversify and grow their business while benefiting with the high margins and recurring revenue offered through the NetSuite Solution Provider Program.

Launched in 2002, the NetSuite Solution Provider Program is the industry’s leading cloud channel partner program. Since its inception, NetSuite has been a leader in partner success, breaking new ground in building and executing on the leading model to make the channel successful with NetSuite. A top choice for partners who are building new cloud ERP practices or for those expanding their existing practice to meet the demand for cloud ERP, NetSuite has enabled partners to transform their business model to fully capitalize on the revenue growth opportunity of the cloud. The NetSuite Solution Provider Program delivers unprecedented benefits that include highly attractive margins and range from business planning, sales, marketing and professional services enablement, to training and education. Click here for more information about the NetSuite Solution Provider Program.

“These new partners are teaming with NetSuite as the global migration to the cloud continues to accelerate,” said Craig West, Oracle NetSuite Global Business Unit Vice President of Channel Sales. “We look forward to helping customers modernize and transform how business is done through these collaborative and mutually beneficial relationships.”

Cherry Bekaert Launches ERP Consulting Practice with NetSuite Cloud ERP

Cherry Bekaert (www.cherrybekaert.com), one of the 25 largest CPA firms in the U.S. with 1,000+ associates, has launched a new ERP consulting practice to offer NetSuite to businesses in software and technology, manufacturing, distribution, private equity, transportation and logistics, nonprofit, financial services and other industries. The 70-year-old firm, headquartered in Richmond, VA with strong concentration from Washington, D.C. to Miami, will provide implementation, support and customization services to existing and new midmarket customers amid growing demand for NetSuite’s industry-leading solution for ERP, CRM and ecommerce. Cherry Bekaert specializes in helping clients take advantage of opportunities and address new requirements that come with expansive growth. By joining forces with NetSuite, Cherry Bekaert’s new Cloud Solutions practice will equip clients for improved efficiency and productivity in financial operations; compliance with revenue recognition, tax and reporting requirements; and real-time visibility into consolidated financial results.

“The demand for cloud solutions is surging among our middle-market client base, particularly in technology, health and life sciences, and industrial, the sectors that make up our practice,” said Paul Doucet, Managing Director of Cherry Bekaert’s Cloud Solutions practice. “NetSuite’s integrated applications provide a dynamic cloud solution with deep expertise in our core industry segments. We look forward to an equitable and rewarding relationship.”   

Terillium Expands Oracle Platinum Relationship with NetSuite

Terillium (www.terillium.com), a Platinum level member in the Oracle PartnerNetwork since 2006 and 12-time recipient of Oracle Excellence Awards, is expanding its Oracle technology offerings as a new NetSuite Solution Provider serving clients in the small and midmarket manufacturing, distribution and services sectors. Based in Cincinnati, OH, with more than 170 associates, Terillium’s new NetSuite practice will complement its traditional focus on Oracle JD Edwards and Oracle cloud offerings for supply chain, procurement, financials, planning and budgeting, sales and human capital management. With Oracle’s acquisition of NetSuite complete last November, allying with NetSuite was a “natural fit” that enables Terillium to offer organizations NetSuite’s agile, scalable and unified platform, according to Steve May, Vice President at Terillium. Terillium leverages deep expertise gained in delivering solutions to more than 500 businesses to meet rising demand for cloud ERP, CRM and B2B ecommerce with a complete NetSuite services offering for implementation, customization, integration and ongoing optimization.

“We teamed with NetSuite because it’s the #1 cloud ERP on the market with a complete and proven platform used by over 40,000 organizations that can be deployed very quickly with all the benefits of the cloud,” May said. “Offering NetSuite cloud ERP to the businesses we work with is aligned with our mission to provide the best solutions and services possible.”

Revolution Group Focuses on Manufacturing, Distribution with NetSuite

Revolution Group (www.revolutiongroup.com), founded in 1995, is expanding their cloud ERP practice with NetSuite to meet growing demand among its manufacturing, distribution and professional services clients and prospects for cloud ERP. Based in Ohio with about 80 associates, Revolution Group will provide NetSuite cloud ERP distribution, implementation, customization and development that combines NetSuite’s best-in-class platform with Revolution Group’s decades of expertise in the manufacturing, distribution and professional services industries. The relationship with NetSuite expands Revolution Group’s cloud technology portfolio beyond Salesforce and the Plex Manufacturing Cloud. Revolution Group will also leverage its development team and the SuiteCloud development platform for “last mile” customizations and optimizations that help clients continuously innovate and grow with NetSuite.

“We see NetSuite as a very scalable, flexible and growth-oriented platform to meet our customers’ business requirements,” said Rick Snide, CEO of Revolution Group. “Businesses are seeking cloud ERP solutions that correspond with their business needs and budget. Our teaming with NetSuite allows us to offer our clients and prospects options that complement our current services and help them achieve their goals.”

Conexus SG Expands Beyond On-Premise Microsoft ERP with NetSuite

Conexus SG (www.conexussg.com), based in Dallas-Fort Worth, TX, has joined the NetSuite Solution Provider program to offer their first cloud ERP offering and expand beyond its traditional focus of on-premise Microsoft Dynamics GP, SL, CRM and related solutions. The move addresses increased demand for NetSuite cloud solutions, among Conexus SG’s clients and prospects in the oil and gas, manufacturing, distribution, healthcare, restaurant and services industries in the Texas and Oklahoma region. Founded in 2008 and a Microsoft Gold ERP Partner that has served hundreds of customers, Conexus SG combines its deep experience in industry-specific ERP implementation with NetSuite’s proven platform to offer clients a true cloud ERP solution geared for efficiency and growth. Besides NetSuite ERP, CRM and ecommerce implementation and optimization, the 20-person firm will also offer the NetSuite OpenAir professional services automation solution.

“Customers are asking for cloud and customers are asking for NetSuite,” said Eric Holleman, Conexus SG Managing Director. “The future is cloud, and NetSuite is the clear leader in cloud ERP. NetSuite is an ideal fit for fast-growing companies because it doesn’t require infrastructure, it’s fast to implement, and it’s scalable for growth. We’re delighted to team up with NetSuite to offer clients greater choice and flexibility.”

Contact Info
Christine Allen
Public Relations, Oracle NetSuite Global Business Unit
603-743-4534
PR@netsuite.com
About Oracle NetSuite Global Business Unit

Oracle NetSuite Global Business Unit pioneered the Cloud Computing revolution in 1998, establishing the world’s first company dedicated to delivering business applications over the internet. Today, Oracle NetSuite Global Business Unit provides a suite of cloud-based financials / Enterprise Resource Planning (ERP), HR and omnichannel commerce software that runs the business of companies in more than 100 countries. For more information, please visit http://www.netsuite.com.

Follow Oracle NetSuite Global Business Unit’s Cloud blog, Facebook page and @NetSuite Twitter handle for real-time updates.

About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Christine Allen

  • 603-743-4534

Webcast: Managing and Monitoring Oracle E-Business Suite On-Premises and Cloud

Steven Chan - Tue, 2017-06-27 02:00

Oracle University has a large number of free recorded webcasts that are useful for E-Business Suite system administrators.  Here's a new one on the latest Application Management Pack updates:

Vasu Rao, Senior Principal Product Strategy Manager, shares the latest tools including the Application Management Suite for monitoring and managing an Oracle E-Business Suite 12.2 environment. Among the topics covered for Oracle E-Business Suite on-premises and cloud environments are online patching management; monitoring the technology stack; configuration comparison between run and patch editions; provisioning an instance; technology stack patch recommendations; customization management; and automated cloning. This material was presented at Collaborate 17.


Categories: APPS Blogs

Log Buffer #515: A Carnival of the Vanities for DBAs

Pythian Group - Mon, 2017-06-26 09:15

This Log Buffer Edition covers Oracle, SQL Server and MySQL.

Oracle:

You may wish to clone an Oracle Home (for example, you have all your databases on a single Oracle Home, but you want to separate Development from Test).

Removing Outliers using stddev()

Installing Scala and Apache Spark on a Mac

Introduction to Oracle Big Data Cloud Service – Compute Edition (Part V) – Pig

More on Optimistic Locking with ORA_ROWSCN

SQL Server:

How to Decipher sysschedules

SQL Azure Reporting Automation

Database Code Analysis

Visual Studio 2017 and Swagger: Building and Documenting Web APIs

Techniques to Cleanse Bad Data in SQL Server

MySQL:

ClickHouse in a General Analytical Workload (Based on a Star Schema Benchmark)

Debian 9 and MySQL: Watch Out!

How to Install LAMP on Ubuntu 16.04

Linux perf and the CPU regression in MySQL 5.7

Secure Binlog Server: Encrypted binary Logs and SSL Communication

Categories: DBA Blogs

Using Tableau to Show Variance and Uncertainty

Rittman Mead Consulting - Mon, 2017-06-26 09:00

Recently, I watched an amazing keynote presentation from Amanda Cox at OpenVis. Toward the beginning of the presentation, Amanda explained that people tend to feel and interpret things differently. She went on to say that, “There’s this gap between what you say or what you think you’re saying, and what people hear.”

While I found her entire presentation extremely interesting, that statement in particular really made me think. When I view a visualization or report, am I truly understanding what the results are telling me? Personally, when I’m presented a chart or graph I tend to take what I’m seeing as absolute fact, but often there’s a bit of nuance there. When we have a fair amount of variance or uncertainty in our data, what are some effective ways to communicate that to our intended audience?

In this blog I'll demonstrate some examples of how to show uncertainty and variance in Tableau. All of the following visualizations are made using Tableau Public so while I won’t go into all the nitty-gritty detail here, follow this link to download the workbook and reverse engineer the visualizations yourself if you'd like.

First things first, I need some data to explore. If you've ever taken our training you might recall the Gourmet Coffee & Bakery Company (GCBC) data that we use for our courses. Since I’m more interested in demonstrating what we can do with the visualizations and less interested in the actual data itself, this sample dataset will be more than suitable for my needs. I'll begin by pulling the relevant data into Tableau using Unify.

If you haven't already heard about Unify, it allows Tableau to seamlessly connect to OBIEE so that you can take advantage of the subject areas created there. Now that I have some data, let’s look at our average order history by month. To keep things simple, I’ve filtered so that we’re only viewing data for Times Square.

Average Orders for 2015-2016

On this simple visualization we can already draw some insights. We can see that the data is cyclical with a peak early in the year around February and another in August. We can also visually see the minimum number of orders in a month appears to be about 360 orders while the maximum is just under 400 orders.

When someone asks to see “average orders by month”, this is generally what people expect to see and depending upon the intended audience a chart like this might be completely acceptable. However, when we display aggregated data we no longer have any visibility into the variance of the underlying data.

Daily Orders

If we display the orders at the day level instead of month we can still see the cyclical nature of the data but we also can see additional detail and you’ll notice there’s quite a bit more “noise” to the data. We had a particularly poor day in mid-May of 2014 with under 350 orders. We’ve also had a considerable number of good days during the summer months when we cleared 415 orders.

Moving Average

Depending upon your audience and the dataset, some of these charts might include too much information and be too busy. If the viewer can’t make sense of what you’re putting in front of them there’s no way they’ll be able to discern any meaningful insights from the underlying dataset. Visualizations must be easy to read. One way to provide information about the volatility of the data but with less detail would be to use confidence bands, similar to how one might view stock data. In this example I’ve calculated and displayed a moving average, as well as upper and lower confidence bands using the 3rd standard deviation. Confidence bands show how much uncertainty there is in your data. When the bands are close you can be more confident in your results and expectations.
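For the curious, the same calculation can be sketched in SQL, assuming a hypothetical DAILY_ORDERS(order_date, orders) table and a 30-day window:

SELECT order_date,
       AVG(orders) OVER (ORDER BY order_date
           ROWS BETWEEN 29 PRECEDING AND CURRENT ROW) AS moving_avg,
       AVG(orders) OVER (ORDER BY order_date
           ROWS BETWEEN 29 PRECEDING AND CURRENT ROW)
         + 3 * STDDEV(orders) OVER (ORDER BY order_date
           ROWS BETWEEN 29 PRECEDING AND CURRENT ROW) AS upper_band,
       AVG(orders) OVER (ORDER BY order_date
           ROWS BETWEEN 29 PRECEDING AND CURRENT ROW)
         - 3 * STDDEV(orders) OVER (ORDER BY order_date
           ROWS BETWEEN 29 PRECEDING AND CURRENT ROW) AS lower_band
  FROM daily_orders
 ORDER BY order_date;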

Orders by Month / Orders by Day

An additional option is the use of a scatterplot. The awesome thing about a scatterplot is that not only does it allow you to see the variance of your data, but if you play with the size of your shapes and tweak the transparency just right, you also get a sense of the density of your dataset because you can visualize where those points lie in relation to each other.

Boxplot

The final example I have for you is to show the distribution of your data using a boxplot. If you’re not familiar with boxplots, the line in the middle of the box is the median. The bottom and top of the box, known as the bottom and top hinges, give you the 25th and 75th percentiles respectively, and the whiskers outside of the box show the minimum and maximum values excluding any outliers. Outliers are shown as dots.
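The hinges themselves are just percentiles; as a sketch, again against the hypothetical DAILY_ORDERS table:

SELECT PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY orders) AS bottom_hinge,
       PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY orders) AS median,
       PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY orders) AS top_hinge
  FROM daily_orders;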

I want to take a brief moment to touch on a fairly controversial subject of whether or not to include a zero value in your axes. When you have a non-zero baseline it distorts your data and differences are exaggerated. This can be misleading and might lead your audience into drawing inaccurate conclusions.

For example, a quick Google search revealed this image on Accuweather showing the count of tornados in the U.S. for 2013-2016. At first glance it appears as though there were almost 3 times more tornados in 2015 than in 2013 and 2014, but that would be incorrect.

On the flipside, there are cases where slight fluctuations in the data are extremely important but are too small to be noticed when the axis extends to zero. Philip Bump did an excellent job demonstrating this in his "Why this National Review global temperature graph is so misleading" article in The Washington Post.

Philip begins his article with this chart tweeted by the National Review which appears to prove that global temperatures haven’t changed in the last 100 years. As he goes on to explain, this chart is misleading because of the scale used. The y-axis stretches from -10 to 110 degrees making it impossible to see a 2 degree increase over the last 50 years or so.

The general rule of thumb is that you should always start from zero. In fact, when you create a visualization in Tableau, it includes a zero by default. Usually, I agree with this rule and the vast majority of the time I do include a zero, but I don’t believe there can be a hard and fast rule as there will always be an exception. Bar charts are used to communicate absolute values so the size of that bar needs to be proportional to the overall value. I agree that bar charts should extend to zero because if it doesn’t we distort what the data is telling us. With line charts and scatterplots we tend to look at the positioning of the data points relative to each other. Since we’re not as interested in the value of the data, I don’t feel the decision to include a zero or not is as cut and dry.

The issue boils down to what it is you’re trying to communicate with your chart. In this particular case, I’m trying to highlight the uncertainty so the chart needs to draw attention to the range of that uncertainty. For this reason, I have not extended the axes in the above examples to zero. You are free to disagree with me on this, but as long as you’re not intentionally misleading your audience I feel that in instances such as these this rule can be relaxed.

These are only a few examples of the many ways to show uncertainty and variance within your data. Displaying the volatility of the data and giving viewers a level of confidence in the results is immensely powerful. Remember that while we can come up with the most amazing visualizations, if the results are misleading or misinterpreted and users draw inaccurate conclusions, what’s the point?

Categories: BI & Warehousing

Benefits of Executive Dashboards for Call Centers

Nilesh Jethwa - Mon, 2017-06-26 08:54

Managing a call center is no easy feat, hence, business intelligence tools such as an executive dashboard can be tremendously helpful.

Call center management needs constant attention. Managers should focus on the latest performance metrics and important industry KPIs. Taking these into consideration, arming company executives with a reliable call center dashboard is worth its weight in gold.

Keeping a watchful eye on all the activities of the call center that you’re managing can help you increase overall productivity, and business intelligence tools will play a major role. Not only will you be able to reduce handling times significantly but also be able to meet the obligations of your service level agreements hassle-free.

With the industry of call centers being one of the most competitive these days, providing clients with the highest level of service is the ultimate goal and you can assure service quality when you have an executive dashboard in your arsenal.

It works by increasing the visibility of real-time metrics that are crucial to your business. With a dashboard app, you can access vital information immediately and address any issues before they become potentially damaging to your company.

In any call center dashboard, there are several performance metrics that you have to pay close attention to and here are some of them:

Read more at http://www.infocaptor.com/dashboard/executive-dashboard-analytics-for-call-centers

Bank of America Selects Oracle Cloud for ERP and Financials

Oracle Press Releases - Mon, 2017-06-26 08:30
Press Release
Bank of America Selects Oracle Cloud for ERP and Financials

Redwood Shores, Calif.—Jun 26, 2017

Oracle (NYSE: ORCL) announced today that Bank of America (NYSE: BAC) has selected Oracle’s cloud-based ERP and financial applications for both its international general ledger and its broker-dealer systems. Oracle is the world leader in Cloud ERP systems. The company now has more than five thousand Fusion Financials cloud application customers worldwide.

“Bank of America is among the most sophisticated applications customers in the world,” said Mark Hurd, CEO, Oracle. “Companies in record numbers are migrating legacy ERP applications to Oracle ERP Cloud.  Oracle will deliver Bank of America the most modern, complete, global ERP applications in the world.”

“Our continued focus on technology modernization has positioned Bank of America as a contemporary technology company,” said Cathy Bessant, Chief Operations and Technology Officer. “Our agreement with Oracle is about our technology and culture; we’ll learn with and from each other to deliver the best solutions.  We anticipate 80 percent of our technology workloads will be delivered on the cloud within the next few years, and this solution for general ledger reflects our evolving strategy and the improvements in cloud security and economics.”

Contact Info
Jennifer Yamamoto
Oracle
+1.916.761.9555
jennifer.yamamoto@oracle.com
Jumana Bauwens
Bank of America
+1.213.345.9134
jumana.bauwens@bankofamerica.com
About Bank of America

Bank of America is one of the world's leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services. The company provides unmatched convenience in the United States, serving approximately 47 million consumer and small business relationships with approximately 4,600 retail financial centers, approximately 15,900 ATMs, and award-winning digital banking with approximately 35 million active users and more than 22 million mobile users. Bank of America is a global leader in wealth management, corporate and investment banking and trading across a broad range of asset classes, serving corporations, governments, institutions and individuals around the world. Bank of America offers industry-leading support to approximately 3 million small business owners through a suite of innovative, easy-to-use online products and services. The company serves clients through operations in all 50 states, the District of Columbia, the U.S. Virgin Islands, Puerto Rico and more than 35 countries. Bank of America Corporation stock (NYSE: BAC) is listed on the New York Stock Exchange.

About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Jennifer Yamamoto

  • +1.916.761.9555

Jumana Bauwens

  • +1.213.345.9134

Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Access Management, Worldwide

Oracle Press Releases - Mon, 2017-06-26 07:00
Press Release
Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Access Management, Worldwide
Oracle positioned as a leader based on completeness of vision and ability to execute

Redwood Shores, Calif.—Jun 26, 2017

Oracle today announced that it has been named a leader in Gartner’s inaugural “Magic Quadrant for Access Management, Worldwide 2017” report[1]. Oracle believes its leadership position for Oracle Access Management further highlights the exceptional service enhancements introduced by Oracle Identity Cloud Service so far this year.

“We believe this recognition is a credit to Oracle’s commitment to build a comprehensive portfolio of PaaS security solutions, an integral piece of Oracle Cloud Platform,” said Peter Barker, senior vice president, Identity and Security at Oracle. “Our goal has been to deliver a suite of security solutions that customers can adapt and scale to help secure their digital business transformation. Oracle Identity Cloud Service is a core part of Oracle’s Identity-based Security Operations Center (SOC) framework that provides customers with an adaptive, machine learning-based security architecture.”

For this report, Gartner evaluated vendors with regard to their ability to provide a general set of access management (AM) functionalities across multiple use cases, and in multiple geographies and industries, and to do so by providing solid value for money as perceived by their customers. According to the report, “Leaders in the AM market generally have significant customer bases. They provide feature sets that are appropriate for current customer use-case needs. Leaders also show evidence of strong vision and execution for anticipated requirements related to technology, methodology or means of delivery; and they show evidence of how AM plays a role in a collection of related or adjacent product offerings. Leaders typically demonstrate solid customer satisfaction with overall AM capabilities, the sales process and/or related service and support.”

According to Gartner, "AM applies to technologies that use access control engines to provide centralized authentication, single sign-on (SSO), session management and authorization enforcement for target applications in multiple use cases. Access managers have evolved from early ‘web access managers.’ Target applications may have traditional web application architectures using web browsers and web application servers, and these applications may run on customers' premises or in the cloud.”

Download Gartner’s 2017 “Magic Quadrant for Access Management, Worldwide” here.

Oracle Access Management and Oracle Identity Cloud Service are part of the broader Oracle Identity Management platform.  Oracle was also recognized as a leader in Gartner’s February 2017 “Magic Quadrant for Identity Governance and Administration” for the fourth consecutive year. Oracle’s Access Management solutions are designed to help businesses secure hybrid, on-premises and cloud application deployments, while preserving a seamless user experience, centralized administration, and market-leading performance and scalability. Oracle Identity Cloud Service, which is Oracle’s next generation comprehensive identity platform, is cloud-native and designed to be an integral part of the Oracle Identity SOC that also includes Oracle Security Monitoring and Analytics Cloud Service, Oracle CASB Cloud Service and Oracle Configuration and Compliance Cloud Service.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

[1] Gartner, “2017 Magic Quadrant for Access Management, Worldwide,” by Gregg Kreizman, Anmol Singh, June 7, 2017

About Oracle

The Oracle Cloud delivers hundreds of SaaS applications and enterprise-class PaaS and IaaS services to customers in more than 195 countries and territories while processing 55 billion transactions a day. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.


Creating Custom Applications in EBS 12.2

Steven Chan - Mon, 2017-06-26 02:00

The least invasive method of fine-tuning your E-Business Suite environment to your users' needs is via personalizations.  If that approach is insufficient, you can create custom applications via Forms, Reports, and other technologies that segregate your customizations from the standard functionality provided with Oracle E-Business Suite.

If you're considering that approach for customizing EBS 12.2, see the dedicated My Oracle Support Note. This Note covers:

  • Using adsplice to create a custom application
  • Verifying the creation of custom products in the database
  • Running AutoConfig to set up application servers
  • Creating custom objects, including tables, indexes, and sequences (see the sketch after this list)
  • Propagating changes to your file systems
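
The Note itself is the authoritative reference for these steps. Purely to give a flavour of the verification and object-creation stages, here is a minimal SQL sketch; the XXCUST application short name, schema and object names are all hypothetical:

    -- Verify that the custom product was registered (hypothetical short name XXCUST):
    SELECT application_id, application_short_name, basepath
    FROM   fnd_application
    WHERE  application_short_name = 'XXCUST';

    -- Illustrative custom objects in the hypothetical XXCUST schema:
    CREATE TABLE xxcust.xxcust_widgets (
      widget_id   NUMBER        NOT NULL,
      widget_name VARCHAR2(240) NOT NULL,
      CONSTRAINT xxcust_widgets_pk PRIMARY KEY (widget_id)
    );

    CREATE INDEX xxcust.xxcust_widgets_n1 ON xxcust.xxcust_widgets (widget_name);

    CREATE SEQUENCE xxcust.xxcust_widgets_s START WITH 1 INCREMENT BY 1;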

Categories: APPS Blogs

ADF BC Attribute - Collection Storage Mode Property

Andrejus Baranovski - Sun, 2017-06-25 14:45
I would like to describe one interesting property of an ADF BC attribute. The property is called Storage and has two possible values: row (the default) and collection. By default the attribute value is saved in row storage, but it can alternatively be saved in collection storage. ADF BC implements collection storage using a map held in session scope, which keeps the value alive across ADF BC requests; this is ideal for transient attributes.

The sample application (ADFBCCheckboxApp.zip) implements a VO transient attribute to hold a checkbox value:


The VO is configured for Range Paging support. As the user navigates through the UI table pages, range paging re-executes and forces the VO to replace its rows (which results in losing transient attribute values):


Here is how it behaves. The user selects the checkbox and then navigates to another table page:


After navigating back, the checkbox value is lost (range paging mode re-executes the VO rowset to fetch the rows belonging to the current page):


To force the transient attribute value to stay, go to the Properties window for the attribute and scroll down to the last section. Select the property called Storage:


Change the value to collection. This forces ADF BC to store the value for this attribute in the session map:
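
The original post illustrates this step with a screenshot of the Properties window. Purely as an orientation sketch (not taken from the post), the resulting transient attribute definition in the view object XML might look roughly like the following; the attribute name Selected is hypothetical, and the exact XML JDeveloper generates for the Storage property may differ:

    <!-- Illustrative only: a transient VO attribute whose value is kept in the
         session-scoped collection map instead of the default row storage.
         The exact XML JDeveloper emits for Storage may differ. -->
    <ViewAttribute
        Name="Selected"
        IsPersistent="false"
        IsSelected="false"
        Type="java.lang.Boolean"
        Storage="collection"/>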


The transient attribute value now stays, even when the VO is re-executed in range paging mode and the VO rowset is refetched.

Importance of Business Intelligence for Education

Nilesh Jethwa - Sun, 2017-06-25 12:54

Providing quality education and running schools as a business are becoming more demanding these days. To maintain the efficiency and effectiveness of K-12 education, schools should really invest in reliable business intelligence tools.

Armed with the right BI tools that focus on several key performance indicators, they will be able to run the school well, as a business, and provide the necessary educational resources to those who depend on them for knowledge.

There is a lingering problem across the world wherein school administrators are expected to make magic with as little funding as possible. Usually, more funding is provided to facilities that deliver exemplary results in terms of student performance.

This is why the school board should monitor KPI metrics that will help them manage their operations and ensure that they provide the best educational resources to the students attending their facility. In doing so, they can deliver the kind of performance that will earn them the much-needed financial support to continuously improve their offerings.

Information relating to school fees, student population, daily attendance, class sizes, program offerings, academic results, and the like are all essential key performance indicators. And with heavier competition to attract new students and supporters, a reliable performance dashboard becomes all the more necessary.

Read more at http://www.infocaptor.com/dashboard/how-important-is-business-intelligence-for-education

Asrock E3C224DI-14S IPMI not reachable

Frank van Bortel - Sun, 2017-06-25 05:48
Documentation missing

There's definitely some documentation missing on the IPMI settings. I managed to lock myself out of the IPMI (also known as 'Integrated Lights Out', or ILO) management interface. Not sure how I managed to do that, but in the quest to find out how to restore divine powers, I noticed quite a lot of people suffered from this. And the solution is quite simple, when you know…

Cross platform migration using GG

Tom Kyte - Sat, 2017-06-24 23:46
GoldenGate is primarily sold as replication software, I understand. Can you clarify whether GG can be used to migrate from another database platform to Oracle, say MSSQL for example? Or is it required to use SQL Dev to convert the metadata from...
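
For context on what such a migration involves: GoldenGate moves the data itself via extract and replicat processes configured through parameter files, while schema/metadata conversion is a separate concern (the role the question assigns to SQL Developer). A minimal sketch of a Replicat parameter file mapping a SQL Server source table to an Oracle target follows; the process name, credential alias and table names are all hypothetical:

    -- Hypothetical Replicat parameter file; all names are illustrative.
    -- GoldenGate replicates row data from the trail; it does not convert
    -- source DDL, which is why a tool such as SQL Developer handles metadata.
    REPLICAT rmssql
    USERIDALIAS ggadmin
    MAP dbo.orders, TARGET app_owner.orders;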
Categories: DBA Blogs

(bash): The most useless commands (6)

Dietrich Schroff - Sat, 2017-06-24 08:35
After talking about
  1. rev
  2. sl
  3. cowsay
I found another funny command: cmatrix. The manpage says:

NAME
       CMatrix - simulates the display from "The Matrix"

and it is true.
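
If you want to try it yourself, installation is usually a one-liner; a minimal sketch assuming a Debian-style system (the package name may vary by distribution):

    # Install and run cmatrix; press q to quit
    sudo apt-get install cmatrix
    cmatrix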
