Feed aggregator

Log Buffer #515: A Carnival of the Vanities for DBAs

Pythian Group - Mon, 2017-06-26 09:15

This Log Buffer Edition covers Oracle, SQL Server and MySQL.

Oracle:

You may wish to clone an Oracle Home (for example, you have all your databases on a single Oracle Home, but you want to separate Development from Test)

Removing Outliers using stddev()

Installing Scala and Apache Spark on a Mac

Introduction to Oracle Big Data Cloud Service – Compute Edition (Part V) – Pig

More on Optimistic Locking with ORA_ROWSCN

SQL Server:

How to Decipher sysschedules

SQL Azure Reporting Automation

Database Code Analysis

Visual Studio 2017 and Swagger: Building and Documenting Web APIs

Techniques to Cleanse Bad Data in SQL Server

MySQL:

ClickHouse in a General Analytical Workload (Based on a Star Schema Benchmark)

Debian 9 and MySQL: Watch Out!

How to Install LAMP on Ubuntu 16.04

Linux perf and the CPU regression in MySQL 5.7

Secure Binlog Server: Encrypted binary Logs and SSL Communication

Categories: DBA Blogs

Using Tableau to Show Variance and Uncertainty

Rittman Mead Consulting - Mon, 2017-06-26 09:00

Recently, I watched an amazing keynote presentation from Amanda Cox at OpenVis. Toward the beginning of the presentation, Amanda explained that people tend to feel and interpret things differently. She went on to say that, “There’s this gap between what you say or what you think you’re saying, and what people hear.”

While I found her entire presentation extremely interesting, that statement in particular really made me think. When I view a visualization or report, am I truly understanding what the results are telling me? Personally, when I’m presented a chart or graph I tend to take what I’m seeing as absolute fact, but often there’s a bit of nuance there. When we have a fair amount of variance or uncertainty in our data, what are some effective ways to communicate that to our intended audience?

In this blog I'll demonstrate some examples of how to show uncertainty and variance in Tableau. All of the following visualizations were made using Tableau Public, so while I won't go into all the nitty-gritty detail here, you can follow this link to download the workbook and reverse-engineer the visualizations yourself if you'd like.

First things first, I need some data to explore. If you've ever taken our training you might recall the Gourmet Coffee & Bakery Company (GCBC) data that we use for our courses. Since I’m more interested in demonstrating what we can do with the visualizations and less interested in the actual data itself, this sample dataset will be more than suitable for my needs. I'll begin by pulling the relevant data into Tableau using Unify.

If you haven't already heard about Unify, it allows Tableau to seamlessly connect to OBIEE so that you can take advantage of the subject areas created there. Now that I have some data, let’s look at our average order history by month. To keep things simple, I’ve filtered so that we’re only viewing data for Times Square.

Average Orders for 2015-2016

On this simple visualization we can already draw some insights. We can see that the data is cyclical with a peak early in the year around February and another in August. We can also visually see the minimum number of orders in a month appears to be about 360 orders while the maximum is just under 400 orders.

When someone asks to see “average orders by month”, this is generally what people expect to see and depending upon the intended audience a chart like this might be completely acceptable. However, when we display aggregated data we no longer have any visibility into the variance of the underlying data.

Daily Orders

If we display the orders at the day level instead of the month, we can still see the cyclical nature of the data, but we can also see additional detail, and you'll notice there's quite a bit more "noise" in the data. We had a particularly poor day in mid-May of 2014 with under 350 orders. We've also had a considerable number of good days during the summer months when we cleared 415 orders.

Moving Average

Depending upon your audience and the dataset, some of these charts might include too much information and be too busy. If the viewer can’t make sense of what you’re putting in front of them there’s no way they’ll be able to discern any meaningful insights from the underlying dataset. Visualizations must be easy to read. One way to provide information about the volatility of the data but with less detail would be to use confidence bands, similar to how one might view stock data. In this example I’ve calculated and displayed a moving average, as well as upper and lower confidence bands using the 3rd standard deviation. Confidence bands show how much uncertainty there is in your data. When the bands are close you can be more confident in your results and expectations.
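The moving average and band calculation itself is straightforward to reproduce outside Tableau. As a rough sketch of the computation described above (the order counts and the 7-day window below are invented for illustration, not taken from the GCBC dataset):

```python
import pandas as pd

# Hypothetical daily order counts standing in for the GCBC data
orders = pd.Series([380, 372, 395, 410, 365, 358, 401, 388,
                    377, 392, 405, 369, 384, 398, 361, 373])

window = 7  # assumed 7-day window
moving_avg = orders.rolling(window).mean()
stddev = orders.rolling(window).std()

# Upper and lower confidence bands at 3 standard deviations,
# as in the example above
upper_band = moving_avg + 3 * stddev
lower_band = moving_avg - 3 * stddev

bands = pd.DataFrame({"avg": moving_avg,
                      "upper": upper_band,
                      "lower": lower_band}).dropna()
print(bands)
```

When the rolling standard deviation is small, the bands hug the average and you can be more confident in the trend; when it balloons, so does the visual warning to your audience.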

Orders by Month
Orders by Day

An additional option is the use of a scatterplot. The awesome thing about a scatterplot is that not only does it allow you to see the variance of your data, but if you play with the size of your shapes and tweak the transparency just right, you also get a sense of the density of your dataset, because you can visualize where the points lie in relation to each other.

Boxplot

The final example I have for you is to show the distribution of your data using a boxplot. If you're not familiar with boxplots, the line in the middle of the box is the median. The bottom and top of the box, known as the bottom and top hinge, give you the 25th and 75th percentiles respectively, and the whiskers outside the box show the minimum and maximum values excluding any outliers. Outliers are shown as dots.
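Those boxplot statistics can be computed by hand. Here is a minimal sketch using Python's statistics module, with invented data, and assuming the common 1.5 × IQR convention for deciding which points count as outliers:

```python
import statistics

# Invented sample of monthly order counts, including one extreme value
data = [355, 362, 368, 371, 375, 378, 381, 384, 388, 392, 396, 440]

# The three quartile cut points: bottom hinge, median, top hinge
q1, median, q3 = statistics.quantiles(data, n=4)
iqr = q3 - q1

# Points beyond 1.5 * IQR from the hinges are drawn as outlier dots;
# the whiskers extend to the most extreme values inside the fences
low_fence, high_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = [x for x in data if x < low_fence or x > high_fence]
whisker_min = min(x for x in data if x >= low_fence)
whisker_max = max(x for x in data if x <= high_fence)

print(f"median={median}, hinges=({q1}, {q3}), "
      f"whiskers=({whisker_min}, {whisker_max}), outliers={outliers}")
```

Here the 440 value falls outside the upper fence, so it would be plotted as a dot rather than stretching the whisker.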

I want to take a brief moment to touch on a fairly controversial subject of whether or not to include a zero value in your axes. When you have a non-zero baseline it distorts your data and differences are exaggerated. This can be misleading and might lead your audience into drawing inaccurate conclusions.

For example, a quick Google search revealed this image on AccuWeather showing the count of tornadoes in the U.S. for 2013-2016. At first glance it appears as though there were almost 3 times more tornadoes in 2015 than in 2013 and 2014, but that would be incorrect.

On the flipside, there are cases where slight fluctuations in the data are extremely important but are too small to be noticed when the axis extends to zero. Philip Bump did an excellent job demonstrating this in his "Why this National Review global temperature graph is so misleading" article in The Washington Post.

Philip begins his article with this chart tweeted by the National Review which appears to prove that global temperatures haven’t changed in the last 100 years. As he goes on to explain, this chart is misleading because of the scale used. The y-axis stretches from -10 to 110 degrees making it impossible to see a 2 degree increase over the last 50 years or so.

The general rule of thumb is that you should always start from zero. In fact, when you create a visualization in Tableau, it includes a zero by default. Usually, I agree with this rule, and the vast majority of the time I do include a zero, but I don't believe there can be a hard and fast rule, as there will always be an exception. Bar charts are used to communicate absolute values, so the size of the bar needs to be proportional to the overall value. I agree that bar charts should extend to zero, because if they don't, we distort what the data is telling us. With line charts and scatterplots, we tend to look at the positioning of the data points relative to each other. Since we're not as interested in the absolute value of the data, I don't feel the decision to include a zero or not is as cut and dried.

The issue boils down to what it is you’re trying to communicate with your chart. In this particular case, I’m trying to highlight the uncertainty so the chart needs to draw attention to the range of that uncertainty. For this reason, I have not extended the axes in the above examples to zero. You are free to disagree with me on this, but as long as you’re not intentionally misleading your audience I feel that in instances such as these this rule can be relaxed.

These are only a few examples of the many ways to show uncertainty and variance within your data. Displaying the volatility of the data and giving viewers a level of confidence in the results is immensely powerful. Remember that while we can come up with the most amazing visualizations, if the results are misleading or misinterpreted and users draw inaccurate conclusions, what’s the point?

Categories: BI & Warehousing

Benefits of Executive Dashboards for Call Centers

Nilesh Jethwa - Mon, 2017-06-26 08:54

Managing a call center is no easy feat, hence, business intelligence tools such as an executive dashboard can be tremendously helpful.

Call center management needs constant attention. Managers should focus on the latest performance metrics and important industry KPIs. Taking these into consideration, arming company executives with a reliable call center dashboard is worth its weight in gold.

Keeping a watchful eye on all the activities of the call center that you're managing can help you increase overall productivity, and business intelligence tools will play a major role. Not only will you be able to reduce handling times significantly, but you'll also be able to meet the obligations of your service level agreements hassle-free.

With the call center industry being one of the most competitive these days, providing clients with the highest level of service is the ultimate goal, and you can assure service quality when you have an executive dashboard in your arsenal.

It works by increasing the visibility of real-time metrics that are crucial to your business. With a dashboard app, you can access vital information immediately and address any issues before they become potentially damaging to your company.

In any call center dashboard, there are several performance metrics that you have to pay close attention to and here are some of them:

Read more at http://www.infocaptor.com/dashboard/executive-dashboard-analytics-for-call-centers

Bank of America Selects Oracle Cloud for ERP and Financials

Oracle Press Releases - Mon, 2017-06-26 08:30
Press Release
Bank of America Selects Oracle Cloud for ERP and Financials

Redwood Shores, Calif.—Jun 26, 2017

Oracle (NYSE: ORCL) announced today that Bank of America (NYSE: BAC) has selected Oracle’s cloud-based ERP and financial applications for both its international general ledger and its broker-dealer systems. Oracle is the world leader in cloud ERP systems, and the company now has more than five thousand Fusion Financials Cloud applications customers worldwide.

“Bank of America is among the most sophisticated applications customers in the world,” said Mark Hurd, CEO, Oracle. “Companies in record numbers are migrating legacy ERP applications to Oracle ERP Cloud.  Oracle will deliver Bank of America the most modern, complete, global ERP applications in the world.”

“Our continued focus on technology modernization has positioned Bank of America as a contemporary technology company,” said Cathy Bessant, Chief Operations and Technology Officer. “Our agreement with Oracle is about our technology and culture; we’ll learn with and from each other to deliver the best solutions.  We anticipate 80 percent of our technology workloads will be delivered on the cloud within the next few years, and this solution for general ledger reflects our evolving strategy and the improvements in cloud security and economics.”

Contact Info
Jennifer Yamamoto
Oracle
+1.916.761.9555
jennifer.yamamoto@oracle.com
Jumana Bauwens
Bank of America
+1.213.345.9134
jumana.bauwens@bankofamerica.com
About Bank of America

Bank of America is one of the world's leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services. The company provides unmatched convenience in the United States, serving approximately 47 million consumer and small business relationships with approximately 4,600 retail financial centers, approximately 15,900 ATMs, and award-winning digital banking with approximately 35 million active users and more than 22 million mobile users. Bank of America is a global leader in wealth management, corporate and investment banking and trading across a broad range of asset classes, serving corporations, governments, institutions and individuals around the world. Bank of America offers industry-leading support to approximately 3 million small business owners through a suite of innovative, easy-to-use online products and services. The company serves clients through operations in all 50 states, the District of Columbia, the U.S. Virgin Islands, Puerto Rico and more than 35 countries. Bank of America Corporation stock (NYSE: BAC) is listed on the New York Stock Exchange.

About Oracle

The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Jennifer Yamamoto

  • +1.916.761.9555

Jumana Bauwens

  • +1.213.345.9134

Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Access Management, Worldwide

Oracle Press Releases - Mon, 2017-06-26 07:00
Press Release
Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Access Management, Worldwide Oracle positioned as a leader based on completeness of vision and ability to execute

Redwood Shores, Calif.—Jun 26, 2017

Oracle today announced that it has been named a leader in Gartner’s inaugural “Magic Quadrant for Access Management, Worldwide 2017” report[1]. Oracle believes its leadership position for Oracle Access Management further highlights the exceptional service enhancements introduced by Oracle Identity Cloud Service so far this year.

“We believe this recognition is a credit to Oracle’s commitment to build a comprehensive portfolio of PaaS security solutions, an integral piece of Oracle Cloud Platform,” said Peter Barker, senior vice president, Identity and Security at Oracle. “Our goal has been to deliver a suite of security solutions that customers can adapt and scale to help secure their digital business transformation. Oracle Identity Cloud Service is a core part of Oracle’s Identity-based Security Operations Center (SOC) framework that provides customers with an adaptive, machine learning-based security architecture.”

For this report, Gartner evaluated vendors with regard to their ability to provide a general set of access management (AM) functionalities across multiple use cases, and in multiple geographies and industries, and to do so by providing solid value for money as perceived by their customers. According to the report, “Leaders in the AM market generally have significant customer bases. They provide feature sets that are appropriate for current customer use-case needs. Leaders also show evidence of strong vision and execution for anticipated requirements related to technology, methodology or means of delivery; and they show evidence of how AM plays a role in a collection of related or adjacent product offerings. Leaders typically demonstrate solid customer satisfaction with overall AM capabilities, the sales process and/or related service and support.”

According to Gartner, "AM applies to technologies that use access control engines to provide centralized authentication, single sign-on (SSO), session management and authorization enforcement for target applications in multiple use cases. Access managers have evolved from early ‘web access managers.’ Target applications may have traditional web application architectures using web browsers and web application servers, and these applications may run on customers' premises or in the cloud.”

Download Gartner’s 2017 “Magic Quadrant for Access Management, Worldwide” here.

Oracle Access Management and Oracle Identity Cloud Service are part of the broader Oracle Identity Management platform.  Oracle was also recognized as a leader in Gartner’s February 2017 “Magic Quadrant for Identity Governance and Administration” for the fourth consecutive year. Oracle’s Access Management solutions are designed to help businesses secure hybrid, on-premises and cloud application deployments, while preserving a seamless user experience, centralized administration, and market-leading performance and scalability. Oracle Identity Cloud Service, which is Oracle’s next generation comprehensive identity platform, is cloud-native and designed to be an integral part of the Oracle Identity SOC that also includes Oracle Security Monitoring and Analytics Cloud Service, Oracle CASB Cloud Service and Oracle Configuration and Compliance Cloud Service.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

[1] Gartner, “2017 Magic Quadrant for Access Management, Worldwide,” by Gregg Kreizman, Anmol Singh, June 7, 2017

About Oracle

The Oracle Cloud delivers hundreds of SaaS applications and enterprise-class PaaS and IaaS services to customers in more than 195 countries and territories while processing 55 billion transactions a day. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Creating Custom Applications in EBS 12.2

Steven Chan - Mon, 2017-06-26 02:00

The least invasive method of fine-tuning your E-Business Suite environment to your users' needs is via personalizations.  If that approach is insufficient, you can create custom applications via Forms, Reports, and other technologies that segregate your customizations from the standard functionality provided with Oracle E-Business Suite.

If you're considering that approach for customizing EBS 12.2, see:

This Note covers:

  • Using adsplice to create a custom application
  • Verifying the creation of custom products in the database 
  • Running AutoConfig to set up application servers
  • Creating custom objects, including tables, indexes, and sequences
  • Propagating changes to your file systems

Related Articles

Categories: APPS Blogs

ADF BC Attribute - Collection Storage Mode Property

Andrejus Baranovski - Sun, 2017-06-25 14:45
I would like to describe one interesting property of an ADF BC attribute. This property is called Storage. There are two possible values: row (the default) and collection. By default an attribute value is saved in row storage, but alternatively it can be saved in collection storage. ADF BC implements collection storage using a map held in session scope. This allows the value to be retained even between ADF BC requests, which is ideal for transient attributes.

The sample application (ADFBCCheckboxApp.zip) implements a VO transient attribute to keep a checkbox value:


The VO is configured for Range Paging support. While the user navigates through UI table pages, VO range paging will re-execute and force the VO to replace rows (which results in losing transient attribute values):


This is how it looks. The user selects a checkbox and then navigates to another table page:


After navigating back, the checkbox value is lost (range paging mode re-executes the VO rowset to bring in the rows belonging to the current page):


To force the transient attribute value to stay, go to the Properties window for the attribute and scroll down to the last section. Select the property called Storage:


Change the value to collection. This will force ADF BC to store the value for this attribute in the session map:


The transient attribute value will now stay, even when the VO is re-executed in range paging mode and the VO rowset is refetched:

Importance of Business Intelligence for Education

Nilesh Jethwa - Sun, 2017-06-25 12:54

Providing quality education and running schools as a business are becoming more demanding these days. To maintain the efficiency and effectiveness of K-12 education, schools should really invest in reliable business intelligence tools.

Armed with the right BI tools that focus on several key performance indicators, they will be able to run the school well, as a business, and provide the necessary educational resources to those who depend on them for knowledge.

There is a lingering problem across the world wherein school administrators are expected to make magic with as little funding as possible. Usually, more funding is provided to facilities that deliver exemplary results in terms of student performance.

This is why the school board should monitor KPI metrics that will help them manage their operations and ensure that they provide the best educational resources to the students attending their facility. In doing so, they can deliver the kind of performance that will earn them the much-needed financial support to continuously improve their offerings.

School fees, student population, daily attendance, class sizes, program offerings, academic results, and the like are all essential key performance indicators. And with heavier competition to attract new students and supporters, a reliable performance dashboard becomes all the more necessary.

Read more at http://www.infocaptor.com/dashboard/how-important-is-business-intelligence-for-education

Asrock E3C224DI-14S IPMI not reachable

Frank van Bortel - Sun, 2017-06-25 05:48
Documentation missing

There's definitely some documentation missing on the IPMI settings. I managed to lock myself out of the IPMI (also known as 'Integrated Lights-Out', or ILO) management interface. I'm not sure how I managed to do that, but in my quest to find out how to restore divine powers, I noticed quite a lot of people have suffered from this. And the solution is quite simple, when you know how.

Cross platform migration using GG

Tom Kyte - Sat, 2017-06-24 23:46
Goldengate is primarily sold as replication software, I understand. Can you clarify whether GG can be used to migrate from another database platform to Oracle, say MSSQL for example? Or is it required to use SQL Dev to convert the metadata from...
Categories: DBA Blogs

(bash): The most useless commands (6)

Dietrich Schroff - Sat, 2017-06-24 08:35
After talking about
  1. rev
  2. sl
  3. cowsay
I found another funny command: cmatrix. The manpage says:

NAME
       CMatrix - simulates the display from "The Matrix"

and it is true:

Package Calling Error

Tom Kyte - Sat, 2017-06-24 05:45
Hi, I created a package Order_headers with two procedures, price and quantity, in the TAN schema. I called them in another procedure like this: tan.order_headers.price; but I got an error like PLS-00302: component 'order_headers' must be declared. but...
Categories: DBA Blogs

based on multiple packages and procedure

Tom Kyte - Sat, 2017-06-24 05:45
Can we access a procedure in a package from outside that package, using another package? i.e., I need to call a procedure in the first package from the second package.
Categories: DBA Blogs

Using ZFS compression for all database files

Tom Kyte - Sat, 2017-06-24 05:45
Hi Tom, I am hoping for some expert advice on the above, We have a T5 server using SAN storage which is allocated by ZFS. We have Solaris 11 OS and Oracle Database 12c with single instances (no RAC) and we do not use ASM, our database is limi...
Categories: DBA Blogs

Passion, Drive and Hunger

Bradley Brown - Fri, 2017-06-23 14:21
Starting a business is nothing short of an ultra marathon, times 365.  If you ask people why they do it, you'll get a wide variety of answers.  Some people might say "the money," but behind almost any answer you receive is an iceberg underneath (90% of an iceberg is under the water).  Others will say they are simply crazy.  The true reason is very complex.  It's not usually one reason, but passion, drive and hunger are key components of one's success.

If you like challenges, you'll have plenty.  I view the startup world as a 6-dimensional puzzle.  Move one piece and 6 others (or 1,600,000) move with it.  Time is one of the dimensions.  Sometimes you see results quickly and other times results take years.  Sometimes you have a "feeling" something is going to work and other times you'll have empirical data.

I always tell people that if you work your ass off for 20 years straight, you'll wake up an overnight success one day!

So what are you going to do?  Keep dreaming or make your dream a reality?  Do you have a mentor that you can turn to?  1 or 100 mentors?  Friends and family are leaders in self deception...i.e. do you have someone who will give you the real answer?  The shredded wheat answer?  Have you asked a customer for money?  That's usually the best indication of whether you have something real or not.

I started my first business when I was in high school.  Sure, lawn mowing was a business, but I started developing software (i.e. games) and got my 10,000 hours of programming in before I started college.  I sold my games to the manufacturer of my computer when I was in college.  That was exciting!  My passion, drive and hunger continues.

I taught a class called New Venture Creation at the University of Denver.  It was a 4 credit course.  Since college credits cost about $1000 per credit hour (or more), that's $4000 per student.  I brought 20+ mentors (successful business people) into my classroom every quarter.

As a mentor and director in an organization like Founders Institute, along with 30-50 other mentors, we coached a number of amazing people into successful businesses.  I absolutely believe FI is one of the best investments into your future.  You can expect mentors, education and growth through the process.

SQR with 077 umask creates file with 611 permissions

Bobby Durrett's DBA Blog - Fri, 2017-06-23 11:18

I ran across this strange situation. An SQR opened a new data file for output and created the file with 611 permissions. We needed group read access so this caused a problem. I knew that our login script for the PeopleSoft Unix user set umask to 022 but that without running the login scripts the mask is 077. So, my first thought was that we had started the process scheduler without running the login scripts and the mask was 077. But, why would the file that the SQR created be 611 permissions and not 600? The 077 mask should get rid of all the group and others bits. I built a simple test case to show that the SQR creates the file with 611 permissions with a 077 mask.

Here is the test SQR:

begin-report
  Let $Unix_Cmd = 'umask'
  Call System Using $Unix_Cmd #Status
  Let $Unix_Cmd = 'rm /tmp/bobby/bobby.txt'
  Call System Using $Unix_Cmd #Status
  let $testfile = '/tmp/bobby/bobby.txt'
  open $testfile as 1 for-writing  record=1500:fixed
  close 1
  Let $Unix_Cmd = 'ls -l /tmp/bobby/bobby.txt'
  Call System Using $Unix_Cmd #Status
end-report

Here is its output:

SQR for PeopleSoft V8.53.05
077
-rw---x--x   1 psoft      psoft            0 Jun 23 10:54 /tmp/bobby/bobby.txt

Notice the execute bits for group and others.  Why were they not masked out? Also, the default permissions for a newly created file are 644, so creating a new file should not set the execute bits at all, no matter what mask you are using.
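The expected arithmetic can be sanity-checked directly: a process creates a file by requesting a mode (666 for a typical data file) and the kernel clears every bit that is set in the umask. A short Python sketch of that masking (the requested modes are illustrative):

```python
def apply_umask(requested_mode: int, umask: int) -> int:
    # The kernel clears every permission bit that is set in the umask
    return requested_mode & ~umask

# What the Korn shell test showed: a 666 request under a 077 mask -> 600
assert apply_umask(0o666, 0o077) == 0o600
# And under the 022 mask that fixed the problem: 666 -> 644
assert apply_umask(0o666, 0o022) == 0o644

# No requested mode whatsoever leaves group/other bits set under a 077
# mask, so SQR's 611 result cannot come from ordinary umask handling alone
assert all(apply_umask(mode, 0o077) & 0o077 == 0 for mode in range(0o1000))
print("umask arithmetic checks out")
```

Which is exactly why the 611 result below is so puzzling: the execute bits must be getting set after the file is created, outside the normal open/umask path.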

I created a Korn shell script to do the same thing as the SQR:

umask
rm /tmp/bobby/bobby.txt
touch /tmp/bobby/bobby.txt
ls -l /tmp/bobby/bobby.txt

Here is its output:

077
-rw-------   1 psoft      psoft            0 Jun 23 10:58 /tmp/bobby/bobby.txt

Notice that there are no group and others bits which is what I expected with a 077 mask. I tried searching the Internet for SQR and 611 permissions but could not find anything.

As it turns out, we did start the process scheduler with umask 077 so I just modified the script that started it to set umask 022 and that resolved the problem. Here is the output from my test SQR with umask 022:

SQR for PeopleSoft V8.53.05
022
-rw-r--r--   1 psoft      psoft            0 Jun 23 11:01 /tmp/bobby/bobby.txt

This is what we wanted and of course the Korn shell script does the same thing as it should.

022
-rw-r--r--   1 psoft      psoft            0 Jun 23 11:02 /tmp/bobby/bobby.txt

Seems very odd to me. Anyway, I hope that this post helps someone.

This was on HP-UX 11.31 and PeopleTools 8.53.05.

Bobby

Categories: DBA Blogs

Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Mobile App Development Platforms

Oracle Press Releases - Fri, 2017-06-23 10:00
Press Release
Oracle Named a Leader in the 2017 Gartner Magic Quadrant for Mobile App Development Platforms Oracle positioned as a leader based on ability to execute and completeness of vision

Redwood Shores, Calif.—Jun 23, 2017

Oracle today announced that it has been named a leader in Gartner’s 2017 “Magic Quadrant for Mobile Application Development Platforms” report[1]. This recognition is another milestone in the tremendous momentum and growth of Oracle Mobile Cloud Service, part of Oracle Cloud Platform, this year.

“We believe this recognition is another acknowledgement of Oracle’s strong market momentum in the mobile and larger PaaS market, driven by the successful adoption of Oracle Cloud Platform offerings by thousands of customers,” said Amit Zavery, senior vice president, Oracle Cloud Platform. “By delivering a comprehensive offering with key differentiating capabilities, such as chatbot support, predictive analytics, and adaptive intelligence to make apps more contextual and smarter, Oracle has given customers a powerful option to meet their ever-evolving development needs.”

Gartner positions vendors within a particular quadrant based on their ability to execute and completeness of vision. According to Gartner, leaders must represent a strong combination of Ability to Execute and Completeness of Vision. In the MADP sector, this means that Leaders are not only good at cross-platform development, deployment and management across the full life cycle, but also have a good vision of the omnichannel and post-app requirements, support for multiple architectures and standards, a solid understanding of IT requirements, and scalable sales channels and partnerships. Leaders must provide platforms that are easy to purchase, program, deploy and upgrade, and which can connect to a range of systems of records and third-party cloud services.

Oracle was recognized as a market leader for more than doubling its customer count for Oracle Mobile Cloud Service. A cloud-based Mobile Application Development Platform (MADP), Oracle Mobile Cloud Service provides a no-code approach to app development. It has been deployed by many midsize and large enterprise customers who praised the product’s integration capabilities and cloud architecture. Oracle continues to expand its mobile cloud platform with chatbot support, expanded analytics, and high-productivity development tools.

Download Gartner’s 2017 “Magic Quadrant for Mobile Application Development Platforms” here.

Oracle Mobile Cloud Service is an open, standards-based platform built from the ground up for the cloud to help developers build better apps faster. It was designed to help customers engage with their customers and employees across web, mobile, wearables, and the growing bot channel with conversational interfaces. Its persona-based approach provides the best tools for each persona involved in a mobile solution, enabling a collaborative, mobile-first and API-first solution. It offers a rich set of mobile services that make it easy for mobile developers to add mobile context to their apps, such as push notifications, stateless offline sync, and location-based services that support beacons and GPS. Additionally, Oracle Mobile Application Accelerator provides business-friendly drag-and-drop tools to build engaging mobile applications without writing any code. The depth and breadth of its mobile analytics give both developers and business owners deep insight into mobile application performance, usage and adoption, so they can personalize engagement with their mobile users and increase adoption of their mobile applications. Oracle Mobile Cloud Service provides complete software development life cycle (SDLC) support for mobile app developers in the cloud as one integrated solution.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Additional Information

[1] Gartner, “2017 Magic Quadrant for Mobile Application Development Platforms,” Jason Wong | Van L. Baker | Adrian Leow | Marty Resnick, 12 June 2017

Contact Info
Nicole Maloney
Oracle
+1.415.235.4033
nicole.maloney@oracle.com
Sarah Fraser
Oracle
+1.650.743.0660
sarah.fraser@oracle.com
About Oracle

The Oracle Cloud delivers hundreds of SaaS applications and enterprise-class PaaS and IaaS services to customers in more than 195 countries and territories while processing 55 billion transactions a day. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Nicole Maloney

  • +1.415.235.4033

Sarah Fraser

  • +1.650.743.0660

OBIEE 12c Catalog Validation: Command Line

Rittman Mead Consulting - Fri, 2017-06-23 08:49

I wrote a blog post a while ago describing catalog validation: an automated process that performs a consistency check of the catalog and reports or deletes the inconsistent artifacts.
In that post I stated that catalog validation should be run regularly as part of the cleanup routines, and that it provides valuable additional information during the pre- and post-upgrade phases.

However, some time later I noticed Oracle's support Doc ID 2199938.1, stating that the startup procedure I detailed in the previous blog post is not supported in any OBI release since 12.2.1.1.0. You can imagine my reaction...


The question then became: how do we run catalog validation if the known procedure is unsupported? The answer lies in Catalog Manager and the related command-line call runcat.sh, which in server installations (like SampleApp v607p) can be found under $DOMAIN_HOME/bitools/bin.

How Does it Work?

As with most command-line tools, when you don't have a clue how it works, the best approach is to run it with the -help option, which provides the list of parameters to pass.

Catalog Manager understands commands in the following areas:

Development To Production  
createFolder        Creates folder in the catalog  
delete            Deletes the given path from the catalog  
maintenanceMode        Puts the catalog into or out of Maintenance Mode (aka ReadOnly)  
...

Multi-Tenancy  
provisionTenant        Provisions tenants into a web catalog  
...

Patch Management  
tag            Tags all XML documents in a catalog with a unique id and common version string  
diff            Compares two catalogs  
inject            Injects a single item to a diff file  
...

Subject Area Management  
clearQueryCache        Clears the query cache  

Unfortunately none of the options in the list seems relevant to catalog validation, but a closer look at the recently updated Doc ID 2199938.1 revealed the parameter to pass: validate.
The full command then looks like

./runcat.sh -cmd validate

In my previous blog I mentioned different types of validation. What type of validation is the default command going to implement? How can I change the behaviour? Again the -help option provides the list of instructions.

# Command : -cmd validate -help 

validate        Validates the catalog

Description  
Validates the catalog

For more information, please see the Oracle Business Intelligence Suite  
Enterprise Edition's Presentation Services Administration Guide.

Syntax  
runcat.cmd/runcat.sh -cmd validate  
    [ -items (None | Report | Clean) [ -links (None | Report | Clean) ] [-folder <path{:path}>] [-folderFromFile <path of inclusion list file>] ] 
    [ -accounts (None | Report | Clean) [ -homes (None | Report | Clean) ] ] 
    -offline <path of catalog> 

Basic Arguments  
None

Optional Arguments  
-items (None | Report | Clean)        Default is 'Report' 
-links (None | Report | Clean)        Default is 'Clean'. Also, '-items' cannot be 'None'. 
-accounts (None | Report | Clean)        Default is 'Clean' 
-homes (None | Report | Clean)        Default is 'Report'. Also, '-accounts' cannot be 'None'. 
-folder <path{:path}>            Which folders in the catalog to validate
-folderFromFile <path of inclusion list file>            File containing folders in the catalog to validate

Common Arguments  
-offline <path of catalog>

-folderFromFile <folder from file>        ----- Sample Folder From File ------
                        /shared/groups/misc
                        /shared/groups/_filters
                        ------------------------------------

Example  
runcat.cmd/runcat.sh -cmd validate -offline c:\oraclebi\data\web\catalog\paint  

A few things to notice:

  • -offline: the catalog validation needs to happen offline, either with services down or on a copy of the live catalog. Running catalog validation on an online catalog is dangerous, especially with "Clean" options, since it could delete content that is still in use.
  • -folder: the catalog validation can be run for just a subset of the catalog.
  • None | Report | Clean: each validation can be skipped (None), logged (Report) or solved by removing the inconsistent object (Clean).
  • Also, '-accounts' cannot be 'None'.: some validations are a prerequisite for others to happen.
  • Default is 'Clean': some validations have "Clean" as the default value, meaning they will resolve the issue by removing the inconsistent object; this may be inappropriate in some cases.
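As a hedged example, validation can also be scoped to a subset of the catalog with -folderFromFile. The folder paths below mirror the sample in the help output, and the offline catalog path is a placeholder — substitute your own offline copy and run from $DOMAIN_HOME/bitools/bin:

```shell
# Build an inclusion-list file with the folders to validate
# (paths taken from the sample in the help output).
cat > /tmp/folders_to_validate.txt <<'EOF'
/shared/groups/misc
/shared/groups/_filters
EOF

# Validate only those folders, in Report-only mode; the catalog path
# is a placeholder for your own offline copy.
./runcat.sh -cmd validate \
  -items Report -links Report -accounts Report -homes Report \
  -folderFromFile /tmp/folders_to_validate.txt \
  -offline /path/to/offline/catalog \
  || echo "run this from \$DOMAIN_HOME/bitools/bin on the server"
```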

As written before, the initial catalog validation should be done with all options set to Report, since this provides a log file of all inconsistencies without deleting pieces of the catalog that could still be valuable. To do so, the command to execute is:

./runcat.sh -cmd validate -items Report -links Report -accounts Report -homes Report -offline <path_to_catalog> > cat_validation.log

The runcat.sh output is displayed directly in the console; I'm redirecting it to a file called cat_validation.log for further analysis.

If, after the initial run with all options set to Report, you want the catalog validation utility to "fix" the inconsistent objects, just change the desired options to Clean. Please make sure to take a backup of the catalog first, since the automatic fix works by removing the related objects. Moreover, ensure that catalog validation works on an offline catalog: the command itself can run against an online catalog, but it is never a good idea to check a catalog that could be changed while the tool is running.
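Putting this together, a cautious Clean pass could look like the following sketch. All paths are examples (the mkdir simply stands in for your offline catalog copy), and the runcat.sh invocation assumes you run it from $DOMAIN_HOME/bitools/bin:

```shell
# Hedged sketch: back up the offline catalog copy before any Clean run.
CATALOG=/tmp/offline_catalog_copy
BACKUP=/tmp/catalog_before_clean_$(date +%Y%m%d%H%M%S).tar.gz

mkdir -p "$CATALOG"    # stands in for your real offline catalog copy
tar -czf "$BACKUP" -C "$(dirname "$CATALOG")" "$(basename "$CATALOG")"

# Clean items and links, but keep accounts/homes in Report mode so user
# content is only logged, never deleted automatically.
./runcat.sh -cmd validate -items Clean -links Clean \
  -accounts Report -homes Report \
  -offline "$CATALOG" > cat_validation_clean.log 2>&1 \
  || echo "runcat.sh not found here -- run from \$DOMAIN_HOME/bitools/bin"
```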

The output

Let's see a few examples of how catalog validation spots inconsistent objects. For the purpose of this test I'll work with Oracle's SampleApp.

Abandoned and inaccessible homes

Running the validation against the SampleApp catalog provides some "interesting" results: some homes are declared "abandoned". This could be due to the related user no longer existing in the WebLogic console, but that's not the case here.

E10    saw.security.validate.homes Abandoned home /users/weblogic  

Looking deeper into the logs, we can see that the same user folder is flagged as

User facing object '/users/weblogic' has no user permissions and is inaccessible  

Logging in as the user weblogic doesn't allow me to check the "My Folders" content in the catalog. When switching to "Admin View" and trying to open "My Folders" I get the following error:

[screenshot]

As the logs suggest, it looks like the user folder has permission problems. How can we solve this? One option is to use the runcat.sh command again with the forgetAccounts option to remove the inconsistent homes. However, this solution deletes all of the user's content stored under "My Folders".

In order to keep the content we need to overwrite the folder's permission with an administrator account. Unfortunately, when right-clicking on the folder, the "Permission" option is not available.

[screenshot]

As a workaround I found that clicking on Properties and then on Set Ownership of this item and all subitems allows you to grant full access to the administrator, who is then able to reassign the relevant access privileges to the proper user.

[screenshot]

Once the workaround is implemented the user is able to check his "My Folders" content; however, the errors are still present in catalog validation. The solution is to store the relevant artifacts in another part of the catalog, run runcat.sh with the forgetAccounts option, and then reimport the objects if needed.

Inconsistent Objects

The main two reasons generating inconsistent objects are:

  • Invalid XML: The object (analysis or dashboard) XML code is not valid. This can be caused by errors during the write to disk or problems during migrations.
  • Broken Links: analyses contained in a dashboard or linked from other analyses have been renamed or deleted.

Let's see how catalog validation shows the errors.

Invalid XML

To test this case I created a simple analysis with two columns, then went to the Advanced tab and deliberately removed a > to make the XML invalid.

[screenshot]

When trying to apply the change I got the following error, which prevented me from saving.

[screenshot]

Since I really wanted to ruin my analysis, I went to the file system under $BI_HOME/bidata/service_instances/ssi/metadata/content/catalog/root/shared/$REQUEST_PATH and changed the XML directly there.

After that I ran the catalog validation with only the items flag set to Report and the rest set to None, since I'm looking only for invalid XML.
The result as expected is:

Message: Unterminated start tag, 'saw:column', Entity publicId: /app/oracle/biee/user_projects/domains/bi/bidata/service_instances/ssi/metadata/content/catalog/root/shared/rm+demo/notworkinanalysis, Entity systemId: , Line number: 9, Column number: 13  

This tells me that my analysis notworkinganalysis is invalid due to an unterminated start tag: exactly the error I was expecting. Now I have two choices: either fix the analysis XML manually or rerun the catalog validation with the Clean option, which will delete the analysis since it's invalid. As said before, there is no automated fix.

I wanted to do a further example of this: instead of removing the >, I removed a quotation mark (") to make the analysis invalid.

[screenshot]

After clicking Apply, OBIEE already tells me that there is something wrong with the analysis. But since it allows me to save, and since I was feeling masochistic, I saved the analysis anyway.

[screenshot]

But... when running the catalog validation as before I end up seeing 0 errors related to my notworkinganalysis.

[screenshot]

The answer to the Jackie Chan question is that I got 0 errors because in this second case the XML is still valid: removing a " doesn't make the XML syntax invalid! To find and solve that error we would need to use Oracle's Baseline Validation Tool.
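The difference between the two failures can be reproduced without OBIEE at all by running a plain XML well-formedness check over mock catalog files (sketched below with Python's standard library; the file names and XML snippets are invented for illustration):

```shell
# Mock catalog objects: both files are invented for illustration only.
workdir=$(mktemp -d)

# Case 1: unterminated start tag -- not well-formed XML.
printf '<saw:report xmlns:saw="urn:saw"><saw:column</saw:report>' \
  > "$workdir/notworkinganalysis"

# Case 2: a quote missing from a formula in text content -- still
# well-formed XML, so a syntax check reports nothing.
printf '<saw:report xmlns:saw="urn:saw"><saw:formula>"Products.Brand</saw:formula></saw:report>' \
  > "$workdir/quotemissing"

for f in "$workdir"/*; do
  if python3 -c 'import sys, xml.etree.ElementTree as ET; ET.parse(sys.argv[1])' "$f" 2>/dev/null
  then echo "well-formed:     $(basename "$f")"
  else echo "NOT well-formed: $(basename "$f")"
  fi
done
# Prints:
# NOT well-formed: notworkinganalysis
# well-formed:     quotemissing
```

The unterminated tag breaks XML syntax and is caught, while the missing quote sits in text content and leaves the XML well-formed, which is exactly why a syntax-level check stays silent on it.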

Broken Links

To test the broken links case I created the following scenario:

  • Analysis SourceAnalysis, which has a navigation action to TargetAnalysis

[screenshot]

  • Dashboard TestDashboard, which contains the TargetAnalysis object.

To break things, I then deleted TargetAnalysis.

[screenshot]

Running catalog validation with the links option set to Report, I get the expected line:

N1    saw.catalog.impl.scour.validateDeadLink Referenced path /shared/RM Demo/TargetAnalysis in file /shared/RM Demo/_portal/TestDashboard/page 1 is inaccessible.  

But I don't get anything about the SourceAnalysis object, whose navigation action is now broken.

[screenshot]

But if instead of an action link I use TargetAnalysis to filter the results of SourceAnalysis

[screenshot]

And then delete TargetAnalysis, I get the expected error:

N1    saw.catalog.impl.scour.validateDeadLink Referenced path /shared/RM Demo/TargetAnalysis in file /shared/RM Demo/SourceAnalysis is inaccessible

Summarizing: the broken-link validation reports missing objects that are included in the main definition of other objects (as filters or as parts of dashboards), but doesn't seem to report missing objects that are only linked via an action.
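One low-tech workaround, sketched below against a mock catalog file (the file name, XML snippet and deleted path are invented for illustration), is to grep the raw files of an offline catalog copy for the deleted object's path; this catches references wherever they live, including inside action definitions:

```shell
# Mock catalog: one file containing an action-style reference to the
# deleted analysis. On a real system, point grep at the root of an
# offline catalog copy instead of $demo.
demo=$(mktemp -d)
printf '<saw:navigation path="/shared/RM Demo/TargetAnalysis"/>' \
  > "$demo/sourceanalysis"

DELETED='/shared/RM Demo/TargetAnalysis'
grep -rl "$DELETED" "$demo"   # lists every file still referencing the path
```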

Conclusion

My experiments show that catalog validation finds some errors (such as invalid homes, invalid XML files and broken links) that users would otherwise hit at run time, and that won't make them happy. There are still some errors it doesn't log, like analyses with wrong column syntax; luckily, in most cases other tools like the Baseline Validation Tool can spot them easily. So use all the tools you have, use them as frequently as possible, and if you want more details about how catalog validation works and how it can be included in automated checks for code promotions, don't hesitate to contact us!

Categories: BI & Warehousing

Video: 2 Minute Integration with Oracle Integration Cloud Service

OTN TechBlog - Fri, 2017-06-23 07:00

More than just a tip, this video from Oracle ACE Robert van Molken and ACE Associate Phil Wilkins actually demonstrates that you can set up an integration in Oracle Integration Cloud Service in under two minutes. The video walks you step by step through a simple integration that uses REST and SOAP connections integrated using basic map data.

And these guys should know: Robert and Phil are the co-authors of Implementing Oracle Integration Cloud Service (2017, Packt Publishing), and Robert has another book, Blockchain Across Oracle, coming out next year, also from Packt.

Additional Resources

Partner Webcast – Delivering Contextual Collaboration and Engagement with Oracle Content and ...

The key to digital business transformation is the productivity tools and services that employees rely on to deliver business value. However, today's digital workplace requires a new set of...

We share our skills to maximize your revenue!
Categories: DBA Blogs
