Feed aggregator

The Importance of BI Documentation

Rittman Mead Consulting - Thu, 2016-03-17 05:00
Why Is BI Documentation Important?

Business intelligence systems come with a lot of extra information. Even beautifully constructed analyses have piles of background information and histories. Administrators often have memos and updates that they’d like to share with analysts. Sales figures might have anomalies that need further explanation. But OBIEE does not currently have any options for BI Documentation inside the dashboard.

Let’s say a BI user for a cell phone distribution company is viewing a report comparing the yearly sales figures for several different cell phones. If the analyst notices that one specific cell phone is outperforming the others, but doesn’t know what makes that specific model unique, then they have to go searching for that information.


But what if the individual phone model specifications and advertising and marketing histories were already included as reports inside the dashboard? What if the analyst, with only a couple of clicks, discovered that the reason one cell phone was outperforming the others was due to its next-gen screen, camera, and chip upgrades, which proved popular with consumers? Or what if the analyst discovered that the popular phone, while containing outdated peripherals, was selling so well because of a Q3 advertising push for that model alone? All of this information might not be contained in the dashboard’s visuals, but it greatly affects the analyst’s understanding of the reports.

Current Options for OBIEE Documentation

Some information can be displayed as visuals, but many times this isn’t a practical solution. Besides making dashboards too cluttered, memos, product descriptions, company directories, etc., are not practical as charts and graphs. As of right now, important documentation can be stored in a wide range of places outside of the BI dashboard, but the operating reality at most organizations means that important information is spread across several locations and not always accessible to the people who need it.


Workarounds are inefficient, cost time, cause BI users to leave the BI environment (potentially reducing usage), and increase frustration. If an analyst has to email several different people to locate the information she wants, that complicates her workflow and produces extraneous communications (who likes answering emails?). Before now, there wasn’t an easy solution to these problems.

ChitChat’s BI Documentation Features

With ChitChat, it’s now possible to store critical documentation where it belongs—at the source of the conversation. Keep phone directories, memos from administrators (or requests from analysts to administrators), product descriptions, analytical histories—really, the possibilities are endless—inside the dashboard where they are accessible to the people who need them. Shorten workflows and make life easier for your BI users.

ChitChat’s easy-to-use functionality allows BI users to copy and paste or write (ChitChat has a built-in WYSIWYG text editor) important information inside the BI dashboard, creating a quicker path to insightful and actionable analytics. And isn’t that the goal in the end?

To learn more about ChitChat’s many commentary features, or to request a demo, click here.

The post The Importance of BI Documentation appeared first on Rittman Mead Consulting.

Categories: BI & Warehousing

Links for 2016-03-16 [del.icio.us]

Categories: DBA Blogs

GDC16 Day 3: Another Day of Fun & Data!

Oracle AppsLab - Thu, 2016-03-17 01:17

Early morning view of the GDC16 Expo Hall.

The Expo opened today and will be open until the end of Friday! There was a lot to see and do! I managed to explore about a third of the space. Walking in, we have the GDC Store to the left and the main floor below the stairs. Upon entering the main floor, Unity was smack dab in the center. It had an impressive setup, but not as impressive as the Oculus area or Clash of Kings.

Built to look like a store :O

Clash of Kings. The biggest booth of all booths. They brought the game to real life with hired actors!

There were a lot of demos you could play, with many different types of controllers. Everyone was definitely drinking the VR Kool-Aid. Because of the popularity of some of the sessions, reservations for a play session are strongly encouraged. Most, if not all, of the sessions were already booked for the whole day by noon. I managed to reserve the PS VR play session for tomorrow afternoon by scanning a QR code to their scheduling app!

The main floor was broken up into pavilions with games grouped by their respective countries. It was interesting to overhear others calling their friends to sync up and saying “I’m in Korea.” Haha.

I spent the rest of the time walking around the floor and observing others play.

Fly like a bird! #birdly #GDC16 pic.twitter.com/oeHUnmfhgp

— Tawny (@iheartthannie) March 16, 2016

I did get a chance to get in line for an arcade ride! My line buddy and I decided to get chased by a T-Rex! We started flying in the air as a Pterodactyl. The gleeful flight didn’t last long. The T-Rex was hungry and apparently really wanted us for dinner. It definitely felt like we were running quickly, trying to get away.

Another simulation others tried that we didn’t was a lala land roller coaster. In this demo, players can actually see their hand on screen.

Waiting to try out sim arcade ride that senses your hands! Fairytale coaster ride w/ bunny companion in tow #GDC16 pic.twitter.com/Y8mqDs2ILg

— Tawny (@iheartthannie) March 16, 2016

Sessions & Highlights

Playstation VR. Sony discusses development concepts, design innovations and what PS VR is and is not. I personally liked the direction they are going for collaboration.

  • Design with 2 screens in mind. For console VR, you may be making 2 games in 1. One in VR and one on TV. You should consider doing this to avoid having one headset per player and to allow for multiplayer cooperation. Finding an art direction for both is hard. Keep it simple for good performance.
  • Make VR a fun and social experience. In a cooperative environment, you get 2 separate viewpoints of the same environment (mirroring mode) or 2 totally different screen views (separate mode). This means that innovation between competitive and Co-op mode is possible.

The AppsLab team and I have considered this possibility of a VR screen and TV screen experience as well. It’s great that this idea is validated by one of the biggest console makers.

A year of user engagement data. A year’s worth of game industry data, patterns and trends was the theme of all the sessions I attended today.

  • There are 185 million gamers in the US. Half are women.
    • 72 million are console gamers. Of those console owners the average age is ~30 years old.
    • There are 154 million mobile gamers. This is thanks to the rise of free-2-play games. Mobile accessibility has added diversity to the market and brought a new group of players. Revenues grew because of broad expansion. The average age for the mobile group is ~39.4 years old.
    • There are 61 million PC gamers thanks to the rise of Steam. These gamers tend to be younger at an average age of ~29.5yrs.
  • There are different motivations as to why people play games. There are two groups of players: core vs. casual. Universally, casual players play games primarily to pass the time while waiting and as a relaxing activity.
  • There is great diversity within the mobile market. There is an obvious gender split between what females and males play casually. Females tend to like matching puzzle (Candy Crush), simulation and casino games while males tend to like competitive games like sport, shooter and combat city builder games.
  • When we look internationally, players in Japan have less desire to compete when playing games. Successful games there tend to be built around cooperation.
  • Most homes have a game console. In 2015, 51% of homes owned at least 2 game consoles. At the start of 2016, there was an increase of 40% in sales for current 8th generation game consoles (PS4, Xbox One, etc minus the Wii).
  • Just concentrating on mobile gamers, 71% play games on both their smart phone and tablet, 10% play only on their tablet.
  • Top factors leading to churn are lack of interest, failure to meet expectations and too much friction.
  • Aside from Netflix and maybe YouTube, Twitch gobbles up more prime-time viewers than anyone else, with almost 700K concurrent viewers as of March 2016. Its viewership is increasing despite competition from the launch of YouTube Gaming.

Day 1 — User research round table. This was my first round table at GDC, and it was nice to be among those within the same profession. We covered user research for VR, preventing bias and testing with kids! Experts shared their failures on these topics and offered suggestions.

  • Testing for Virtual Reality.
    • Provide players with enough time warming up in the new environment before asking them to perform tasks. Use the initial immersive exposure to calibrate them.
    • Be ready to pull them out at any indication of nausea.
    • Use questionnaires to screen out individuals who easily get motion sickness.
    • It’s important to remember that people experience sickness for different reasons. It’s hard to eliminate all the variables. Some people can have vertigo or claustrophobia that’s not necessarily the fault of the VR demo. There is a bias toward that in media. People think they are going to be sick so they feel sick.
    • Do not ask people if they feel sick before the experience else you are biasing them to be sick.
    • Individuals are only more likely to feel sick if your game experience does not match their expectations. Some people feel sick no matter what.
    • One researcher tested 700–800 people in VR. Only 2 people said that they felt sick; 7–8 said they felt uncomfortable.
    • An important question to ask is “At what point do they feel sick?” If you get frequent reports at one specific point versus generalized reports, then you can do something to make the game better.
  • Bias.
    • Avoid bragging language. Keep questions neutral.
    • Separate yourself from the product.
    • Remember participants think that you are an authority. Offload instructions to the survey rather than relaying them yourself; delivering them in person is going to bias the feedback.
    • Standardize the experiment. Give the same spiel.
    • The order of questions is important.
    • Any single geographic region is going to introduce bias. Only screen out regions if you think culture is going to be an issue.
  • Testing with kids.
    • It’s better to test with 2 kids in a room. Kids are not good at verbalizing what they know and do not know. Having 2 kids allows you to see them verbalize their thoughts to each other as they ask questions and help each other through the game.
    • When testing a group of kids at once, assign the kids their station and accessories. Allowing them to pick will end up in a fight over who gets the pink controller.
    • Younger kids can’t handle granular scales, so allow for 2 clear options on surveys. A thumbs up and thumbs down works.
    • Limit kids to one sugary drink or you’ll regret it.


Something Amiss

Greg Pavlik - Thu, 2016-03-17 00:02
This curious non-review of the novel Laurus seems to have originally referred to "Brahmins" as "Brahman" - I suppose it was republished to correct the mistake:
​Russian Brahman by Alan Jacobs | Articles | First Things
www.firstthings.com/article/2016/04/russian-brahman
First Things
Russian Brahman. by Alan Jacobs April 2016. Laurus by eugene vodolazkin translated by lisa hayden oneworld, 384 pages, $24.99. Eugene Vodolazkin's ...
​Russian Brahmin by Alan Jacobs | Articles | First Things
www.firstthings.com/article/2016/04/russian-brahmin
First Things
Russian Brahmin. by Alan Jacobs April 2016. Laurus by eugene vodolazkin translated by lisa hayden oneworld, 384 pages, $24.99. Eugene Vodolazkin's ...

Whatever his grasp of Hindu concepts, it's obvious Jacobs knows little to nothing about the tradition of Russian yurodivy, which makes this review overall kind of silly at best. Interested readers can refer to the hagiographies of Xenia of Petersburg or Feofil of the Kiev Caves Lavra to become acquainted with some of the conceptual background to the novel, both published by the monastery press in Jordanville, NY in English. As a complement the Pavel Lungin movie Ostrov is worth watching carefully - the film is based partly on Feofil, though like the life of St Xenia, it explores the theme of vicarious repentance. (It was not until the third time I saw the film that I fully grasped it - the visuals are stunning and in many respects a distraction.)

All of that aside, what continues to trouble me in general is the fact that most of the reviews of Laurus that I've seen have been oriented toward theological critiques - endorsements or arguments revolving around the reviewer's reading of what the author might want us to think about religion. And yet it is obvious that Vodolazkin did not write a religious apologetic (Jacobs invokes Karamazov, which is simultaneously a religious argument and a humanistic work - but Laurus is anything but the former). Laurus deserves a review as a work of notable - even great - world literature: which is to say first and foremost an exploration of what Vodolazkin is attempting to accomplish as a writer and what that has produced as a work of literature. The lack of serious analysis is particularly puzzling given the devices Vodolazkin uses to deal with language, identity, personality, relationship, and - yes - time. We could do with fewer sermons and a bit more thought.

Designing PL/SQL Programs

Andrew Clarke - Wed, 2016-03-16 18:57
When I started out, in COBOL, structured programming was king. COBOL programs tended to be lengthy and convoluted. Plus GOTO statements. We needed program design to keep things under control.

So I noticed the absence of design methodologies when I moved into Oracle. At first it didn't seem to be a problem. SQL was declarative and self-describing, and apparently didn't need designing. Forms was a 4GL and provided its own structure. And PL/SQL? Well that was just a glue, and the programs were so simple.

Then one day I was debugging several hundred lines of PL/SQL somebody had written, and struggling to figure out what was going on. So I drew a flow chart of the IF branches and WHILE loops. Obvious really, but if the original author had done that they would have realised that the program had an ELSE branch which could never be chosen; more than one hundred lines of code which would never execute.
Let me sleep()
Good design is hard to define: in fact, good design is often unobtrusive. It's bad design we notice, because it generates friction and hinders our progress. By way of illustration, here is a poor design choice in Oracle's PL/SQL library: DBMS_LOCK.SLEEP() .

SLEEP() is a simple program, which suspends processing for a parameterized number of seconds. This is not something we want to do often, but it is useful in testing. The problem is its home in the DBMS_LOCK package, because that package is not granted to public by default.

DBMS_LOCK is a utility package for building our own locking mechanisms. There's not much need for this any more. Oracle's default locking model is pretty good. There is SELECT .. FOR UPDATE for pessimistic locking, which is even more powerful since the SKIP LOCKED syntax was permitted in 11g. We have Advanced Queuing, Job Scheduling, oh my. It's hard to find a use case for user-defined locks which isn't re-inventing the wheel, and easy to see how we might end up implementing something less robust than the built-in locks. So DBAs tend not to grant execute on DBMS_LOCK without being asked, and then often not without a fight.
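As a quick illustration of the built-in pessimistic locking mentioned above, SELECT .. FOR UPDATE SKIP LOCKED lets concurrent sessions divide up rows without any user-defined locks. This is only a sketch; the table and column names are hypothetical:

```sql
-- Dequeue-style processing: each worker session locks the pending rows
-- it fetches, and skips rows already locked by other sessions (11g+).
DECLARE
    CURSOR c_jobs IS
        SELECT job_id, payload
        FROM   job_queue            -- hypothetical work table
        WHERE  status = 'PENDING'
        FOR UPDATE SKIP LOCKED;     -- don't block on rows other workers hold
BEGIN
    FOR r IN c_jobs LOOP
        -- ... process r.payload here ...
        UPDATE job_queue
        SET    status = 'DONE'
        WHERE  job_id = r.job_id;
    END LOOP;
    COMMIT;
END;
/
```

Two sessions running this block concurrently simply partition the pending rows between them, instead of one blocking on the other's locks.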

But as developers we need access to a sleep routine. So DBAs have to grant execute on DBMS_LOCK, and then that gives away too much access. It would be better if SLEEP() was easily accessible in some less controversial place.

Why is this an example of bad design? Because user-defined locks need a sleep routine, but SLEEP() has other uses besides lock implementations. Putting SLEEP() in DBMS_LOCK makes it harder to use.
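A common workaround, sketched here with hypothetical schema and role names, is to wrap just the sleep call in a procedure owned by a schema that already has EXECUTE on DBMS_LOCK, and grant the wrapper instead of the whole package:

```sql
-- The owning schema (here APP_ADMIN, a hypothetical name) must itself
-- have been granted EXECUTE ON DBMS_LOCK directly. Developers then get
-- EXECUTE on this wrapper only, not on the locking package.
CREATE OR REPLACE PROCEDURE app_admin.sleep (p_seconds IN NUMBER)
AS
BEGIN
    dbms_lock.sleep(p_seconds);
END sleep;
/

GRANT EXECUTE ON app_admin.sleep TO developer_role;
```

Because the wrapper runs with definer's rights, callers exercise only the sleep behaviour, not DBMS_LOCK's lock allocation routines.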
Riding the Hobby Horse
Occasionally in a recruitment interview I have asked the candidate how they would go about designing a PL/SQL program. Mostly the question is met with bemusement. PL/SQL design is not A Thing. Yet many of us work on huge PL/SQL code-bases. How do they turn out without a design methodology? Badly:
  • Do you have one schema crammed with hundreds of PL/SQL program units, perhaps named with a prefix to identify sub-systems?
  • Do you have a package called UTILS?
  • Do you query USER_PROCEDURES or USER_DEPENDENCIES (or even USER_SOURCE) to find a piece of code which implements some piece of functionality?
  • Do you have the same functionality implemented in several places?
  • Does a "simple change" cascade into changes across multiple program units and a regression testing nightmare?
All these are symptoms of poor design. But there are ways to avoid this situation.

Designing PL/SQL Programs series

Restricting database access to an IP address using sqlnet.ora

Learn DB Concepts with me... - Wed, 2016-03-16 16:15
**************************************************************
                      FROM MY CLIENT MACHINE:
**************************************************************

C:\Windows\System32>tnsping pdborcl1

TNS Ping Utility for 32-bit Windows: Version 11.2.0.1.0 - Production on 07-MAR-2016 15:32:20

Copyright (c) 1997, 2010, Oracle.  All rights reserved.

Used parameter files:
C:\app\oracle\product\11.2.0\client_1\network\admin\sqlnet.ora


Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 10.215.XX4.XX)(PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = pdborcl1)))
OK (0 msec)

**************************************************************
NOW I want my DB to be accessible only from this machine and to restrict all others. For this I will need to make changes to my DB server's sqlnet.ora :
**************************************************************

ON DB SERVER:

oracle@Linux03 admin]$ cat sqlnet.ora
# sqlnet.ora Network Configuration File: /u01/app/oracle/product/12.1.0.2/db_1/network/admin/sqlnet.ora
# Generated by Oracle configuration tools.

NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)

tcp.validnode_checking = yes  ## I have just added this line to test if it works

[oracle@Linux03 admin]$ vi sqlnet.ora
[oracle@Linux03 admin]$ lsnrctl reload

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:20:03

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
TNS-00584: Valid node checking configuration error
 TNS-12560: TNS:protocol adapter error

As expected it doesn't work, as the listener needs to know where to accept or deny connections from.
Now I tweak the sqlnet.ora file as below

[oracle@Linux03 admin]$ cat sqlnet.ora
# sqlnet.ora Network Configuration File: /u01/app/oracle/product/12.1.0.2/db_1/network/admin/sqlnet.ora
# Generated by Oracle configuration tools.

NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)


tcp.validnode_checking = yes
tcp.invited_nodes = (Linux03)  ## I only want to accept connections from my localhost i.e Linux03

RESTART THE LISTENER NOW.

[oracle@Linux03 admin]$ lsnrctl reload

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:21:06

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
The command completed successfully

 (OR)

[oracle@Linux03 admin]$ lsnrctl stop

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:28:36

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
The command completed successfully
[oracle@Linux03 admin]$ lsnrctl start

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:28:41

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Starting /u01/app/oracle/product/12.1.0.2/db_1/bin/tnslsnr: please wait...

TNSLSNR for Linux: Version 12.1.0.2.0 - Production
System parameter file is /u01/app/oracle/product/12.1.0.2/db_1/network/admin/listener.ora
Log messages written to /u01/app/oracle/diag/tnslsnr/Linux03/listener/alert/log.xml
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=Linux03)(PORT=1521)))
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
STATUS of the LISTENER
------------------------
Alias                     LISTENER
Version                   TNSLSNR for Linux: Version 12.1.0.2.0 - Production
Start Date                07-MAR-2016 15:28:41
Uptime                    0 days 0 hr. 0 min. 0 sec
Trace Level               off
Security                  ON: Local OS Authentication
SNMP                      OFF
Listener Parameter File   /u01/app/oracle/product/12.1.0.2/db_1/network/admin/listener.ora
Listener Log File         /u01/app/oracle/diag/tnslsnr/Linux03/listener/alert/log.xml
Listening Endpoints Summary...
  (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=Linux03)(PORT=1521)))
  (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
The listener supports no services
The command completed successfully

****************************************************
NOW I TRY TO CONNECT FROM CLIENT AGAIN :
****************************************************

C:\Windows\System32>tnsping pdborcl1

TNS Ping Utility for 32-bit Windows: Version 11.2.0.1.0 - Production on 07-MAR-2016 15:32:57

Copyright (c) 1997, 2010, Oracle.  All rights reserved.

Used parameter files:
C:\app\oracle\product\11.2.0\client_1\network\admin\sqlnet.ora


Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 10.215.XX4.XX)(PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = pdborcl1)))
TNS-12537: TNS:connection closed

C:\Windows\System32>


Nope, it doesn't allow me to connect.
*************************************************************
Let's try to remove the tcp.validnode_checking & tcp.invited_nodes and see
**************************************************************

[oracle@Linux03 admin]$ cat sqlnet.ora
# sqlnet.ora Network Configuration File: /u01/app/oracle/product/12.1.0.2/db_1/network/admin/sqlnet.ora
# Generated by Oracle configuration tools.

NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)


[oracle@Linux03 admin]$ lsnrctl reload

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:31:58

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
The command completed successfully
[oracle@Linux03 admin]$ vi sqlnet.ora
[oracle@Linux03 admin]$ lsnrctl reload

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 07-MAR-2016 15:32:16

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=Linux03)(PORT=1521)))
The command completed successfully

C:\Windows\System32>tnsping pdborcl1

TNS Ping Utility for 32-bit Windows: Version 11.2.0.1.0 - Production on 07-MAR-2016 15:32:36

Copyright (c) 1997, 2010, Oracle.  All rights reserved.

Used parameter files:
C:\app\oracle\product\11.2.0\client_1\network\admin\sqlnet.ora


Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 10.215.XX4.XX)(PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = pdborcl1)))
OK (0 msec)

**************************************************************************
SUCCESS! YOU HAVE SUCCESSFULLY IMPLEMENTED ANOTHER LAYER OF SECURITY ON THE DB SERVER
*************************************************************************


To invite multiple nodes, list them all:

tcp.validnode_checking = yes
tcp.invited_nodes = (Linux03,HOSTNAME1,HOSTNAME2 ...)

YOU CAN ALSO EXPLICITLY EXCLUDE NODE LIKE THIS :

tcp.excluded_nodes = (192.168.100.101,HOSTNAME2)

But: if you only specify invited nodes with tcp.invited_nodes, all other nodes will be excluded, so there is really no reason to do both. The same is true for excluded nodes. If you put tcp.excluded_nodes = (192.168.100.101,192.168.100.160), then the hosts 192.168.100.101 and 192.168.100.160 will be denied connections to the database while all others are allowed to connect.





Categories: DBA Blogs

Fishbowl Solutions at Collaborate 2016: Demos and Discussions on Oracle JET, ADF, Documents Cloud Service, Controlled Document Management and Portals

Fishbowl Solutions is looking forward to Collaborate 2016. We have another full list of activities planned, and we are always excited to meet with customers and discuss their initiatives around enterprise content management and portals, the cloud, as well as front-end user design and experience. With the release of Oracle WebCenter 12c back in October, customers are also eager to understand more of what the new version has to offer. Fortunately for WebCenter customers attending Collaborate, Fishbowl Solutions will be covering all these topics across the 5 presentations we will be giving, as well as one-on-one discussions in our booth – #1028.

We are also privileged to be joined by two WebCenter customers who will give presentations on their WebCenter use cases. The first customer, ICON plc (www.iconplc.com/) based in Dublin, Ireland, will discuss the process of improving the front-end experience of the WebCenter-based portal they use to manage the clinical trials process.

The second customer is Rosendin Electric (www.rosendin.com) based in San Jose, CA, and they will share how they implemented Fishbowl’s ControlCenter solution to automate the contract management process within WebCenter.

The best part and biggest benefit of attending Collaborate is hearing stories from actual customers, like ICON and Rosendin. Collaborate is truly a user group conference, and hearing case studies on WebCenter deployments, enhancements, integrations, etc., is invaluable for other customers looking to do the same or similar. Less marketing speak and sales pitches, and more learning. As you plan your schedule for Collaborate, look for Session Types denoted as Case Studies.

Here is a preview of what Fishbowl currently has planned for Collaborate 2016.

Booth: 1028

Demos of our WebCenter Portal Solution Accelerator, User Experience (art of the possible) demos of portal and websites, and ControlCenter (controlled document management for WebCenter).

Presentations: 5

  • Monday, April 11, 10:30-11:30 AM:
    A Designer’s Introduction to the Oracle JET Framework

    This session is targeted at front-end developers looking to leverage their front-end design skills to create Modular Single Page applications with Oracle’s new JavaScript Extension Toolkit (JET) framework. The presentation will include an overview of the JET framework and discussion around the features and benefits, along with a quick comparison with Application Development Framework (ADF). Sim and Beatty will walk through and explain JET’s integrated open source libraries and frameworks, and then take developers through the creation of a simple, JET-integrated cloud application.

  • Monday, April 11, 4:30-5:30 PM:
    Integrating Oracle JET With ADF to Create a Modern and Engaging User Experience

    In this session you will learn about the pros and cons of Oracle’s new JET framework and ADF and how you can combine them to create a modern development experience writing Modular Single Page Applications. Sim and Weaver will discuss how front-end designers can create modern, platform agnostic extendable interfaces with JET, and how developers can create ADF integrations and extendable services with the back-end to serve up small data snippets (JSON).

  • Tuesday, April 12, 10:45-11:45 AM:
    Developing Hybrid Solutions for the Oracle Documents Cloud Service (DoCS)

    This session will provide an overview of Oracle’s Documents Cloud Service (DoCS), including its interface, security model, and how to embed the DoCS UI and integrate with the REST API and Applink Resource to create seamless hybrid off- and on-premise applications. As part of the lecture, Sim will provide live examples and code walkthroughs, as well as talk about hybrid application development and the best times to use the Applink Resource vs the REST API with Oracle’s new Oracle JET framework for developing cloud apps. The presentation will conclude with an overview of an integration that Fishbowl has created to support Oracle DoCs.

  • Tuesday, April 12, 4:45-5:45 PM
    ICON Enhances Its WebCenter Portal Design by Keeping the User in Mind

    ICON Clinical Research Limited is a global provider of outsourced development services to the pharmaceutical, biotechnology, and medical device industries. They specialize in the development, management, and analysis of programs that support clinical development. ICON implemented Oracle WebCenter as the platform for its ICONIK portal, which will be used by the clinical trials team to manage, maintain, and share content created during the trials process. Come to this session to hear how ICON and Fishbowl Solutions leveraged next generation, best practice portal design concepts and technologies to provide a high-end and rich user experience to end users. Learn how ICON leverages WebCenter Portal and Content to surface personalized study documents, quickly manage content, and collaborate with other team members, whether on a desktop or on the go through a mobile device. We will also discuss how ICON has streamlined their business to solve problems that contributed to delays in the clinical trials process, impeding ICON’s customers from bringing products to market.

  • Wednesday, April 13, 9:15-10:15 AM:
    Rosendin Electric Pairs a Modern User Experience with WebCenter Content to Automate Contract Management

    Rosendin Electric is the top-ranked private electrical contractor in the nation whose work spans preconstruction, prefabrication, building information modeling, and renewable energy. Join us to hear Rosendin describe how they leveraged Oracle WebCenter and Fishbowl Solutions’ ControlCenter to automate and improve their contract management process. Rosendin’s new contract management system provides an intuitive, mobile-enabled interface and dashboard view for their contracts team that shows working, pending, and executed contracts. This dashboard is specific for each user, enabling them to quickly take action not only on contracts, but also on associated documents such as non-disclosure agreements and corporate governance documentation. Come see how WebCenter Content has streamlined Rosendin’s contract management process, making it much more efficient while ensuring the lifecycle of contracts and related documents can be easily tracked, viewed, and archived within one enterprise repository.

We hope to see you at Collaborate 2016!

The post Fishbowl Solutions at Collaborate 2016: Demos and Discussions on Oracle JET, ADF, Documents Cloud Service, Controlled Document Management and Portals appeared first on Fishbowl Solutions' C4 Blog.

Categories: Fusion Middleware, Other

Find missing or stale statistics

DBA Scripts and Articles - Wed, 2016-03-16 15:20

Purpose Statistics are essential for the optimizer to choose the best execution plan possible. Sometimes you encounter a sub-optimal plan and need to find out if the tables involved in the statement have up-to-date statistics. This query will help you find out which tables have been modified significantly since the last statistics gathering. For … Continue reading Find missing or stale statistics
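The article's own query is behind the link above, but a minimal sketch of the idea, relying only on Oracle's documented DBA_TAB_STATISTICS view, might look like this:

```sql
-- Flag tables whose optimizer statistics are missing or marked stale.
-- STALE_STATS becomes 'YES' once roughly 10% of a table's rows have
-- changed since the last gather (the default STALE_PERCENT);
-- LAST_ANALYZED IS NULL means the table has no statistics at all.
SELECT owner, table_name, last_analyzed, stale_stats
FROM   dba_tab_statistics
WHERE  (stale_stats = 'YES' OR last_analyzed IS NULL)
AND    owner NOT IN ('SYS', 'SYSTEM')
ORDER  BY owner, table_name;
```

Note that the monitoring counters feeding STALE_STATS are flushed periodically, so DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO can be run first for an up-to-the-minute answer.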

The post Find missing or stale statistics appeared first on Oracle DBA Scripts and Articles (Montreal).

Categories: DBA Blogs

OTN Developer Lab - CloudWorld DC - Register Today!

OTN TechBlog - Wed, 2016-03-16 14:59
OTN will be offering a Developer Lab at CloudWorld DC.

Topics to be covered in the Developer Lab sessions:
  • Provision a new database in the cloud with Oracle Database Cloud Service (DBCS)
  • Gain experience in building new applications for the Oracle Cloud
  • Set up connectivity between the Compute tier and DBCS
  • Learn about the REST APIs available to access the Oracle Cloud
In order to take part in the Developer Lab, please note the following requirements. Each participant must bring their own laptop with the following:
  • Intel Core i5 Laptop with 8GB Memory, 55GB Free Hard Disk Space
  • Windows 7 64-Bit
  • Microsoft Internet Explorer, Apple Safari or Mozilla Browser
There is no fee to participate in this Lab, but space is limited. Register today.

Firefox ESR 45 Certified with EBS 12.1 and 12.2

Steven Chan - Wed, 2016-03-16 13:57

Firefox ESR logo

Mozilla Firefox 45 Extended Support Release (ESR) is certified as a Windows-based client browser for Oracle E-Business Suite 12.1 and 12.2.

What is Mozilla Firefox ESR?

Mozilla offers an Extended Support Release based on an official release of Firefox for organizations that are unable to mass-deploy new consumer-oriented versions of Firefox every six weeks.  From the Mozilla ESR FAQ:

What does the Mozilla Firefox ESR life cycle look like?

Releases will be maintained for approximately one year, with point releases containing security updates coinciding with regular Firefox releases. The ESR will also have a two cycle (12 week) overlap between the time of a new release and the end-of-life of the previous release to permit testing and certification prior to deploying a new version.

Maintenance of each ESR, through point releases, is limited to high-risk/high-impact security vulnerabilities and in rare cases may also include off-schedule releases that address live security vulnerabilities. Backports of any functional enhancements and/or stability fixes are not in scope.

At the end of the support period for an ESR version:

  • the release will reach its end-of-life
  • no further updates will be offered for that version
  • an update to the next version will be offered through the application update service

E-Business Suite to be certified with Firefox Extended Support Releases

New personal versions of Firefox are released roughly every six weeks.  It is impractical for us to certify these new personal versions of Firefox with the Oracle E-Business Suite because a given Firefox release is generally obsolete by the time we complete the certification.

From Firefox 10 and onwards, Oracle E-Business Suite is certified only with selected Firefox Extended Support Release versions. Oracle has no current plans to certify new Firefox personal releases with the E-Business Suite.

EBS patching policy for Firefox compatibility issues

Mozilla stresses their goal of ensuring that Firefox personal versions will continue to offer the same level of application compatibility as Firefox Extended Support Releases. 

Oracle E-Business Suite Development will issue new E-Business Suite patches or workarounds for compatibility issues that can be reproduced with Firefox Extended Support Releases.  If you report compatibility issues with Firefox personal releases that cannot be reproduced with Firefox Extended Support Releases, your options are:

  1. Deploy a certified Firefox Extended Support Release version instead of the Firefox personal version
  2. Report the incompatibility between Firefox ESR and Firefox personal to Mozilla
  3. Use Internet Explorer (on Windows) or Safari (on Mac OS X) until Mozilla resolves the issue
EBS Compatibility with Firefox ESR security updates

Mozilla may release new updates to Firefox ESR versions to address high-risk/high-impact security issues.  These updates are considered to be certified with the E-Business Suite on the day that they're released.  You do not need to wait for a certification from Oracle before deploying these new Firefox ESR security updates.

Certified desktop operating systems
  • Windows 10 (32-bit and 64-bit)
  • Windows 8.1 (32-bit and 64-bit)
  • Windows 7 SP1 (32-bit and 64-bit)
  • Windows Vista SP2

References

Related Articles

Categories: APPS Blogs

March 24: Fike Corporation―Oracle Sales Cloud and CPQ Cloud Customer Forum

Linda Fishman Hoyle - Wed, 2016-03-16 12:57

Join us for another Oracle Customer Reference Forum on March 24, 2016. Fike Corporation's Jeannie Foster, Manager, Global Business Support Sales & Corporate Reporting, and Jeff Hunter, CRM Business Analyst, will discuss the evolution of the business from the inside out.

Early in the company's success, Fike deployed JD Edwards to manage its overall business. As it matured, it found a need to give its sales organization a better way to configure, price, and quote. Enter Oracle CPQ Cloud (previously BigMachines). This solution gave Fike the ability to easily configure products, price within the parameters of the company, and fulfill its incoming quotes passed to JD Edwards.

Recently, Fike deployed Oracle Sales Cloud to further collaborate with its global customers within the regions, allowing better insight into customer value and forecasting for better lead-time in its manufacturing processes.

Register now to attend the live Forum on Thursday, March 24, 2016, at 8:00 a.m. PT / 10:00 a.m. CT, and learn more about Fike’s experience with Oracle Sales Cloud and CPQ Cloud.

Log Buffer #465: A Carnival of the Vanities for DBAs

Pythian Group - Wed, 2016-03-16 12:17

This Log Buffer Edition covers some tricks, tips, workarounds, and tech-dives covered in various blog posts from Oracle, SQL Server and MySQL.

Oracle:

SQLcl can run Ruby script!!.. JRuby to be precise.

Financial Information Discovery Integration with Oracle Assets

Data Vault Modeling and Snowflake Elastic Data Warehouse

Couchbase Bucket Index Status in NetBeans IDE 8.1

Make SQL Developer Run Faster

SQL Server:

Find a String in a String

Microsoft cloud platform allows you to provide resiliency for your on-premises SQL Server deployments. Marcin Policht describes this functionality, available as part of the Azure Site Recovery solution.

Trust, or Verify – How FKs Can (or Cannot) Help Performance

10 New Features Worth Exploring in SQL Server 2016

Validating a Windows Cluster Prior to Installing SQL Server 2014

MySQL:

When you think of Cloud services for database administration, Database as a Service (DBaaS) is what tends to come to mind, but one of the ways that the Cloud has had an enormous impact on our work productivity is in the area of collaboration.

Introduction to MySQL Troubleshooting Webinar: Q & A for How to Create a Test Setup

MariaDB is a community-developed fork of MySQL and aims to be an enhanced, drop-in replacement for it.

The MySQL binary packages have been growing a lot over the last GA versions.

Indexing JSON documents via Virtual Columns

Categories: DBA Blogs

OFM Forum Session Slides - Oracle JET and WebSocket

Andrejus Baranovski - Wed, 2016-03-16 10:46
I would like to post slides from my session on Oracle Fusion Middleware Forum - Oracle JET and WebSocket. This session was done today.

Slides are available on slideshare:


Sample application used for the demo can be downloaded from here (it consists of three parts - WebSocket server implementation, ADF BC tester application to generate continuous DB updates, and JET application) - Oracle JET Live List with WebSocket.

ADF application with live data DVT component rendered in JET can be downloaded from here - When JET Comes To Rescue - Live Data Charts in ADF.

Breaking Down Barriers with DevOps

Pythian Group - Wed, 2016-03-16 09:53

In a recently recorded webinar,  How To Deliver Faster, Better Software Releases with DevOps, Pythian’s Chief Digital Officer, Aaron Lee and guest speaker Amy DeMartine, Senior Research Analyst at Forrester Research, discussed DevOps and the important role it plays in the race to deliver the best customer experience with faster and better software releases.

One of the webinar participants asked about the critical success factors when implementing a rapid release software model. 

The case for adopting agile operations is very compelling. Rapid release cycles can reduce costs, provide better and more stable solutions, increase customer and employee satisfaction, and most importantly, outpace competition and drive revenue.

Driving both efficiency and effectiveness of operations is easier said than done. DeMartine highlighted 7 habits of effective DevOps, but stressed that culture is the number one enemy.

She said, “this fear of change is a huge cultural gap. DevOps is at the point of cultural change, and this is one of the biggest ones that IT has to break”.

Historically, IT teams have held large and lengthy change boards, requiring a strong reliance on rigid approval processes. Combine that history with the complexity and degree of unknowns that operations faces, and it isn’t surprising that some teams are risk averse and consensus driven.

While new technologies, processes, and automation are critical to implementing rapid release models and enhancing DevOps, both DeMartine and Lee agreed that it boils down to the people, and the way Dev and Ops interact with each other.

“It’s clear what the responsibilities of a product manager are. It’s clear where the developer’s responsibilities start and stop,” said Lee. Often this leads to operations “picking up everything else” and facing an increased number of unknowns.

Part of the solution is understanding how to embrace and maximize the opportunity for change, as opposed to minimizing the occurrence of change.

Most importantly, Dev and Ops need to know and understand each other. By setting and balancing clear boundaries and expectations, each team can understand the role the other is playing, and begin to see the environment under a common goal.

For more on this topic, download the on-demand webinar below:


Categories: DBA Blogs

Oracle Midlands : Event #14 – Summary

Tim Hall - Wed, 2016-03-16 09:17

Last night was Oracle Midlands Event #14 with Lothar Flatz.

The session started with “Raiders of the Data Dictionary I: Indexing for the Workload” where Lothar discussed a project he worked on based around completely revamping the indexing of a system. I guess the best way I can describe it is to say it’s a more scientific approach to indexing, using the contents of the dictionary to provide information about columns and column groups used in queries to determine the indexes to create. In some ways the approach was quite extreme (throw everything away and start again), but I also admire the bravery of that approach!

Next up was “Raiders of the Data Dictionary II: The Curse of the Buffer Cache” where Lothar discussed the problems associated with trying to keep large, frequently used tables in the buffer cache. Once again, some interesting points made and some things that will definitely influence my approach in future.

It was great to meet Lothar in person for the first time. After being in the game for 20+ years many subjects can start to feel a little repetitive, so it’s nice when someone comes along with a different spin on a subject. I certainly found myself asking a lot of questions of him and myself, which surely must be what this knowledge spreading thing is all about!

Thanks to Lothar for taking the time to come and speak to us and to everyone that came along to listen. Thanks to Mike for doing a great job in keeping Oracle Midlands going and to Red Stack Tech for their continued support.

See you at the next one! :)

Cheers

Tim…

Oracle Midlands : Event #14 – Summary was first posted on March 16, 2016 at 3:17 pm.

RCU-6083 : Failed Drop/Remove Schema using RCU : Prerequisite OPSS failed

Online Apps DBA - Wed, 2016-03-16 03:22

Recently, one of our trainees came across an error while trying to drop an RCU schema: the prerequisite “Oracle Platform Security Services” check failed. In this post, we will cover how to find the cause of the error and how to resolve the issue, but before that one should know what RCU is. RCU: Repository Creation […]

The post RCU-6083 : Failed Drop/Remove Schema using RCU : Prerequisite OPSS failed appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

GDC16 Day 2: Highlights & Trends

Oracle AppsLab - Wed, 2016-03-16 02:50

Just like yesterday, the VR sessions were very popular. Even with the change to bigger rooms, lines for popular VR talks would start at least 20 minutes before the session started. The longest line I was in snaked up and down the hallway at least 4 times. The wait was well worth it though!

Today was packed. Many sessions overlapped one another. Wish I could have cloned 3 of myself!

Enterprise Manager: Using Metrics Extensions (SQL)

Anthony Shorten - Wed, 2016-03-16 00:41

One of the major features of Oracle Enterprise Manager (OEM) is the ability to create Metrics Extensions. These are metrics you want to track that may or may not be provided with the underlying products. I want to illustrate this point in a series of articles on using Oracle Enterprise Manager with Oracle Utilities products.

The first article is about how to use the basic metrics extension capability with a simple SQL statement. This is a little unusual, as the metric will be part of the database targets (not the Oracle Utilities targets), but I feel it will introduce specific techniques that we will reuse a lot in subsequent articles and serves as a really good starting point.

A couple of things before we start:

  • The Metrics Extension part of OEM basically is a facility for you to add all sorts of custom metrics for OEM to track. You will create the extension and then associate it with targets to track.
  • The Metrics Extension component allows for incremental development. You specify and test the metric first in the user interface. You can then mark it as deployed, which will create a version. You then deploy the metric extension to be tracked on targets. The version tracking is useful as you can have different versions of the metric deployed to different targets at different stages of development. I will touch on this only briefly. More information is in the Metrics Extension documentation for the version of OEM you are using.
  • The screen dumps and example in this article are based upon a tracking query outlined in the Batch Troubleshooting Guide which flattens the Batch Run Tree and summarizes it. It is not a base view but a custom view that is used for illustrative purposes only. Refer to Performance Troubleshooting Guideline Series (Doc Id: 560382.1) from My Oracle Support.
  • The example shown is for Oracle Enterprise Manager 13c but can apply to other versions of Oracle Enterprise Manager.
  • The example will use SQL and in future articles we will explore other adapters.
  • The example is just for illustrative purposes only.

To perform this task you need to be authorized to use the Metrics Extension facility and the targets you will associate with the metric. Check your installation's security setup to see if that is the case.

To setup the Metrics Extension, the following process can be used:

  • Navigate to the Metrics Extension facility. This can be done from the link page or menu (Monitoring --> Metric Extensions). For example:

Metrics Extension Menu

  •  From the Create menu, Select Metrics Extension. For example:

 Create Metric Extension

  • Specify the Metric Name, Target Type (Database Instance in this case), Display Name, Adapter (SQL in this case), Description and other attributes for the metric including default collection frequency. For example:

Metric Extension General Properties

  • You might notice the Select Advanced Properties which allows you to specify other attributes on the target to specialize the metric. This is new to OEM 13c and in this case will allow you to target multi-tenant databases (or not) for example.
  • Now as this is an SQL-based metric you need to specify the SQL statement to execute to gather the data. In this example, we are using the custom view from the Performance Troubleshooting Guideline Series (Doc Id: 560382.1) from My Oracle Support. Now, in my example, I hardcoded the owner of the view. This is just an illustration. You can get around this by making sure the credentials have access to the view or by creating a synonym. Remember the database user must have SELECT access as a minimum. The example of the SQL is shown below:

SQL Example
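For readers without access to the screenshot: a metric extension query simply returns one or more key columns plus the data columns the metric will track. A hypothetical sketch of that shape (the view and column names are illustrative only, not the Doc Id 560382.1 view):

```sql
-- Hypothetical metric-extension query. BATCH_CD is the key column;
-- the remaining columns are data values OEM will collect each cycle.
-- View and column names are made up for illustration.
SELECT batch_cd,                -- key: identifies the batch job
       max_elapsed_secs,        -- data: longest elapsed run time
       max_throughput           -- data: records processed per minute
FROM   my_batch_run_summary_vw;
```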

  • For each column in the query you need to define it as part of the metric. You do not have to define all of them, but it is recommended in order to get full reuse. For each column, define its attributes, including whether it is data or a key value. Key values are used for SLA tracking. Also you can define more metadata to allow OEM to determine how to process it. The columns for our example are shown below:

Example Column definitions

  • Now we extend the metric by adding a few deltas. Deltas are virtual columns that compare the last value with the current value. They are great for checking changes in values at the metric level. In our sample I will add two deltas: one for the Maximum Elapsed Time, to see if the job elapsed time is getting worse, and one for the Maximum Run Rate (Throughput), to track if the number of records processed per period is getting lower. To do this, select the field and create the delta on that field. For example:

Max Elapsed Time Delta

  • The delta column can also hold the Alert Threshold, which is the default SLA, including the messages that are available. For the Maximum Elapsed Time I want to detect if the value has increased (delta greater than 0), and you can even set specific limits. I set a Critical SLA at a delta above 10 (as an example). For example:

Delta Definition with SLA - Max Elapsed Time

  • Repeat for the Max Throughput, as that should be tracked to see if it goes down (fewer records processed per minute). For example:

Adding Delta on Throughput

  • Again setup the Max Throughput. For example:

Delta definition for throughput
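Conceptually, each delta column is just the current collection's value minus the previous collection's value for the same key, with the alert raised when that difference crosses the threshold. OEM computes this internally, but the idea can be sketched in SQL (table and column names are hypothetical):

```sql
-- Conceptual illustration only: what a delta column computes over
-- successive collections. METRIC_HISTORY and its columns are invented
-- for this sketch; OEM does this for you.
SELECT batch_cd,
       collection_ts,
       max_elapsed_secs,
       max_elapsed_secs
         - LAG(max_elapsed_secs)
             OVER (PARTITION BY batch_cd ORDER BY collection_ts)
         AS elapsed_delta          -- > 10 would trip our Critical SLA
FROM   metric_history;
```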

  •  Now the metric is complete with all the API fields. For example:

Complete Metric definition

  • The credentials for the metric need to be defined. When you create a metric you simply attach it to the metrics collection to use it. Again ensure that the credential is valid for the query. In my example I will use the standard database monitoring credential. For example:

Credentials

  • You can attach a database and run the test to verify the metric. This does not attach the metric to the target. It just tests it. For example:

 Testing the Metric

  • Review before saving the metric. At any time you can change the metric before you publish it. For example:

Review the Metric

Review the metric

  • Now the metric is still in editable mode so can be edited as much as necessary. This is indicated on the metric screen. For example:

Summary of Metric

  • To implement the metric you must save it as a Deployable Draft from the Actions Menu. For example:

Save As Deployable Draft

  • A version number is locked in and it is marked as deployable. For example:

Marked As Deployable

  • Now you need to identify the targets you want to deploy this metric to. You select the metric and use Deploy To Targets from the Actions menu. For example:

Deploy to Targets

  • In this example, we will select the databases that will use this metric. You should note that if you specified Additional Parameters on the Target Type selection, those will be applied to the search. In my example, a standalone database and the CDB version are available (PDBs are not listed). For example:

Selecting Targets

  • OEM will then copy the metric to the targets supplied as a background job. For example:

Scheduled Deployment to Targets

  • You can set target-specific threshold values for individual targets using the Metrics and Collection Settings on the individual target. For example:

Setting Target specific values

  • Scroll down to see the metric and set the appropriate values. If not set, they will be defaulted from the metric itself. For example:

Example Metric in the Target

 

This is the conclusion of this article. Obviously I cannot cover everything you need to know in one article, but hopefully you can see how easy it is to add custom metric extensions. In other articles I will add more detail and cover other types of metrics.

Presenting in Collaborate 2016

Vikram Das - Tue, 2016-03-15 16:00
I am presenting in OAUG Collaborate 2016 about our experiences with running E-Business Suite on Engineered Systems - Exadata, Exalogic, ZFS and PCA.  Session Details:

https://collaborate.zerista.com/event/member/223628

Please attend, if you are coming to OAUG Collaborate.
Categories: APPS Blogs

OTN Interview about Application Development with Oracle

Shay Shmeltzer - Tue, 2016-03-15 14:34

A few weeks ago, I sat down with Bob from OTN for an interview that covered some of the key products our group works on.

I covered the various frameworks (ADF, JET, MAF), what we are doing with cloud based development (DevCS) and our tools for citizen developers (ABCS).

In case you are interested in any of these acronyms, here is the video:

Note that things move really fast at Oracle, and since this interview we already released a new version of Oracle JET and also made it open source, we released an update to Developer Cloud Service, and Application Builder Cloud Service has gone production.

Categories: Development

Pages

Subscribe to Oracle FAQ aggregator