Darwin IT

Darwin-IT professionals do ICT-projects based on a broad range of Oracle products and technologies. We write about our experiences and share our thoughts and tips.

How to clear MDS Cache

9 hours 8 min ago
In an answer to a question on community.oracle.com, I found the following great tip: http://www.soatutor.com/2014/09/clear-mds-cache.html.

In 12cR2 this looks like:
1. Start System MBean Browser; In Domain, pull down the Weblogic Domain menu and choose 'System MBean Browser':


2. Browse for the Application Defined MBeans:



3. Expand it and navigate to oracle.mds.lcm, choose server (AdminServer or SOAServer1)
 4. Navigate to the node Application em (AdminServer) or soa-infra (SOAServer) -> MDSAppRuntime -> MDSAppRuntime. Click on the tab Operations, and then on clearCache


5. Click on Invoke:


Then the following confirmation is shown:


By the way, often a redeploy of the SOA Composite that calls the WSDL or other artefact helps. But unfortunately not always.





Automatic install of SOA Suite and Service Bus 12cR2.

Wed, 2016-05-25 04:41
Lately I worked on a set of scripts to automatically install Weblogic Infrastructure, SOA/BPM Suite, Service Bus, etc. Since I implemented a reworked set yesterday at another customer it might be nice to describe them here.

The scripts help in installing the software and creating the repository. I started to create a script for creating the domain, but haven't got it working yet. A good starting point would be this blog of Edwin Biemond for the 12cR1 (12.1.3) version. If I manage to get it working for 12c in relation to my other scripts, I will get back to it. Probably a nice reference would also be this description of Lucas Jellema (also 12.1.3).

To create the scripts I followed the Enterprise Deployment guide for SOASuite 12c, Install tasks documentation. To administer your different environments (dev, test, acc, prod) of the Fusion Middleware the Enterprise Deployment Workbook might come in handy. And then there is the Installing and Configuring Oracle SOA Suite and Oracle Business Process Management.

The scripts are based on my earlier work on the automatic install of the quickstarts under Linux.

By the way: for these scripts I use shell (bash) under Linux. But since the response files contain references that you'd probably want to have based on properties (I would), I should rework those using something like awk/sed (which I don't know) or ANT (which I do know, but that needs an ANT installation). But maybe in a next phase.

For this installation we need the following downloads, from edelivery:
Product                            Jar File                            Zip file        Note
Fusion Middleware Infrastructure   fmw_12.2.1.0.0_infrastructure.jar   V78156-01.zip   OracleFMW12cInfrastructure
SOA & BPM Suite                    fmw_12.2.1.0.0_soa.jar              V78169-01.zip   SOASuiteAndBPM
Service Bus                        fmw_12.2.1.0.0_osb.jar              V78173-01.zip   ServiceBus
Managed File Transfer              fmw_12.2.1.0.0_mft.jar              V78174-01.zip   ManagedFileTransfer
The scripts and software are placed in a folder structure containing the following sub-folders:
Java
  • jdk 8u74+ rpm: jdk-8u74-linux-x64.rpm
ManagedFileTransfer
  • V78174-01.zip
  • fmw_12.2.1.0.0_mft.rsp
OracleFMW12cInfrastructure
  • V78156-01.zip
  • fmw_12.2.1.0.0_infrastructure.rsp
rcu
  • rcuSOAPasswords.txt
  • rcuSOA.rsp
scripts
  • fmw12c_env.sh
  • install.sh
  • installFMW.sh
  • installJava.sh
  • installMFT.sh
  • installSB.sh
  • installSOA.sh
  • rcuSOA.sh
ServiceBus
  • V78173-01.zip
  • fmw_12.2.1.0.0_osb.rsp
SOASuiteAndBPM
  • V78169-01.zip
  • fmw_12.2.1.0.0_soa.rsp

The scripts and response (.rsp) files I'll explain below. In each product subfolder there is the downloaded zip file (containing the installation jar file) and the accompanying response file. In the scripts folder there are the product installation scripts and the master script install.sh. So create a folder structure as above and place the downloaded products and the provided scripts in the appropriate folders.
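If you want to script that step as well, a minimal sketch (the base directory /u01/app/oracle/stage is just an assumption, adjust it to your own staging location):
#!/bin/bash
# create the staging folder structure described above
STAGE=/u01/app/oracle/stage
mkdir -p $STAGE/{Java,ManagedFileTransfer,OracleFMW12cInfrastructure,rcu,scripts,ServiceBus,SOASuiteAndBPM}
# afterwards, copy the downloaded zips, the response files and the scripts into the matching sub-folders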

So here we go.

Setting the environment
First I need a fmw12c_env.sh script to set some basic environment variables and especially the location of the FMW_HOME, where the software is going to be installed:
#!/bin/bash
echo set Fusion MiddleWare 12cR2 environment
export JAVA_HOME=/usr/java/jdk1.8.0_74
export FMW_HOME=/u01/app/oracle/FMW12210
export SOA_HOME=$FMW_HOME/soa
export OSB_HOME=$FMW_HOME/osb
export MFT_HOME=$FMW_HOME/mft


Adapt the location of the FMW_HOME and possibly the (desired or current) location of your JAVA_HOME. The other 'homes' are relative to the FMW_HOME: these are the locations within the FMW_HOME where the products are installed (in 11g these were Oracle_SOA1 or Oracle_OSB1).
Install Java
For 12cR2 we need a Java 8 installation. Of course preferably the latest version, but at least above Update 65. I used update 74, but you can change it to a later update. The script for the installation is as follows:
#!/bin/bash
. $PWD/fmw12c_env.sh
export JAVA_INSTALL_HOME=$PWD/../Java
export JAVA_INSTALL_RPM=jdk-8u74-linux-x64.rpm
#
echo JAVA_HOME=$JAVA_HOME
if [ ! -d "$JAVA_HOME" ]; then
  # Install jdk
  echo Install jdk 1.8
  sudo rpm -ihv $JAVA_INSTALL_HOME/$JAVA_INSTALL_RPM
else
  echo jdk 1.8 already installed
fi
Save it as installJava.sh under scripts.

Update the JAVA_INSTALL_RPM according to the downloaded rpm as placed in the Java subfolder. Again, adapt the JAVA_HOME in the fmw12c_env.sh accordingly. What this script does is check if the folder denoted by JAVA_HOME exists. If not, then apparently that JDK version is not installed, so the script installs it.
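Once the script has run (mind the sudo grants described in the next section first), a quick sanity check is to ask the installed JDK for its version:
. $PWD/fmw12c_env.sh
$JAVA_HOME/bin/java -version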

Sudo grants for the oracle user
To be able to run the script above (since it uses rpm via sudo) we need to adapt the sudoers file.

Log on as root via the command:
[oracle@darlin-vce-db ~]$ su -

Password:

Last login: Fri Feb 26 06:44:05 EST 2016 on pts/0

Edit the sudoers file:
[root@darlin-vce-db ~]# vi /etc/sudoers

Uncomment the lines for the Cmnd_Alias-es SOFTWARE and SERVICES (remove the hash '#' at the beginning of the line):
## Installation and management of software

Cmnd_Alias SOFTWARE = /bin/rpm, /usr/bin/up2date, /usr/bin/yum

## Services

Cmnd_Alias SERVICES = /sbin/service, /sbin/chkconfig

And add the following two lines at the end of the file:
## Extra rights for oracle to do for instance rpm without password.

oracle ALL= NOPASSWD: SERVICES, SOFTWARE

Save the file (use an exclamation mark in the ':wq!' command, since sudoers is read-only). After this you can run installJava.sh.
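To verify the grant, you can ask sudo to run rpm non-interactively as the oracle user; with the NOPASSWD rule in place this should print the rpm version without prompting for a password:
sudo -n /bin/rpm --version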
Install Infrastructure
First we need to install the Fusion Middleware Infrastructure. This is a Weblogic Server delivery that includes an RCU for the infrastructure schemas in the database. You can't use the 'vanilla' delivery of Weblogic Server, you'll need this one.

The install script is as follows:
#!/bin/bash
. $PWD/fmw12c_env.sh
#
export FMW_INSTALL_HOME=$PWD/../OracleFMW12cInfrastructure
export FMW_INSTALL_JAR=fmw_12.2.1.0.0_infrastructure.jar
export FMW_INSTALL_RSP=fmw_12.2.1.0.0_infrastructure.rsp
export FMW_INSTALL_ZIP=V78156-01.zip
#
# Fusion Middleware Infrastructure
if [ ! -d "$FMW_HOME" ]; then
  # Unzip FMW
  if [ ! -f "$FMW_INSTALL_HOME/$FMW_INSTALL_JAR" ]; then
    if [ -f "$FMW_INSTALL_HOME/$FMW_INSTALL_ZIP" ]; then
      echo Unzip $FMW_INSTALL_HOME/$FMW_INSTALL_ZIP to $FMW_INSTALL_HOME/$FMW_INSTALL_JAR
      unzip $FMW_INSTALL_HOME/$FMW_INSTALL_ZIP -d $FMW_INSTALL_HOME
    else
      echo $FMW_INSTALL_HOME/$FMW_INSTALL_ZIP does not exist
    fi
  else
    echo $FMW_INSTALL_JAR already unzipped.
  fi
  if [ -f "$FMW_INSTALL_HOME/$FMW_INSTALL_JAR" ]; then
    echo Install Fusion Middleware Infrastructure 12cR2
    $JAVA_HOME/bin/java -jar $FMW_INSTALL_HOME/$FMW_INSTALL_JAR -silent -responseFile $FMW_INSTALL_HOME/$FMW_INSTALL_RSP
  else
    echo $FMW_INSTALL_JAR not available!
  fi
else
  echo $FMW_HOME available: Fusion Middleware 12c Infrastructure already installed.
fi

Save it as installFMW.sh under scripts. As in installJava.sh, this script checks if the FMW_HOME already exists. If not, it checks on the availability of the installer jar. If that is not there either, it checks for the zip file that should contain the installer jar and, if found, unzips it. If the zip file does not exist it stops with a message. You can unzip the zip file prior to starting the scripts, because the jar is the primary requirement; you can keep the jar file for subsequent installations on other servers. It would be handy to put this on a shared staging folder.

If in the end the jar file exists, it starts the installer with java from the JAVA_HOME and performs a silent install using a response file. This is a file that is recorded at the end of a manual installation session and contains the choices made in the Oracle Universal Installer wizard. It is placed together with the zip file in the product folder.
It looks as follows:
[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#Set this to true if you wish to skip software updates
DECLINE_AUTO_UPDATES=true

#
MOS_USERNAME=

#
MOS_PASSWORD=<SECURE VALUE>

#If the Software updates are already downloaded and available on your local system, then specify the path to the directory where these patches are available and set SPECIFY_DOWNLOAD_LOCATION to true
AUTO_UPDATES_LOCATION=

#
SOFTWARE_UPDATES_PROXY_SERVER=

#
SOFTWARE_UPDATES_PROXY_PORT=

#
SOFTWARE_UPDATES_PROXY_USER=

#
SOFTWARE_UPDATES_PROXY_PASSWORD=<SECURE VALUE>

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=/u01/app/oracle/FMW12210

#Set this variable value to the Installation Type selected. e.g. Fusion Middleware Infrastructure, Fusion Middleware Infrastructure With Examples.
INSTALL_TYPE=Fusion Middleware Infrastructure

#Provide the My Oracle Support Username. If you wish to ignore Oracle Configuration Manager configuration provide empty string for user name.
MYORACLESUPPORT_USERNAME=

#Provide the My Oracle Support Password
MYORACLESUPPORT_PASSWORD=<SECURE VALUE>

#Set this to true if you wish to decline the security updates. Setting this to true and providing empty string for My Oracle Support username will ignore the Oracle Configuration Manager configuration
DECLINE_SECURITY_UPDATES=true

#Set this to true if My Oracle Support Password is specified
SECURITY_UPDATES_VIA_MYORACLESUPPORT=false

#Provide the Proxy Host
PROXY_HOST=

#Provide the Proxy Port
PROXY_PORT=

#Provide the Proxy Username
PROXY_USER=

#Provide the Proxy Password
PROXY_PWD=<SECURE VALUE>

#Type String (URL format) Indicates the OCM Repeater URL which should be of the format [scheme[Http/Https]]://[repeater host]:[repeater port]
COLLECTOR_SUPPORTHUB_URL=



Save it as fmw_12.2.1.0.0_infrastructure.rsp under OracleFMW12cInfrastructure.

If you choose to use another FMW_HOME than suggested, you'll need to change the ORACLE_HOME variable in the file accordingly. This is one of the elements that I want to have replaced automatically using a property, based on the FMW_HOME env-variable.
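As a minimal sketch of that idea, sed can substitute the ORACLE_HOME line in the response file based on the FMW_HOME environment variable; note that -i edits the file in place, so take a copy first if you want to keep the original:
#!/bin/bash
. $PWD/fmw12c_env.sh
# replace the ORACLE_HOME line in the infrastructure response file with the value of FMW_HOME
sed -i "s|^ORACLE_HOME=.*|ORACLE_HOME=$FMW_HOME|" $PWD/../OracleFMW12cInfrastructure/fmw_12.2.1.0.0_infrastructure.rsp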

Install SOA and BPM Suite
The script for installation of the SOA and BPM software is more or less the same as the FMW Infrastructure one:

#!/bin/bash
. $PWD/fmw12c_env.sh
#
export SOA_INSTALL_HOME=$PWD/../SOASuiteAndBPM
export SOA_INSTALL_JAR=fmw_12.2.1.0.0_soa.jar
export SOA_INSTALL_RSP=fmw_12.2.1.0.0_soa.rsp
export SOA_INSTALL_ZIP=V78169-01.zip
#
# SOA and BPM Suite 12c
if [[ -d "$FMW_HOME" && ! -d "$SOA_HOME" ]]; then
  #
  # Unzip SOA&BPM
  if [ ! -f "$SOA_INSTALL_HOME/$SOA_INSTALL_JAR" ]; then
    if [ -f "$SOA_INSTALL_HOME/$SOA_INSTALL_ZIP" ]; then
      echo Unzip $SOA_INSTALL_HOME/$SOA_INSTALL_ZIP to $SOA_INSTALL_HOME/$SOA_INSTALL_JAR
      unzip $SOA_INSTALL_HOME/$SOA_INSTALL_ZIP -d $SOA_INSTALL_HOME
    else
      echo $SOA_INSTALL_HOME/$SOA_INSTALL_ZIP does not exist!
    fi
  else
    echo $SOA_INSTALL_JAR already unzipped
  fi
  if [ -f "$SOA_INSTALL_HOME/$SOA_INSTALL_JAR" ]; then
    echo Install SOA and BPM Suite 12cR2
    $JAVA_HOME/bin/java -jar $SOA_INSTALL_HOME/$SOA_INSTALL_JAR -silent -responseFile $SOA_INSTALL_HOME/$SOA_INSTALL_RSP
  else
    echo $SOA_INSTALL_JAR not available!
  fi
else
  if [ ! -d "$FMW_HOME" ]; then
    echo $FMW_HOME not available: first install Fusion Middleware Infrastructure
  fi
  if [ -d "$SOA_HOME" ]; then
    echo $SOA_HOME available: SOA already installed
  fi
fi

Save it as installSOA.sh under scripts.
This installs the software for both SOA and BPM. The choice to include BPM or not is made at creation of the domain. Or adapt the INSTALL_TYPE element in the response file below: this one uses BPM, but if you adapt it to SOA (I haven't got the actual value at hand, but assume it would be SOA) I assume the BPM software is omitted.

As in the FMW infrastructure installation we need a response file:
[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#Set this to true if you wish to skip software updates
DECLINE_AUTO_UPDATES=true

#
MOS_USERNAME=

#
MOS_PASSWORD=<SECURE VALUE>

#If the Software updates are already downloaded and available on your local system, then specify the path to the directory where these patches are available and set SPECIFY_DOWNLOAD_LOCATION to true
AUTO_UPDATES_LOCATION=

#
SOFTWARE_UPDATES_PROXY_SERVER=

#
SOFTWARE_UPDATES_PROXY_PORT=

#
SOFTWARE_UPDATES_PROXY_USER=

#
SOFTWARE_UPDATES_PROXY_PASSWORD=<SECURE VALUE>

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=/u01/app/oracle/FMW12210

#Set this variable value to the Installation Type selected. e.g. SOA Suite, BPM.
INSTALL_TYPE=BPM



Save it as fmw_12.2.1.0.0_soa.rsp under SOASuiteAndBPM.
This one is a little smaller than the FMW-infra one. And again here the ORACLE_HOME should be adapted in case you choose to use another FMW_HOME location.
Install Service Bus
The script for installation of the Service Bus software is more or less the same as the SOA and BPM one:

#!/bin/bash
. $PWD/fmw12c_env.sh
#
export OSB_INSTALL_HOME=$PWD/../ServiceBus
export OSB_INSTALL_JAR=fmw_12.2.1.0.0_osb.jar
export OSB_INSTALL_RSP=fmw_12.2.1.0.0_osb.rsp
export OSB_INSTALL_ZIP=V78173-01.zip
#
# ServiceBus 12c
if [[ -d "$FMW_HOME" && ! -d "$OSB_HOME/bin" ]]; then
  #
  # Unzip ServiceBus
  if [ ! -f "$OSB_INSTALL_HOME/$OSB_INSTALL_JAR" ]; then
    if [ -f "$OSB_INSTALL_HOME/$OSB_INSTALL_ZIP" ]; then
      echo Unzip $OSB_INSTALL_HOME/$OSB_INSTALL_ZIP to $OSB_INSTALL_HOME/$OSB_INSTALL_JAR
      unzip $OSB_INSTALL_HOME/$OSB_INSTALL_ZIP -d $OSB_INSTALL_HOME
    else
      echo $OSB_INSTALL_HOME/$OSB_INSTALL_ZIP does not exist!
    fi
  else
    echo $OSB_INSTALL_JAR already unzipped
  fi
  if [ -f "$OSB_INSTALL_HOME/$OSB_INSTALL_JAR" ]; then
    echo Install ServiceBus 12cR2
    $JAVA_HOME/bin/java -jar $OSB_INSTALL_HOME/$OSB_INSTALL_JAR -silent -responseFile $OSB_INSTALL_HOME/$OSB_INSTALL_RSP
  else
    echo $OSB_INSTALL_JAR not available!
  fi
else
  if [ ! -d "$FMW_HOME" ]; then
    echo $FMW_HOME not available: first install Fusion Middleware Infrastructure
  fi
  if [ -d "$OSB_HOME" ]; then
    echo $OSB_HOME available: ServiceBus already installed
  fi
fi

Save it as installSB.sh under scripts.
This installs the Service Bus software.
As in the FMW infrastructure installation we need a response file:
[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#Set this to true if you wish to skip software updates
DECLINE_AUTO_UPDATES=true

#
MOS_USERNAME=

#
MOS_PASSWORD=<SECURE VALUE>

#If the Software updates are already downloaded and available on your local system, then specify the path to the directory where these patches are available and set SPECIFY_DOWNLOAD_LOCATION to true
AUTO_UPDATES_LOCATION=

#
SOFTWARE_UPDATES_PROXY_SERVER=

#
SOFTWARE_UPDATES_PROXY_PORT=

#
SOFTWARE_UPDATES_PROXY_USER=

#
SOFTWARE_UPDATES_PROXY_PASSWORD=<SECURE VALUE>

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=/u01/app/oracle/FMW12210

#Set this variable value to the Installation Type selected. e.g. Service Bus.
INSTALL_TYPE=Service Bus



Save it as fmw_12.2.1.0.0_osb.rsp under ServiceBus.
This one is a little smaller than the FMW-infra one. And again here the ORACLE_HOME should be adapted in case you choose to use another FMW_HOME location.
Install Managed File Transfer
The script for installation of the Managed File Transfer software is again more or less the same as the SOA and BPM one:
#!/bin/bash
. $PWD/fmw12c_env.sh
#
export MFT_INSTALL_HOME=$PWD/../ManagedFileTransfer
export MFT_INSTALL_JAR=fmw_12.2.1.0.0_mft.jar
export MFT_INSTALL_RSP=fmw_12.2.1.0.0_mft.rsp
export MFT_INSTALL_ZIP=V78174-01.zip
#
# MFT 12c
if [[ -d "$FMW_HOME" && ! -d "$MFT_HOME/bin" ]]; then
  #
  # Unzip MFT
  if [ ! -f "$MFT_INSTALL_HOME/$MFT_INSTALL_JAR" ]; then
    if [ -f "$MFT_INSTALL_HOME/$MFT_INSTALL_ZIP" ]; then
      echo Unzip $MFT_INSTALL_HOME/$MFT_INSTALL_ZIP to $MFT_INSTALL_HOME/$MFT_INSTALL_JAR
      unzip $MFT_INSTALL_HOME/$MFT_INSTALL_ZIP -d $MFT_INSTALL_HOME
    else
      echo $MFT_INSTALL_HOME/$MFT_INSTALL_ZIP does not exist!
    fi
  else
    echo $MFT_INSTALL_JAR already unzipped
  fi
  if [ -f "$MFT_INSTALL_HOME/$MFT_INSTALL_JAR" ]; then
    echo Install MFT 12cR2
    $JAVA_HOME/bin/java -jar $MFT_INSTALL_HOME/$MFT_INSTALL_JAR -silent -responseFile $MFT_INSTALL_HOME/$MFT_INSTALL_RSP
  else
    echo $MFT_INSTALL_JAR not available!
  fi
else
  if [ ! -d "$FMW_HOME" ]; then
    echo $FMW_HOME not available: first install Fusion Middleware Infrastructure
  fi
  if [ -d "$MFT_HOME" ]; then
    echo $MFT_HOME available: MFT already installed
  fi
fi

Save it as installMFT.sh under scripts.
Again we need a response file:
[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#Set this to true if you wish to skip software updates
DECLINE_AUTO_UPDATES=true

#
MOS_USERNAME=

#
MOS_PASSWORD=<SECURE VALUE>

#If the Software updates are already downloaded and available on your local system, then specify the path to the directory where these patches are available and set SPECIFY_DOWNLOAD_LOCATION to true
AUTO_UPDATES_LOCATION=

#
SOFTWARE_UPDATES_PROXY_SERVER=

#
SOFTWARE_UPDATES_PROXY_PORT=

#
SOFTWARE_UPDATES_PROXY_USER=

#
SOFTWARE_UPDATES_PROXY_PASSWORD=<SECURE VALUE>

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=/u01/app/oracle/FMW12210



Save it as fmw_12.2.1.0.0_mft.rsp under ManagedFileTransfer.
And again here the ORACLE_HOME should be adapted in case you choose to use another FMW_HOME location.
Install the lot
You could run the scripts above one-by-one. Or have them called using a master script:
#!/bin/bash
echo _______________________________________________________________________________
echo Java SDK 8
./installJava.sh
echo
echo _______________________________________________________________________________
echo Fusion Middleware Infrastructure
./installFMW.sh
echo
echo _______________________________________________________________________________
echo "SOA & BPM Suite"
./installSOA.sh
echo
echo _______________________________________________________________________________
echo ServiceBus
./installSB.sh
echo
echo _______________________________________________________________________________
echo Managed File Transfer
./installMFT.sh

Save it as install.sh under scripts.
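You can then install everything in one go; capturing the output in a log file is just a suggestion, but handy when something fails halfway:
cd scripts
./install.sh 2>&1 | tee install.log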
Repository Creation
When the software is installed, it's time to create the repository. This requires:
  • a database, for instance an 11g XE, 11gR2 latest or 12c
  • Sys password
Where in 11g you had a separate repository creation utility in a giant (as big as a soasuite installation) installer, in 12c the RCU comes in parts per product. However, in the end it is one utility that 'grows' with each added product.

The commandline interface of the RCU and its options are described here. It turns out (but is not described there) that the RCU also supports a response file.

The rcu install script is as follows:
#!/bin/bash
. $PWD/fmw12c_env.sh
echo Run rcu for SOA Infrastructure
export RCU_INSTALL_HOME=$PWD/../rcu
export RCU_SOA_RSP=rcuSOA.rsp
export RCU_SOA_PWD=rcuSOAPasswords.txt
#export RCU_SOA_PWD=rcuSOAPasswords-same.txt
$FMW_HOME/oracle_common/bin/rcu -silent -responseFile $RCU_INSTALL_HOME/$RCU_SOA_RSP -f < $RCU_INSTALL_HOME/$RCU_SOA_PWD

Save it as rcuSOA.sh under scripts.

This script uses both a response file and a password file.
The response file is as follows:
#RCU Operation - createRepository, generateScript, dataLoad, dropRepository, consolidate, generateConsolidateScript, consolidateSyn, dropConsolidatedSchema, reconsolidate
operation=createRepository

#Enter the database connection details in the supported format. Database Connect String. This can be specified in the following format - For Oracle Database: host:port:SID OR host:port/service , For SQLServer, IBM DB2, MySQL and JavaDB Database: Server name/host:port:databaseName. For RAC database, specify VIP name or one of the Node name as Host name.For SCAN enabled RAC database, specify SCAN host as Host name.
connectString=darlin-vce-db:1521:PDBORCL

#Database Type - [ORACLE|SQLSERVER|IBMDB2|EBR|MYSQL] - default is ORACLE
databaseType=ORACLE

#Database User
dbUser=sys

#Database Role - sysdba or Normal
dbRole=SYSDBA

#This is applicable only for database type - EBR
#edition=

#Prefix to be used for the schema. This is optional for non-prefixable components.
schemaPrefix=DEV

#List of components separated by comma. Remove the components which are not needed.
componentList=UCSUMS,MDS,WLS,STB,OPSS,IAU,IAU_APPEND,IAU_VIEWER,SOAINFRA,ESS,MFT

#Specify whether dependent components of the given componentList have to be selected. true | false - default is false
#selectDependentsForComponents=false

#If below property is set to true, then all the schemas specified will be set to the same password.
useSamePasswordForAllSchemaUsers=false

#This allows user to skip cleanup on failure. yes | no. Default is no.
#skipCleanupOnFailure=no

#Yes | No - default is Yes. This is applicable only for database type - SQLSERVER.
#unicodeSupport=no

#Location of ComponentInfo xml file - optional.
#compInfoXMLLocation=

#Location of Storage xml file - optional
#storageXMLLocation=

#Tablespace name for the component. Tablespace should already exist if this option is used.
#tablespace=

#Temp tablespace name for the component. Temp Tablespace should already exist if this option is used.
#tempTablespace=

#Absolute path of Wallet directory. If wallet is not provided, passwords will be prompted.
#walletDir=

#true | false - default is false. RCU will create encrypted tablespace if TDE is enabled in the database.
#encryptTablespace=false

#true | false - default is false. RCU will create datafiles using Oracle-Managed Files (OMF) naming format if value set to true.
#honorOMF=false

#Variable required for component SOAINFRA. Database Profile (SMALL/MED/LARGE)
SOA_PROFILE_TYPE=SMALL

#Variable required for component SOAINFRA. Healthcare Integration(YES/NO)
HEALTHCARE_INTEGRATION=NO


Regarding the elements you want to fill using properties, this one is the largest. The most important ones are:
  • connectString=darlin-vce-db:1521:PDBORCL
  • databaseType=ORACLE
  • dbUser=sys
  • dbRole=SYSDBA
  • schemaPrefix=DEV
  • componentList=UCSUMS,MDS,WLS,STB,OPSS,IAU,IAU_APPEND,IAU_VIEWER,SOAINFRA,ESS,MFT
  • useSamePasswordForAllSchemaUsers=false

I think properties like connectString, databaseType, dbUser and dbRole speak more or less for themselves. The property 'schemaPrefix' needs to be adapted according to the target environment. This can be something like DEV, TST, ACC or PRD. Or SOAO, SOAT, SOAA, SOAP (the last one is funny...)

Then the component list. For SOA and MFT there are several required components. These can be found here in the 12.1.3 docs. For 12.2.1 the list of component IDs can be found here. Unfortunately there you can't find the requirements in as much detail as in 12.1.3.

Then there is a password file. If you set useSamePasswordForAllSchemaUsers to true, you only need two passwords: the sys password and the generic schema password. If, as in this example, the value is false you need to specify a password for each schema. The password file I use looks like:

welcome1
DEV_UMS
DEV_MDS
DEV_WLS
DEV_WLS_RUNTIME
DEV_STB
DEV_OPSS
DEV_IAU
DEV_IAU_APPEND
DEV_IAU_VIEWER
DEV_SOAINFRA
DEV_ESS
DEV_MFT

The first password in the list is the sys password. Then the passwords are listed in the order of the components. A few remarks:
  • The component UCSUMS (User Messaging Services) results in a schema DEV_UMS (provided that the schemaPrefix = DEV).
  • I use passwords here that equal the schema names. You probably would not do that in acceptance and/or production, but maybe you do in dev and test. However, in the example it is handy to know which password goes where.
  • The component WLS needs two passwords, since it results in two schemas: DEV_WLS and DEV_WLS_RUNTIME. It is not documented (I could not find it) and it took me considerable time, since after DEV_WLS the passwords did not match and the RCU complained about a missing password. Looking in a manually created repository I found that it also created the DEV_WLS_RUNTIME schema.
  • For Managed File Transfer (MFT) also Enterprise Scheduler Service (ESS) is needed. As well as the prerequisites for SOAINFRA.
  • SOAINFRA is needed for both SOA&BPM and Service Bus. So even if you only install Service Bus, you need to install SOAINFRA.
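To avoid typing this file by hand you could also generate it; a minimal sketch (for dev/test only, since the passwords simply equal the schema names):
#!/bin/bash
# generate rcuSOAPasswords.txt: first the sys password, then one password per component schema, in component order
PREFIX=DEV
SYS_PWD=welcome1
{
  echo $SYS_PWD
  for SCHEMA in UMS MDS WLS WLS_RUNTIME STB OPSS IAU IAU_APPEND IAU_VIEWER SOAINFRA ESS MFT; do
    echo ${PREFIX}_${SCHEMA}
  done
} > rcuSOAPasswords.txt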

Conclusion
As said, these scripts help in installing the software and creating the repository. They use shell scripts, but it should not be too hard to translate them to ANT or other tooling like Ansible or Puppet if you're into one of those. To me it would be a nice finger-practice to translate it to ANT to be able to dynamically adapt the response files. I'll probably do that in the near future. And it would be a nice learning path to implement this in Ansible or Puppet.

But first for me it would be a challenge to create a domain script in WLST. So hopefully I get to write about that soon.

Have you seen the menu?

Fri, 2016-04-22 05:49
And did you like it? Hardly possible to miss, I think. It kept me nicely busy for a few hours. I got some great examples, and this one is purely based on CSS and unordered lists in combination with anchors. Unfortunately the menu worked with non-classed <ul>, <li> and <a> tags. So embedding the CSS caused my other elements to be redefined (it even redefined the padding of all elements).

But with some trial and error I got it working in a subclassed form. And I like it, do you?

I also found that besides articles, you can also create pages in Blogger. I did not know about that, completely overlooked it. I think I'll try something out, so if you're a regular visitor, you might find that there's work in progress.

The wish for a menu popped up a little while ago, and I kept thinking about it, to be able to get some structure in my articles. From the beginning I tagged every article, but not with a real plan. So I got tags stating 'Oracle BPM Suite', but also 'SOA Suite'. And 'Database', but also 'Database 11g'. Not so straightforward and purposeful.

But a purpose arose. For a longer while I'm thinking about if writing a book would be something for me. I like to write articles on a (ir)regular basis. On this blog you can find a broad range of subjects. But could I do a longer series on a particular subject? And could it lead to a more structured and larger form like a book? I learned from a former co-worker that he had this idea to write articles on a regular basis to buildup a book gradually. And I like that. But what subject would it be? My core focus area is SOA Suite and BPM Suite. But loads of books are written about that. Well, maybe not loads, but at least some recognized, good ones. And even PCS (Process Cloud Service) and ICS (Integration Cloud Service) are (being) covered.

But when Oracle acquired Collaxa in 2004, I worked at Oracle Consulting and got to work with it in the very early days. And I think in the Netherlands at least, I was (one of) the first one(s) from Oracle to provide training on BPEL, at least for Oracle University in the Netherlands. So I got involved in BPEL from the first hour Oracle laid hands on it. Could BPEL be a subject I could cover? Of course I'll not be the first one to cover that. Both on BPEL 1.1 as on 2.0 you can google up a book (is that already a term?), the one on 1.1 I still had stacked in a pile behind another one on my bookshelf.

So let's see where this leads me. You can expect a series on BPEL, in parallel of other articles on subjects that come around during my work. From real novice (do you already use scopes and local variables?), up to some more advanced stuff (how about dynamic partnerlinks; are you already into Correlation Sets, transaction handling, BPEL and Spring? )

It might bleed to death. It might become a nice series and nothing more than that. And it might turn out a real informative stack of articles that could be re-edited into a book. But when I'm at it, turning to cover the more advanced subjects, I plan to poll for what you want to have covered. I think I do know something about BPEL. But as you read with me, maybe you can point me to subjects I don't know yet. Consider yourself invited to read along.

XA Transactions with SOASuite JMS Adapter

Tue, 2016-04-19 12:39
JMS is perfect for setting transaction boundaries and in OSB it is pretty clear how JMS transactions are handled. However, in SOASuite using the JMS adapter the SOA Infrastructure is handling your JMS transactions by default, and messages are removed from the queue right away because the Gets are auto-acknowledged. If something fails, you would expect that messages are rolled back to the JMS queue and eventually moved to the error queue. But, again by default, not with the SOASuite/JMS Adapter. In that case the BPEL process, for instance, fails and gets in a recovery state, to be handled in the 'Error Hospital' in Enterprise Manager. But I want JMS to handle it! (Says the little boy...)

So how do we accomplish that? Today I got the chance to figure that out.

Start with a JMS setup with a JMS Server, a Module and a Queue, with an Error Queue that is configured to be the error destination of the first queue. On the first queue set a redelivery limit of 3 and a redelivery delay of, for instance, 60000 ms (or something like that). I'm not going into that here.
Create also a Connection Factory in the JMS Module with a proper jndi, something like 'jms/myApplicationCF'.

In the JMS adapter on SOASuite there are several OutboundConnectionFactories already pre-configured. It is quite convenient to use the one with JNDI 'eis/wls/Queue'. But if you look into that, you'll see that it uses the default WebLogic JMS Connection factory 'weblogic.jms.XAConnectionFactory'. Not much wrong with that, but you can't configure that for your own particular situation. Moreover it is configured with 'AcknowledgeMode' = 'AUTO_ACKNOWLEDGE'. As you can read in the docs there are three values for the AcknowledgeMode:
  • DUPS_OK_ACKNOWLEDGE, for consumers that are not concerned about duplicate messages
  • AUTO_ACKNOWLEDGE, in which the session automatically acknowledges the receipt of a message
  • CLIENT_ACKNOWLEDGE, in which the client acknowledges the message by calling the message's acknowledge method
So create a new outbound connection factory, with a JNDI like 'eis/jms/MyApp'. 
Now, apparently we don't want 'AUTO_ACKNOWLEDGE', because that would cause the message to get acknowledged 'On Get'. So you could roll back until 'Saint Juttemis' (as we say in our family) but it won't go back on the queue. Dups aren't ok with me, so I'll choose 'CLIENT_ACKNOWLEDGE' here. Then there's another option: 'IsTransacted'. I want that one on 'true'. Then in ConnectionFactoryLocation, you'd put the JNDI of your JMS Connection factory, in my example 'jms/myApplicationCF'.

So you'll get something like:
 
On the tab Transaction, validate that the transaction support is set to a XA Transaction:

Having done that, you can update/redeploy your JMS Adapter with the changed plan. I figure that how to do that is straightforward, especially when you've done that with DB Adapters already.
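If you'd rather script that step too, a rough sketch with weblogic.Deployer could look like this; the admin URL, credentials and plan location are placeholders for your own environment, and it assumes the adapter deployment carries its default name JmsAdapter:
#!/bin/bash
. $PWD/fmw12c_env.sh
# weblogic.Deployer needs the WebLogic classes on the classpath
. $FMW_HOME/wlserver/server/bin/setWLSEnv.sh
# push the updated deployment plan to the JmsAdapter deployment
java weblogic.Deployer -adminurl t3://darlin-vce-db:7001 -username weblogic -password welcome1 \
  -update -name JmsAdapter -plan /u01/app/oracle/plans/JmsAdapterPlan.xml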

I created two SOA Projects (actually I adapted those created by a co-worker). The first one is TestPutJMS:

The project is straight forward with a WSDL refering to an xsd with two fields:







The bpel is then as follows:

It assigns the request to the input variable of the invoke of the JMSPut. The JMS_Put is a JMS-adapter configuration, referring to the JNDI 'eis/jms/myApp', defined in the JMS Adapter.

After that there's an if on the action field, where in the case of a certain value a fault is thrown, to validate if the Put is rolled back.

In my case it's more interesting to look at the Get part. That project is as follows:

In this case there's a mediator wired to the get adapter config, also referring to the 'eis/jms/myApp' JNDI. The mediator routes to the bpel process. The transaction handling of a mediator is simple and straight-forward:
  • If there's a transaction it will subscribe to that,
  • if there isn't, a new transaction is created.
The JMS Adapter creates a new XA Transaction. On the JMS Adapter on WLS we configured that no auto acknowledge should occur, and that we want a transaction. Thus, this is the transaction that is re-used by the Mediator. But how about the BPEL? The BPEL is asynchronous, request only, since it has no way to reply the response; that would have to go on a response queue.
By default you would have the property 'bpel.config.oneWayDeliveryPolicy' set to 'async.persist'. But that would mean that a new thread is started. Setting it to 'sync' causes the thread that is started by the Adapter to be reused. I also want to subscribe to the already running transaction of the JMS Adapter as it is passed through by the mediator. Setting the property 'bpel.config.transaction' to 'required' will take care of that. Summarized, I set the following properties on the BPEL:
  • bpel.config.transaction: required => subscribe to already opened transaction
  • bpel.config.oneWayDeliveryPolicy: sync => reuse existing running thread


The process looks like:

 
Here I have an if with a conditional throw of an exception as well. Based on the value of the action element I can have it throw a custom exception, which will cause the BPEL to fail and the transaction to be rolled back.
When I have a redelivery limit of 3, I'll get three retries, so in total 4 tries of the BPEL process. After that, the message is moved to the JMS Error Queue.

A nice article on JMS Transactions from the A-Team can be found here. However, the setup above leaves the redelivery handling to JMS. So, in 12cR2 that is, I find that the properties of the JMS Queue apparently take precedence over the settings I did in the TestJMSGet service on the composite:

I hope this article clears things up regarding the JMS Adapter configuration for transactions.

Extend your SOA Domain with Insight

Mon, 2016-04-11 11:43
Lately I wrote about how to install RealTime Integration Business Insight. It's about installing the software, actually. In the quickstart you'll read that you actually have to extend your domain as well.

It actually states that you can install it in your SOA QuickStart installation as well, but I didn't try that (yet).

However, you need to extend your domain with the following items:
  • Insight SOA Agent 12.2.1 [soa]
  • Insight Service Bus Agent 12.2.1 [osb]
  • Insight 12.2.1 [soa]
To do so, shut down your domain (if not done so already), but start (or leave up) your infra database, as I found that to be needed.

Set your FMW environment, as I put in my fmw12c_env.sh script:
[oracle@darlin-vce-db bin]$ cat ~/bin/fmw12c_env.sh
#!/bin/bash
echo set Fusion MiddleWare 12cR2 environment
export JAVA_HOME=/usr/java/jdk1.8.0_74
export FMW_HOME=/u01/app/oracle/FMW12210
export WL_HOME=${FMW_HOME}/wlserver
export NODEMGR_HOME=/u01/app/work/domains/soabpm12c_dev/nodemanager

export SOA_HOME=$FMW_HOME/soa
export OSB_HOME=$FMW_HOME/osb
export MFT_HOME=$FMW_HOME/mft
#
echo call setWLSEnv.sh
. $FMW_HOME/wlserver/server/bin/setWLSEnv.sh
export PATH=$FMW_HOME/oracle_common/common/bin:$WL_HOME/common/bin/:$WL_HOME/server/bin:$PATH
... and navigate to the $FMW_HOME/oracle_common/common/bin folder and start config.sh:

[oracle@darlin-vce-db ~]$ . fmw12c_env.sh
set Fusion MiddleWare 12cR2 environment
call setWLSEnv.sh
CLASSPATH=/usr/java/jdk1.8.0_74/lib/tools.jar:/u01/app/oracle/FMW12210/wlserver/modules/features/wlst.wls.classpath.jar:

PATH=/u01/app/oracle/FMW12210/wlserver/server/bin:/u01/app/oracle/FMW12210/wlserver/../oracle_common/modules/org.apache.ant_1.9.2/bin:/usr/java/jdk1.8.0_74/jre/bin:/usr/java/jdk1.8.0_74/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/home/oracle/.local/bin:/home/oracle/bin:/u01/app/oracle/FMW12210/wlserver/../oracle_common/modules/org.apache.maven_3.2.5/bin

Your environment has been set.
[oracle@darlin-vce-db ~]$ cd $FMW_HOME/oracle_common/common/bin
[oracle@darlin-vce-db bin]$ ls
clonedunpack.sh config_builder.sh pack.sh reconfig.sh
commBaseEnv.sh config.sh prepareCustomProvider.sh setHomeDirs.sh
commEnv.sh configWallet.sh printJarVersions.sh unpack.sh
commExtEnv.sh getproperty.sh qs_config.sh wlst.sh
[oracle@darlin-vce-db bin]$ ./config.sh

In the first screen set the radio button to 'Update an existing domain':

Then Click Next, and check the items listed above:

Click Next, Next, ... Finish.
If you would have checked the 'Deployments' checkbox under the Advanced Configuration, you could have reviewed that the particular deployments are automatically targeted to the BAM, OSB and SOA clusters.

After this you can start your servers and start using Insight, for example beginning with the setup of the Insight Demo Users. This is properly described in the Quickstart Guide. But, as I'm on to it, let me try right away. The demo users setup is downloadable here. Download it and unzip it in a folder on your server.

First we'll have to set the environment. So I call my neat fmw12c_env.sh script first (in a new terminal), and explicitly set the $MW_HOME variable:
[oracle@darlin-vce-db bin]$ . fmw12c_env.sh
set Fusion MiddleWare 12cR2 environment
call setWLSEnv.sh
CLASSPATH=/usr/java/jdk1.8.0_74/lib/tools.jar:/u01/app/oracle/FMW12210/wlserver/modules/features/wlst.wls.classpath.jar:

PATH=/u01/app/oracle/FMW12210/wlserver/server/bin:/u01/app/oracle/FMW12210/wlserver/../oracle_common/modules/org.apache.ant_1.9.2/bin:/usr/java/jdk1.8.0_74/jre/bin:/usr/java/jdk1.8.0_74/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/home/oracle/.local/bin:/home/oracle/bin:/u01/app/oracle/FMW12210/wlserver/../oracle_common/modules/org.apache.maven_3.2.5/bin

Your environment has been set.
[oracle@darlin-vce-db bin]$ export MW_HOME=$FMW_HOME
[oracle@darlin-vce-db bin]$ echo $MW_HOME
/u01/app/oracle/FMW12210
[oracle@darlin-vce-db bin]$ echo $JAVA_HOME
/usr/java/jdk1.8.0_74
[oracle@darlin-vce-db bin]$ echo $ANT_HOME
/u01/app/oracle/FMW12210/wlserver/../oracle_common/modules/org.apache.ant_1.9.2

We're going to call an ant script that apparently needs the following variables set:
  • MW_HOME= <Middleware home of the environment>
  • JAVA_HOME= <Location of java home>
  • ANT_HOME=$MW_HOME/oracle_common/modules/org.apache.ant_1.9.2
  • PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
The first one is not set by my script (I called it $FMW_HOME), so I needed to set $MW_HOME to $FMW_HOME; the last three are set by my script.
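Summarized, before calling ant something like this should be in effect (the values are simply taken from the list above and my environment script):
export MW_HOME=$FMW_HOME
export ANT_HOME=$MW_HOME/oracle_common/modules/org.apache.ant_1.9.2
export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH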

Running the script against a developer topology domain (everything in the AdminServer or DefaultServer in the SOA QuickStart) will probably go ok. But stubborn guy as I am, I try to do this in a more production-like topology with separate SOA, OSB and BAM clusters. So it turns out that you need to adapt the insight.properties that is in the bin folder of the InsightDemoUserCreation.zip (also when you're not like me, you'll need to review it...).
After editing, mine looks like:

#Insight FOD Automation file

wls.host = darlin-vce-db
wls.port = 7001
soa_server_port = 7005
bam_server_port = 7006
userName = weblogic
passWord = welcome1
oracle_jdbc_url = jdbc:oracle:thin:@darlin-vce-db:1521:ORCL
db_soa_user = DEV_SOAINFRA
oracle_db_password = DEV_SOAINFRA
db_mds_user = DEV_MDS
mds.password = DEV_MDS
jdbc_driver = oracle.jdbc.OracleDriver

When all is right then you can run:
[oracle@darlin-vce-db bin]$ cd /media/sf_Stage/InsightDemoUserCreation/bin/
[oracle@darlin-vce-db bin]$ ant createInsightDemoUsers

Unfortunately I can't show you correct output since, although I seem to have set my properties correctly, I got failures. It turns out that my server (all in one VM) ran so slowly that Insight could not be started due to timeouts in getting a database connection...
After restarting BAM all went well, except for the exceptions indicating that the users were already created.

Use external property file in WLST

Thu, 2016-04-07 00:41
I frequently create a WLST script that needs properties. Not so exciting, but how do you do that in a convenient way, and how do you detect in a clean way that properties aren't set?

You could read a property file as described here. The basics are to use in fact Java to create a Properties object and a FileInputStream to read it:
#Script to load properties file.

from java.io import File
from java.io import FileInputStream
from java.util import Properties


#Load properties file in java.util.Properties
def loadPropsFil(propsFil):
  inStream = FileInputStream(propsFil)
  propFil = Properties()
  propFil.load(inStream)
  return propFil

I think the main disadvantage is that it clutters the script code and you need to call 'myPropFil.getProperty(key)' to get the property value.

Following the documentation you can use the commandline option '-loadProperties propertyFilename' to explicitly provide a property file. I found this actually quite clean. Every property in the file becomes automatically available as a variable in your script.

Besides that I found a terrific blog post on error handling in WLST. It states that with 'except NameError, e:' you can handle the reference to a variable that is not declared earlier.

I combined these two sources to come up with a script template that allows me to provide property files for different target environments as a commandline option, while detecting whether properties are provided. So let's assume you create a property file named, for instance, 'localhost.properties' like:
#############################################################################
# Properties for localhost Integrated Weblogic
#
# @author Martien van den Akker, Darwin-IT Professionals
# @version 1.0, 2016-04-06
#
#############################################################################
#
# Properties for localhost
adminUrl=localhost:7101
adminUser=weblogic
adminPwd=welcome1
clustername=LocalCluster
# Generic properties for creating JMS components
#jmsFileStoresBaseDir=/app/oracle/config/cluster_shared/filestore/
jmsFileStoresBaseDir=c:/Data/JDeveloper/SOA/filestore
#Filestore 01
...

Then you can use that with the following script, named for instance 'createJMSServersWithFileStoreV2.py':
#############################################################################
# Create FileStores and JMS Servers
#
# @author Martien van den Akker, Darwin-IT Professionals
# @version 1.0, 2016-04-06
#
#############################################################################
# Modify these values as necessary
import sys, traceback
scriptName = 'createJMSServersWithFileStoreV2.py'
#
#
def usage():
  print 'Call script as: '
  print 'Windows: wlst.cmd'+scriptName+' -loadProperties localhost.properties'
  print 'Linux: wlst.sh'+scriptName+' -loadProperties environment.properties'
  print 'Property file should contain the following properties: '
  print "adminUrl='localhost:7101'"
  print "adminUser='weblogic'"
  print "adminPwd='welcome1'"

def main():
  try:
    #Connect to administration server
    print '\nConnect to AdminServer via '+adminUrl+' with user '+adminUser
    connect(adminUser, adminPwd, adminUrl)
    ...
  except NameError, e:
    print 'Apparently properties not set.'
    print "Please check the property: ", sys.exc_info()[0], sys.exc_info()[1]
    usage()
  except:
    apply(traceback.print_exception, sys.exc_info())
    stopEdit('y')
    exit(exitcode=1)

#call main()
main()
exit()

You can call it like 'wlst createJMSServersWithFileStoreV2.py -loadProperties localhost.properties'. If you don't provide a property file you'll get:
e:\wls>wlst createJMSServersWithFileStoreV2.py

Initializing WebLogic Scripting Tool (WLST) ...

Welcome to WebLogic Server Administration Scripting Shell

Type help() for help on available commands

Apparently properties not set.
Please check the properties: exceptions.NameError adminUrl
Call script as:
Windows: wlst.cmdcreateJMSServersWithFileStoreV2.py -loadProperties localhost.properties
Linux: wlst.shcreateJMSServersWithFileStoreV2.py -loadProperties environment.properties
Property file should contain the following properties:
adminUrl='localhost:7101'
adminUser='weblogic'
adminPwd='welcome1'


Exiting WebLogic Scripting Tool.


e:\wls>

Pretty clean. You could even use the 'except NameError, e:' construct to conditionally execute code when properties are set by ignoring/handling the situation when particular properties are intentionally not provided.
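If you call such scripts often for different environments, a small bash wrapper keeps the invocation short; just a sketch, reusing the file names from above and assuming wlst.sh is on your PATH:
#!/bin/bash
# usage: ./runCreateJMSServers.sh [environment], defaults to localhost
ENV=${1:-localhost}
wlst.sh createJMSServersWithFileStoreV2.py -loadProperties ${ENV}.properties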

Install Oracle Real-Time Integration Business Insight

Fri, 2016-04-01 07:50
Yes, Oracle FMW Integration Insight is available, as I wrote in an earlier post. You can download it here.
But of course we're very curious on how to install it. Do I have to unzip it into my FMW_HOME? Is there a nice Oracle Installer that I can run silently? No, none of that: it comes as a set of OPatch patches on SOASuite 12.2.1:
  1. p22189824_122100_Generic.zip: OPatch containing Oracle Real-Time Integration Business Insight 12.2.1.0.0
  2. p22655174_122100_Generic.zip: OPatch containing updates to SOA and BAM 12.2.1.0.0 
  3. p22659236_122100_Generic.zip: OPatch containing updates to Service Bus 12.2.1.0.0
Following the README.txt in the zip the correct order is to first install ORIBI and then patch SOA&BPM and then SB.


Earlier I wrote about installing BPM QuickStart under Linux. Based on that I created a script to install SOASuite. Maybe I should write about that in another post. I haven't tried if it's possible to install this in a SOA or BPM QuickStart, but I did it in a full FMW installation that I built using my scripts. So this installation I scripted following the scripting work I've done earlier.


For this setup I have 2 folders:
  • scripts: with the scripts.
  • ofm_integration_insight_1221: with the downloaded ofm_integration_insight_12.2.1.0.0_disk1_1of1.zip
The scripts folder contains two scripts:
fmw12c_env.sh:
#!/bin/bash
echo set Fusion MiddleWare 12cR2 environment
export JAVA_HOME=/usr/java/jdk1.8.0_74
export FMW_HOME=/u01/app/oracle/FMW12210
export SOA_HOME=$FMW_HOME/soa
export OSB_HOME=$FMW_HOME/osb
export MFT_HOME=$FMW_HOME/mft

This provides the settings for the FMW_HOME and the JAVA_HOME. And the product homes I needed for my SOA/BPM Suite installation (I definitely should write that down!).

The actual install script is installOII.sh:
#!/bin/bash
. $PWD/fmw12c_env.sh
#
export CD=$PWD
export OII_INSTALL_HOME=$CD/../ofm_integration_insight_1221
export OII_INSTALL_ZIP=ofm_integration_insight_12.2.1.0.0_disk1_1of1.zip
export OPATCH_SOABPM_ZIP=p22655174_122100_Generic.zip #OPatch containing updates to SOA and BAM 12.2.1.0.0
export OPATCH_SOABPM_NR=22655174
export OPATCH_OSB_ZIP=p22659236_122100_Generic.zip #OPatch containing updates to Service Bus 12.2.1.0.0
export OPATCH_OSB_NR=22659236
export OPATCH_OII_ZIP=p22189824_122100_Generic.zip #OPatch containing Oracle Real-Time Integration Business Insight 12.2.1.0.0
export OPATCH_OII_NR=22189824
export PATCHES_HOME=$FMW_HOME/OPatch/patches
export ORACLE_HOME=$FMW_HOME
# Unzip OII Install zip
if [ ! -f "$OII_INSTALL_HOME/$OPATCH_OII_ZIP" ]; then
  if [ -f "$OII_INSTALL_HOME/$OII_INSTALL_ZIP" ]; then
    echo Unzip $OII_INSTALL_HOME/$OII_INSTALL_ZIP to $OII_INSTALL_HOME
    unzip $OII_INSTALL_HOME/$OII_INSTALL_ZIP -d $OII_INSTALL_HOME
  else
    echo $OII_INSTALL_HOME/$OII_INSTALL_ZIP does not exist
  fi
fi
#
echo Check zips
cd $OII_INSTALL_HOME
md5sum -c patches.MD5
cd $CD
#
# Check patches folder
if [ ! -d "$PATCHES_HOME" ]; then
  mkdir $PATCHES_HOME
else
  echo $PATCHES_HOME available
fi
#
# Unzip OII patch
if [ ! -d "$PATCHES_HOME/$OPATCH_OII_NR" ]; then
  if [ -f "$OII_INSTALL_HOME/$OPATCH_OII_ZIP" ]; then
    echo Unzip $OII_INSTALL_HOME/$OPATCH_OII_ZIP to $PATCHES_HOME
    unzip $OII_INSTALL_HOME/$OPATCH_OII_ZIP -d $PATCHES_HOME
    echo Apply OII Patch
    cd $PATCHES_HOME/$OPATCH_OII_NR
    $ORACLE_HOME/OPatch/opatch apply
  else
    echo $OII_INSTALL_HOME/$OPATCH_OII_ZIP does not exist!
  fi
else
  echo OII Patch $PATCHES_HOME/$OPATCH_OII_NR already available
fi
cd $CD
# Unzip SOA&BPM patch
if [ ! -d "$PATCHES_HOME/$OPATCH_SOABPM_NR" ]; then
  if [ -f "$OII_INSTALL_HOME/$OPATCH_SOABPM_ZIP" ]; then
    echo Unzip $OII_INSTALL_HOME/$OPATCH_SOABPM_ZIP to $PATCHES_HOME
    unzip $OII_INSTALL_HOME/$OPATCH_SOABPM_ZIP -d $PATCHES_HOME
    echo Apply SOA BPM Patch
    cd $PATCHES_HOME/$OPATCH_SOABPM_NR
    $ORACLE_HOME/OPatch/opatch apply
  else
    echo $OII_INSTALL_HOME/$OPATCH_SOABPM_ZIP does not exist!
  fi
else
  echo SOA-BPM Patch $PATCHES_HOME/$OPATCH_SOABPM_NR already available
fi
cd $CD
# Unzip OSB patch
if [ ! -d "$PATCHES_HOME/$OPATCH_OSB_NR" ]; then
  if [ -f "$OII_INSTALL_HOME/$OPATCH_OSB_ZIP" ]; then
    echo Unzip $OII_INSTALL_HOME/$OPATCH_OSB_ZIP to $PATCHES_HOME
    unzip $OII_INSTALL_HOME/$OPATCH_OSB_ZIP -d $PATCHES_HOME
    echo Apply OSB Patch
    cd $PATCHES_HOME/$OPATCH_OSB_NR
    $ORACLE_HOME/OPatch/opatch apply
  else
    echo $OII_INSTALL_HOME/$OPATCH_OSB_ZIP does not exist!
  fi
else
  echo OSB Patch $PATCHES_HOME/$OPATCH_OSB_NR already available
fi
cd $CD
echo Finished installing Oracle Fusion MiddleWare Integration Insight


The script first unzips the downloaded ofm_integration_insight_12.2.1.0.0_disk1_1of1.zip into:
  • p22189824_122100_Generic.zip
  • p22655174_122100_Generic.zip
  • p22659236_122100_Generic.zip
  • patches.MD5
  • README.txt
Then it performs 'md5sum -c patches.MD5' to check the zips, but it ignores the results, it just prints them.
Then for each patch it checks if the patch is already unzipped in the FMW_HOME/OPatch/patches folder. If so, it just assumes that it's applied as well. If not, the patch zip is unzipped in the patches folder. And then it performs opatch apply.
OPatch will ask if you want to proceed (answer with 'y') and if the system is ready to be patched (again answer with 'y'). For the SB patch (the last in the list) it will look like:
Apply OSB Patch
Oracle Interim Patch Installer version 13.3.0.0.0
Copyright (c) 2016, Oracle Corporation. All rights reserved.


Oracle Home : /u01/app/oracle/FMW12210
Central Inventory : /u01/app/oraInventory
from : /u01/app/oracle/FMW12210/oraInst.loc
OPatch version : 13.3.0.0.0
OUI version : 13.3.0.0.0
Log file location : /u01/app/oracle/FMW12210/cfgtoollogs/opatch/22659236_Apr_01_2016_09_37_29/apply2016-04-01_09-37-22AM_1.log


OPatch detects the Middleware Home as "/u01/app/oracle/FMW12210"

Verifying environment and performing prerequisite checks...
OPatch continues with these patches: 22659236

Do you want to proceed? [y|n]
y
User Responded with: Y
All checks passed.

Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/oracle/FMW12210')


Is the local system ready for patching? [y|n]
y
User Responded with: Y
Backing up files...
Applying interim patch '22659236' to OH '/u01/app/oracle/FMW12210'

Patching component oracle.osb.server, 12.2.1.0.0...

Patching component oracle.osb.server, 12.2.1.0.0...
Patch 22659236 successfully applied.
Log file location: /u01/app/oracle/FMW12210/cfgtoollogs/opatch/22659236_Apr_01_2016_09_37_29/apply2016-04-01_09-37-22AM_1.log

OPatch succeeded.
Finished installing Oracle Fusion MiddleWare Integration Insight

If you have a home with only OSB or only SOA-BPM, adapt the script yourself to not patch the product that is not installed.

Oh, I did not check on the install; for now I assume it worked. Next step for me is to describe the SOA/BPM install and check out the Integration Insight product.
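By the way, a quick way to check whether the patches actually registered (just a sketch) is to list the Oracle home inventory and grep for the patch numbers:
. $PWD/fmw12c_env.sh
export ORACLE_HOME=$FMW_HOME
$ORACLE_HOME/OPatch/opatch lsinventory | grep -E "22189824|22655174|22659236"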

Auto DDL: delete obsolete columns from table

Wed, 2016-03-30 05:55
A quick one. In the past I used to generate DDL based on queries, like the following, but I find myself re-inventing them again. So to have it saved for my offspring: here's one on deleting obsolete columns as generated on importing an Excel sheet in SQL Developer:


declare
  l_schema_name varchar2(30) := 'MY_SCHEMA';
  l_table_name  varchar2(30) := 'A_TABLE';
  cursor c_cols is
    select column_name
    from   all_tab_columns col
    where  col.table_name = l_table_name
    and    col.owner = l_schema_name
    and    col.column_name like 'COLUMN%';
begin
  for r_cols in c_cols loop
    execute immediate 'alter table '||l_schema_name||'.'||l_table_name||' drop column '||r_cols.column_name;
  end loop;
end;
/

And here's one to generate a check constraint on all indicator ('J'/'N') columns of a table:

declare
  l_schema_name         varchar2(30) := 'MY_SCHEMA';
  l_table_name          varchar2(30) := 'A_TABLE';
  l_constraint_name_pfx varchar2(30) := 'XXX_ALIAS_CHK';
  l_idx                 pls_integer := 1;
  cursor c_cols is
    select column_name
    from   all_tab_columns col
    where  col.table_name = l_table_name
    and    col.owner = l_schema_name
    and    col.column_name like 'IND_%';
begin
  for r_col in c_cols loop
    execute immediate 'ALTER TABLE '||l_schema_name||'.'||l_table_name||' ADD CONSTRAINT '||l_constraint_name_pfx||l_idx||' CHECK ('||r_col.column_name||' in (''J'',''N'')) ENABLE';
    l_idx := l_idx+1;
  end loop;
end;
/

Real-Time Integration Business Insight Available

Wed, 2016-03-23 11:22
Today Real-Time Integration Business Insight is available. I wrote about it in my summary of the OPN FMW Community forum. I hope I can get into it in the near future.

Enable Process Analytics in BPM12c

Wed, 2016-03-23 11:07
To be able to use BAM12c together with BPM12c, you'll need to enable process analytics. Only when that is enabled will BAM12c write the sample data to the process cubes/star schema.

To do so you'll need to go to Enterprise Manager (e.g. http://darlin-vce-db:7001/em). Then open up the System MBean Browser. This can be started from the soa-infra:

And then from the SOA Infrastructure -> Administration -> System MBean Browser:


However, you can also start it a little quicker from the Weblogic Domain menu:
In the MBean Browser look for 'Application Defined MBeans':
Then look for 'oracle.as.soainfra.config' -> 'your server' -> AnalyticsConfig -> analytics:

Then in the pane make sure that both 'DisableAnalytics' and 'DisableProcessMetrics' are set to false:


 And click 'Apply'.

Above you'll see the layout of 12.2.1, but in 12.1.3 it works the same. Restart the SOA Server after that.

I'm not the first one to write about these changes, but I found that you can only update these fields if you have started the BAM server at least once. Apparently the BAM Server registers itself so that only after that you can update and apply these attributes.




BAM 12c: Extend Data Objects

Wed, 2016-03-23 10:42
BAM 12c is a huge improvement over 11g. The best thing, I think, is that it is quite a lot easier to create a dashboard. There are several tutorials on BAM, for instance at the BAM12c site, so I'm not going to explain how to create a dashboard here.

One thing however on business queries: the examples mostly start with a BPM process and then query from the Process or Activity Data Object as created on deployment of the particular process. However, often you'll find that you want to filter on a certain date range, for instance processes started less than a day or a week ago. Or activities running less than an hour, or between an hour and two hours, two and three hours, or longer. But then you'll find that you can't filter in the Business Queries on a date function. For instance you can't filter on something like '{process start date} < now() - 7'.

To solve that you can add extra Calculated Fields that return yes or no, or 1 or 0, if a certain date calculation condition is met. To do so, go to the administration tab of the BAM Composer (e.g. http://darlin-vce-db:7006/bam/composer):
 
Then you can expand the Data Objects and you'll find that the process that is deployed resulted in two Data Objects, one for the Activities and one for the Process instances:

By the way, to get those you need to have process analytics enabled. I'll explain that in another blog.

Click for instance on the CustomerSurvey Activity, then on the tab 'Calculated Fields' and then on 'Add Calculated Field':
You need to provide a name that has no spaces, only lowercase or uppercase letters and underscores. Then you can provide a display name that is shown in the designer and in flat queries. The column type can be measure, dimension or attribute, but in this case you'll want attribute, to be able to filter on it. In this case I returned 'J' or 'N' for 'Ja' (yes) or 'Nee' (no). This is sufficient for filtering. But if you want to count/summarize instances that are running less than one hour, or between one and two hours, etc., then you might want to return 1 or 0.

Click on OK and then save:

By clicking on the pencil-icon you can edit the field.

I'll provide some other examples that I found helpful for the activity Data Object:



activity_started_lt_week_ago
  Display Name: Activity started less than week ago
  Column Type: Attribute
  Expression: IF(DATEDIFF(SQL_TSI_DAY,{Activity Start Time},now())<=7)THEN("J")ELSE("N")
  Description: Is the activity started at most 7 days ago? (J/N)

activity_started_lt_day_ago
  Display Name: Activity started less than day ago
  Column Type: Attribute
  Expression: IF(DATEDIFF(SQL_TSI_HOUR,{Activity Start Time},now())<=24)THEN("J")ELSE("N")
  Description: Is the activity started at most 24 hours ago? (J/N)

Activiteit_Looptijd_min
  Display Name: Activiteit Loop tijd (min)
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")THEN(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now()))ELSE(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},{Activity End Time}))
  Description: Actual running time of the activity instance. If the instance is active, then the result is the difference between the start time and the current time (NOW()), otherwise it is the difference between the start time and the end time. This is apparently different from the predefined "Activity Running Time" field, because of the sampling moments: sometimes that running time is near zero while the instance is still active.

Activiteit_Looptijd_lt_1hr
  Display Name: Activiteit Looptijd < 1 uur
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")&&(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())<60)THEN(1)ELSE(0)
  Description: Is the activity running less than an hour?

Activiteit_Looptijd_lt_2hr
  Display Name: Activiteit Looptijd < 2 uur
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")&&(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())>=60&&DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())<120)THEN(1)ELSE(0)
  Description: Is the activity running more than one but less than two hours?

Activiteit_Looptijd_lt_3hr
  Display Name: Activiteit Looptijd < 3 uur
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")&&(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())>=120&&DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())<180)THEN(1)ELSE(0)
  Description: Is the activity running more than two but less than three hours?

Activiteit_Looptijd_gt_max
  Display Name: Activiteit Looptijd > max
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")&&(DATEDIFF(SQL_TSI_MINUTE,{Activity Start Time},now())>180)THEN(1)ELSE(0)
  Description: Is the activity running 3 hours or longer?

Activiteit_is_open
  Display Name: Activiteit is open?
  Column Type: Attribute
  Expression: IF({Activity Instance Status}=="ACTIVE")THEN("J")ELSE("N")
  Description: Is the activity still open?
For the process Data Objects these are a good starting point:
Process_Running_Time_Min_attr
  Display Name: Process Running Time (Min) Attr
  Column Type: Attribute
  Expression: {Process Running Time (millisecs)}/600000
  Description: Number of minutes a process has been running. There is another comparable field already defined, but that one is of type 'Measure'; you can't use that for analytic functions like AVG, MIN, MAX, etc.

process_started_lt_week_ago
  Display Name: Process started less than week ago
  Column Type: Attribute
  Expression: IF(DATEDIFF(SQL_TSI_DAY,{Process Start Time},now())<=7)THEN("J")ELSE("N")
  Description: Is the process instance started at most 7 days ago? (J/N)

process_started_lt_day_ago
  Display Name: Process started less than day ago
  Column Type: Attribute
  Expression: IF(DATEDIFF(SQL_TSI_HOUR,{Process Start Time},now())<=24)THEN("J")ELSE("N")
  Description: Is the process instance started at most 24 hours ago? (J/N)

Process_Looptijd_in_min
  Display Name: Process Looptijd (min)
  Column Type: Attribute
  Expression: IF({Process Instance Status}=="ACTIVE")THEN(DATEDIFF(SQL_TSI_MINUTE,{Process Start Time},now()))ELSE(DATEDIFF(SQL_TSI_MINUTE,{Process Start Time},{Process End Time}))
  Description: Actual running time of the process instance. If the instance is active, then the result is the difference between the start time and the current time (NOW()), otherwise it is the difference between the start time and the end time. This is apparently different from the predefined "Process Running Time" field, because of the sampling moments: sometimes that running time is near zero while the instance is still active.

So these help you to filter and aggregate on activity and process running times. Sorry for the Dutch names, but I figure you can get the meaning.

The expressions are based on info I got from the user guide. You can find the 12.2.1 user guide over here. The 12.1.3 can be found here. Look for chapter 13.8 (in the 12.2.1 user guide) or 14.8 (in the 12.1.3 user guide).

Unable to logon to BPM Workspace

Tue, 2016-03-22 03:30
Yesterday I tried to test a demo BPM process with a few tasks, but I couldn't log on to the workspace. I couldn't find an error, except for:

<[ServletContext@452818297[app:OracleBPMWorkspace module:/bpm/workspace path:null spec-version:3.1]] Servlet failed with an Exception java.lang.IllegalStateException: Response already committed

I tried several suggested solutions, like setting the listen address, but those did not work. What solved the issue was setting the ServerURL attribute in the System MBean Browser of the soa-infra.
To do so in BPM 12cR2, go to the Enterprise Manager (e.g. http://darlin-vce-db:7001/em as it is in my VM) and go to the soa-infra:
Then in the SOA Infrastructure -> Administration menu open the System MBean Browser:
In the System MBean Browser look for the Application Defined MBeans and expand it:
Within the Application Defined MBeans, expand 'oracle.as.soainfra.config', then your server  and then 'SoaInfraConfig' and click on soa-infra:
Find the attribute ServerURL and edit it to the host:port of your SOA server, including the 'http://' protocol, e.g. 'http://darlin-vce-db:7005':
 
Don't forget to hit the enter key and click on Apply:


Restart your server and it should be good to go.
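If you want to script this setting instead of using the System MBean Browser, a WLST sketch like the one below should work too. Again the admin URL, the credentials and the MBean ObjectName (the Location part in particular) are assumptions; verify them under Application Defined MBeans in your own domain first.

# setSoaServerUrl.py -- a minimal WLST sketch (assumed URL, credentials and ObjectName)
# Run it with: $MW_HOME/oracle_common/common/bin/wlst.sh setSoaServerUrl.py
connect('weblogic', 'welcome1', 't3://darlin-vce-db:7001')
# switch to the custom (Application Defined) MBean tree
custom()
cd('oracle.as.soainfra.config')
# the Location part should match your SOA server name; verify it in the System MBean Browser
cd('oracle.as.soainfra.config:Location=soa_server1,name=soa-infra,type=SoaInfraConfig,Application=soa-infra')
# point the ServerURL to the host:port of your SOA server, including the 'http://' protocol
set('ServerURL', 'http://darlin-vce-db:7005')
disconnect()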

OPN Fusion Middleware Community Forum 2016: my summary

Sat, 2016-03-19 05:24

I just boarded my flight back to Amsterdam and I'll grab the chance to put down my first impressions of this week.

Apart from catching the flu, it was a great event. As could be expected, the word describing the weather phenomenon I'm about to fly into resonated throughout the whole event. (For those who didn't catch my vague description: it's 'Cloud, Cloud, Cloud'.)

One of the presenters argued that Oracle needs to go for 100% Cloud for a while to become a recognised Cloud provider. But in a few years it will most probably turn out that in most cases we will be in a hybrid environment. And then Oracle will have the best credentials to provide for both Cloud and on-premise. By the way: Oracle will now encounter the same problems with upgrades that customers have had for years, so it can be expected that upgrading Oracle software will improve over time as well.

The most interesting session on the first day for me was the session by Grant Ronald about Application Builder Cloud Service and MAX, the Mobile Application Accelerator. The latter is a little like Forms Builder in the cloud, to build an app the 4GL way. Very cool in my opinion, and Grant was amusing as always. Smart to give him the last slot of the day.

The second day started for me with an excellent introduction and demo of Real-Time Integration Business Insight, or in short Insight. Although it leverages BAM, it has a whole different approach. With BAM you need to complete your composite with Business Indicator definitions and deploy it; only then can you define your queries, views and dashboards. But with Insight you can do it top down. You don't need to define any indicators or sensors in your composites or Service Bus projects. You define a model with milestones and Business Indicators afterwards in Insight. Then you need to map the indicators and milestones to particular points in your services. Probably this is done by a more technical person or architect, who knows the implementation of the services. For the rest, business users can implement it. Your model can even span composites that are not interrelated with an ECID, using a correlation key. This comes in handy in cases where one BPEL process writes something to a database and later another instance picks it up: logically you're still working on the same business process, but for SOA Suite it wasn't the same flow anymore. Did you notice I mentioned Service Bus? Besides BPEL it works for Service Bus as well. And since BAM is leveraged, you'll find the models back in the data sources, so you'll be able to expand from there and use the BAM functionality to join them with other sources.

I took some sessions on PCS. They announced a few new capabilities for the next few releases. For me Correlation Sets are a highly appreciated addition. We have had them in BPEL from the first release; in BPM they were added in the second patch set, and apparently they were left out of PCS, probably because correlation is hard to grasp. But when you want to interact with a running process, you need to indicate how the engine can conclude for which instance the incoming event is meant.

There is a very native integration with DOCS, so you can add documents to a folder related to the process instance. And there will be a new start event to start an instance on arrival of a document in a certain folder. This caters for an end-to-end document workflow. Tight integration with Oracle Social Network allows you to interact with peers on a task.

Another thing I took from there was that apparently we still see PCS too much as a stripped-down version of BPM Suite in the cloud. So we tend to think in ways like: 'why is this piece of BPM functionality not in PCS?' or 'in BPM we can do this, why can't we do that in PCS as well?'. But although BPM Suite was the starting point, I believe we need to start seeing PCS as a separate product with its own purpose. So let's start from PCS and think about how it could be used and grow up as a child on its own legs. That needed to come off my chest... ;)

The day ended with a nice ACE hacking event on building a showcase with as many Cloud Services as possible. Not everything went smoothly, but I think they did a good job.

I took the PCS Hybrid Applications track for the workshop. It could have been a little more elaborate, but it was nice. I got to play with the upcoming April release of PCS in a VM, and it looked great. If I get the chance I'll post a little bit more on that later on.

Yesterday we went into the city of Valencia for the Fallas festivities. I learned this morning it had something to do with carpenters burning their waste. The whole week there were lots of fireworks; even older people walked around with a box of fireworks and threw firecrackers, something we only see teenagers doing on the day before New Year. And there were processions and funny statue groups.

So that completes my overview I think.  I hope I'll be able to write about some subjects in a little more detail soon.

BPM QuickStart 12c on Windows 10

Fri, 2016-03-18 03:27
As many people probably did, I upgraded my laptop to Windows 10. Actually, after that I even re-installed Windows 10, since I installed an SSD drive (a Samsung EVO 850, which is really fast).

Earlier I wrote about installing the SOA/BPM QuickStart using the silent installer. All goes well, but where in 12.1.3 you can actually create a domain successfully, with 12.2.1 you can't. I don't know why I got it working with 12.1.3, but I found out why it doesn't work in 12.2.1.

It turns out to be a recurring problem that also arose in JDeveloper 11g, and it is caused by Jython not being able to recognize the OS.

If you open the jython-modules.jar that can be found in %FMW_HOME%\wlserver\common\wlst\modules, you'll find a 'Lib' folder with the file javashell.py in it. You can open the jar file using an archiver; I used my favorite tool TotalCommander, with the key combination <CTRL>-<PgDn>.

Copy the file out of the jar, find the '_osTypeMap' part and add the string 'Windows 10' to the 'nt' list:

_osTypeMap = (
    ( "nt", ( 'nt', 'Windows NT', 'Windows NT 4.0', 'WindowsNT',
              'Windows 2000', 'Windows 2003', 'Windows XP', 'Windows CE',
              'Windows Vista', 'Windows Server 2008', 'Windows 7', 'Windows 8',
              'Windows Server 2012', 'Windows 10' )),
    ( "dos", ( 'dos', 'Windows 95', 'Windows 98', 'Windows ME' )),
    ( "mac", ( 'mac', 'MacOS', 'Darwin' )),
    ( "None", ( 'None', )),
)

Just copy it back into the jar file, overwriting the older version. By the way, although I found out earlier that the fault was in the Jython install, the real thanks should go to Christos Vezalis for his great explanation and the actual solution (I did not know what the string should be).

By the way, this same solution works for JDeveloper 12c as well.
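If you'd rather script this patch, for instance as part of an automated install, a small Python sketch along these lines should work as well. The FMW home path below is an assumption, so point it to your own QuickStart home; the original jar is kept as a backup.

# patch_jython_modules.py -- a minimal sketch (assumed FMW home path)
# Adds 'Windows 10' to the 'nt' entry of _osTypeMap in javashell.py inside jython-modules.jar.
import os
import shutil
import zipfile

fmw_home = r'c:\Oracle\JDeveloper\12210_BPMQS'   # assumption: adjust to your own install
jar_path = os.path.join(fmw_home, 'wlserver', 'common', 'wlst', 'modules', 'jython-modules.jar')
backup = jar_path + '.org'
new_jar = jar_path + '.new'

shutil.copy2(jar_path, backup)   # keep the original jar, just in case

with zipfile.ZipFile(backup, 'r') as src, zipfile.ZipFile(new_jar, 'w', zipfile.ZIP_DEFLATED) as dst:
    for item in src.infolist():
        data = src.read(item.filename)
        if item.filename.endswith('javashell.py'):
            text = data.decode('latin-1')
            if "'Windows 10'" not in text:
                # extend the 'nt' tuple; relies on the 'Windows Server 2012' literal being present
                text = text.replace("'Windows Server 2012'", "'Windows Server 2012', 'Windows 10'")
            data = text.encode('latin-1')
        dst.writestr(item, data)

shutil.move(new_jar, jar_path)
print('patched ' + jar_path)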

Automatic install SOA/BPM QuickStart under Linux

Fri, 2016-02-26 07:51
Earlier I wrote about the automatic or silent install of JDeveloper. As a preparation for an automatic install of the complete SOA Suite, I translated this to Linux.

To do so in Linux you'll need to adapt the sudoers file.

Log in as root via the command:
[oracle@darlin-vce-db ~]$ su -
Password:
Last login: Fri Feb 26 06:44:05 EST 2016 on pts/0

Then edit the sudoers file:
[root@darlin-vce-db ~]# vi /etc/sudoers
Uncomment the lines for the Cmnd_Aliases SOFTWARE and SERVICES (remove the hash '#' from the start of the lines):
## Installation and management of software
Cmnd_Alias SOFTWARE = /bin/rpm, /usr/bin/up2date, /usr/bin/yum

## Services
Cmnd_Alias SERVICES = /sbin/service, /sbin/chkconfig

Then add the following two lines to the end of the file:
## Extra rights for oracle to do for instance rpm without password.
oracle ALL= NOPASSWD: SERVICES, SOFTWARE
Save the file (use an exclamation mark in the ':wq!' command, since sudoers is a read-only file).
I put the environment settings in a separate file, called 'jdev12c_env.sh', so you can source it before starting JDeveloper later on, or add these settings to your profile:
#!/bin/bash
echo set Jdeveloper BPM Quickstart 12cR2 environment
export JAVA_HOME=/usr/java/jdk1.8.0_74
export JDEV_HOME=/u01/app/oracle/JDeveloper/12210_BPMQS
export JDEV_USER_HOME_SOA=/home/oracle/JDeveloper/SOA
export JDEV_USER_DIR_SOA=/home/oracle/JDeveloper/SOA
I adapted the response file because of a different ORACLE_HOME path under Linux and renamed it to 'bpmqs1221_silentInstall-lnx.rsp':
[ENGINE]

#DO NOT CHANGE THIS.
Response File Version=1.0.0.0.0

[GENERIC]

#Set this to true if you wish to skip software updates
DECLINE_AUTO_UPDATES=true

#
MOS_USERNAME=

#
MOS_PASSWORD=

#If the Software updates are already downloaded and available on your local system, then specify the path to the directory where these patches are available and set SPECIFY_DOWNLOAD_LOCATION to true
AUTO_UPDATES_LOCATION=

#
SOFTWARE_UPDATES_PROXY_SERVER=

#
SOFTWARE_UPDATES_PROXY_PORT=

#
SOFTWARE_UPDATES_PROXY_USER=

#
SOFTWARE_UPDATES_PROXY_PASSWORD=

#The oracle home location. This can be an existing Oracle Home or a new Oracle Home
ORACLE_HOME=/u01/app/oracle/JDeveloper/12210_BPMQS

As an extra I also created a start script called 'jdev12c.sh':
#!/bin/bash
. $PWD/jdev12c_env.sh
echo start Jdeveloper BPM Quickstart 12cR2
$JDEV_HOME/jdeveloper/jdev/bin/jdev

The actual install script is as follows:
#!/bin/bash
. $PWD/jdev12c_env.sh
echo JAVA_HOME=$JAVA_HOME
if [ ! -d "$JAVA_HOME" ]; then
  # Install jdk
  echo Install jdk 1.8
  sudo rpm -ihv ./jdk-8u74-linux-x64.rpm
else
  echo jdk 1.8 already installed
fi
# Install BPM QS
if [ ! -d "$JDEV_HOME" ]; then
  # Install JDeveloper BPM QuickStart
  echo Install JDeveloper BPM QuickStart 12cR2
  $JAVA_HOME/bin/java -jar fmw_12.2.1.0.0_bpm_quickstart.jar -silent -responseFile $PWD/bpmqs1221_silentInstall-lnx.rsp
else
  echo JDeveloper BPM QuickStart 12cR2 already installed
fi
echo copy jdev.boot to $JDEV_HOME/jdeveloper/jdev/bin
mv $JDEV_HOME/jdeveloper/jdev/bin/jdev.boot $JDEV_HOME/jdeveloper/jdev/bin/jdev.boot.org
cp $PWD/jdev.boot $JDEV_HOME/jdeveloper/jdev/bin/
echo copy JDeveloper BPM QuickStart 12cR2 environment and start scripts to ~/bin
cp $PWD/jdev12c_env.sh ~/bin
cp $PWD/jdev12c.sh ~/bin

As described in the Windows variant of this install, this script copies an adapted version of the jdev.boot file that uses the JDEV_USER_HOME_SOA and JDEV_USER_DIR_SOA environment variables as set in jdev12c_env.sh. It also copies the jdev12c_env.sh and jdev12c.sh scripts to the ~/bin folder (in the user home dir), so after the install you're good to go by running the jdev12c.sh script to start JDeveloper.