Andrejus Baranovski

Blog about Oracle technology

TensorFlow - Getting Started with Docker Container and Jupyter Notebook

I'm studying Machine Learning and would like to share some introductory experience working with TensorFlow. To get started with TensorFlow you need to install it; the easiest way (at least for me) was to run TensorFlow in Docker. Read the installation instructions - Installing TensorFlow.

Once the TensorFlow Docker image is downloaded, I suggest creating the container in detached mode (--detach=true) and publishing a port for the Jupyter UI. Make sure to give the Docker container a meaningful name:

docker run --detach=true --name RedSamuraiTensorFlowUI -it -p 8888:8888 gcr.io/tensorflow/tensorflow

Make sure to start and stop the container with the Docker start/stop commands. Don't execute docker run each time - that creates a new container every time and you will lose your work:

docker start RedSamuraiTensorFlowUI (docker stop RedSamuraiTensorFlowUI)

Once the container is running in detached mode, you can access its logs by executing the docker logs command with the container name:

docker logs -f RedSamuraiTensorFlowUI

At this point you should see output in the Docker container log. Copy the Jupyter UI URL with the token and paste it into the browser (for example: http://localhost:8888/?token=d0f617a4c719c40ea39a3732447d67fd40ff2028bb335823):


This gives you access to the Jupyter UI. It is possible to run TensorFlow Python scripts directly from the command line inside the Docker environment, but it is more convenient to do the same through the UI:


The UI gives an option to create a new terminal session:


The terminal allows you to run Python code from the command line:


Instead of using the command line, it is more convenient to create a new notebook:


The notebook environment lets you type in Python code and execute math calculations. In the example below I multiply two arrays element-wise (1x5, 2x6, 3x7, 4x8) through the TensorFlow library. The result is printed through the TensorFlow session object right below, and a prompt for the next command is displayed - very convenient:
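A minimal sketch of that notebook example, assuming the TensorFlow 1.x API shipped in this Docker image (variable names are mine):

import tensorflow as tf

# Two constant arrays, multiplied element-wise: 1x5, 2x6, 3x7, 4x8
a = tf.constant([1.0, 2.0, 3.0, 4.0])
b = tf.constant([5.0, 6.0, 7.0, 8.0])
result = tf.multiply(a, b)

# Run the graph in a session and print the result: [ 5. 12. 21. 32.]
with tf.Session() as sess:
    print(sess.run(result))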


The Jupyter UI lets you track running notebooks and terminals:


Whatever action you take in the Jupyter UI can be tracked in the Docker container log. The Jupyter UI is a client-side JS application:


To double-check the Docker config - here is the TensorFlow Docker image:


And the Docker container, which can be started/stopped by name (see the commands listed above), without creating a new container on every restart:

How to Setup Node.js and Oracle JET on Oracle Linux

Sun, 2017-11-19 08:35
What if you want to develop Oracle JET on Oracle Linux? Certainly this is possible - both Node.js and Oracle JET run on Oracle Linux or any other Linux distribution. If you follow the Oracle JET Setup Guide, you will see Node.js listed as a prerequisite. But it may not be exactly straightforward to install Node.js on Oracle Linux. Below I will guide you through the steps.

Run this command to install the development tools needed to build native add-ons:

yum install -y gcc-c++ make


Enable Node.js yum repository:

curl -sL https://rpm.nodesource.com/setup_8.x | sudo -E bash -


Install Node.js:

sudo yum install nodejs


Verify that node and npm were installed successfully:
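For example, with the standard version commands (the original screenshot showed similar output):

node -v
npm -v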


Next you can follow the steps described in the Oracle JET setup guide - all straightforward. Install the Oracle JET CLI:

sudo npm -g install @oracle/ojet-cli


Create a new JET application with the OJET utility:

sudo ojet create jetwinapp --template=navdrawer


Run the application with the OJET utility:

sudo ojet serve


The JET application runs on Oracle Linux:

ADF Performance on Docker - Lightning Fast

Fri, 2017-11-17 02:32
ADF performance depends on server processing power. Sometimes ADF is blamed for poor performance, but in most cases the real issue is poor server hardware, bad programming style, or slow response from the DB. The goal of this post is to show how fast an ADF request can execute, and to give a couple of suggestions on how to minimize ADF request time. This applies to an ADF application running in any environment, not only Docker. I'm using an ADF Alta UI based list application with an edit fragment.

Rule number one - enable response compression. Less data is transferred, so the response completes faster thanks to a shorter content download time. See the screenshot below - a JS file is compressed to 87 KB from the original 411 KB. Initial page load in ADF generates around 3 MB of content (on the very first access, when static content is not yet cached on the client side). With compression, the initial 3 MB load shrinks to around 300-400 KB. That's a big difference. In this example the ADF page opens in 1.2 seconds (on par with client-side JS applications, when static content is downloaded on first access):


You can enable content response compression in the WebLogic console (it will be applied to all deployed Web apps). Go to the domain configuration, Web Applications section:


Select the checkbox to enable GZIP compression and provide the list of content types to be compressed:


That's it - content compression is set.
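As an aside, the same can be scoped to a single application in its weblogic.xml deployment descriptor - a hedged sketch, based on my reading of the WLS 12.2.1 schema (verify the element names against the official documentation; the content types are illustrative):

<container-descriptor>
    <gzip-compression>
        <gzip-compression-enabled>true</gzip-compression-enabled>
        <gzip-compression-content-type>text/html</gzip-compression-content-type>
        <gzip-compression-content-type>text/css</gzip-compression-content-type>
        <gzip-compression-content-type>application/javascript</gzip-compression-content-type>
    </gzip-compression>
</container-descriptor>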

When I navigate to the edit fragment, the request executes in 305 ms, thanks to the fast Docker engine (running on Digital Ocean - Oracle ADF on Docker Container) and content response compression: 3.44 KB transferred for 14.49 KB of original content:


Let's try the Save operation. I changed the Hire Date attribute and pressed the Save button. This triggers a Commit operation in ADF, pushes data to ADF BC, and then executes the DML statement with a commit in the DB. All these steps complete in 113 ms.


Don't believe anyone who says ADF is slow. As you can see, an ADF request is fundamentally very fast - of course it can become slow if you add a lot of data fetching and processing logic on top (blame yourself). A client-side JS application would not run faster if it called a backend REST service to save data. The only advantage of a client-side JS application in this case is that it executes the backend REST call asynchronously, while ADF executes requests synchronously. However, it all depends - sometimes asynchronous calls are not suitable for the business logic either.

How come the ADF BC call to the DB completes so fast? For that we need to check the Data Source Connection Delay Time on WLS. In the Docker (Digital Ocean) environment it is remarkably short (that's very good): 66 ms. Check the same on your server (go to Data Source monitoring in the WLS console) - a longer delay time means slower response from the DB and slower ADF performance:


Navigation back to the list runs in 356 ms, with 197.96 KB of content compressed to 10.47 KB. This is very fast - a 350 ms response time is something the user would not notice (almost equal to processing on the client side):


To optimize ADF performance, make sure you are using ChangeEventPolicy = NONE for iterators in Page Definitions:
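For illustration, this is roughly how such an iterator entry looks in the page definition XML (a sketch; the VO and data control names are hypothetical):

<iterator Binds="EmployeesView" RangeSize="25"
          DataControl="AppModuleDataControl"
          id="EmployeesViewIterator"
          ChangeEventPolicy="none"/>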

ADF Performance Story - This Time Developer Was Wrong

Fri, 2017-11-10 12:19
ADF is fast. If an ADF application is slow, most likely this is related to development mistakes. I would like to tell you one story from my ADF tuning experience. Problem description: the ADF application runs fast in DEV, where the DB is small, but slow in TEST/PROD, where the DB is large. Question - what exactly is slow? Answer - the forms are loading slowly. OK, let's get to the story.

The developer decides to fetch data from the DB and iterate over the rows to process them. This is already bad practice - it is much more efficient to process large sets of rows directly in the DB, without fetching them to the middle tier. But let's assume this is a valid use case and we really need to fetch rows. The developer implements fetching using the getAllRowsInRange method:


VO data is loaded and displayed in a table on the UI. The method that iterates through rows is called from a button, so this logic is not executed on initial load. The iterator is set with Range Size = 10:


This means only the first 10 rows are fetched on form load, so it opens fast regardless of DB size:


This is OK, but while testing the row fetching functionality, the developer finds a "bug": instead of all rows, the custom method returns only 10. So the developer decides to implement a fix - change Range Size to -1:


Now all rows are fetched in the custom method and the developer is happy. But there is a small side effect - for some reason the table starts to display all rows. Not good; another fix is needed:


Auto Height Rows = 10 is set to prevent the table from displaying too many rows. But all rows are still fetched, because Range Size = -1:


All these fixes are wrong. The getAllRowsInRange method is not meant for iterating through all rows - it returns only the currently fetched rows. This implementation obviously slows down form load: it fetches all rows from the DB, and if the DB is large, it slows down significantly.

If all you need is to iterate through rows, make sure you don't affect the data displayed/fetched for the UI. Keep Range Size positive:


In the custom method, iterate through the rows by creating a rowset iterator, as sketched below:
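A hedged sketch of that pattern inside a VO or AM Impl class (the VO accessor, attribute name, and iterator name are illustrative; imports from oracle.jbo are assumed):

// Secondary rowset iterator - walks all rows without touching
// the default iterator that backs the UI table
public void processRows() {
    ViewObjectImpl vo = getEmployeesView(); // hypothetical VO accessor
    RowSetIterator iter = vo.createRowSetIterator("processIter");
    try {
        iter.reset();
        while (iter.hasNext()) {
            Row row = iter.next();
            // process the row here, e.g. row.getAttribute("Salary")
        }
    } finally {
        // always close the secondary iterator to release resources
        iter.closeRowSetIterator();
    }
}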


Download sample application - ADFRangeSizeApp.zip.

Essential WebLogic Tuning to Run on Docker and Avoid OOM

Sun, 2017-11-05 09:43
Read my previous post about how to run ADF on Docker - Oracle ADF on Docker Container. The Docker WebLogic image is based on the official Oracle Docker image for FMW infrastructure - OracleFMWInfrastructure. A WebLogic container created from this image runs, but not for long - eventually the JVM process eats up all memory and an OOM (out of memory) exception is thrown. This is a known issue with JVMs running in Docker containers - Running a JVM in a Container Without Getting Killed. The good news: we can switch on WebLogic resource management functionality to prevent OOM errors while running in a Docker container. This functionality is turned on with the special flag -XX:+ResourceManagement. To set this flag we need to update the startWebLogic.sh script - but we probably don't want to rebuild the Docker image. Read below how to achieve this.

First we need to get the startWebLogic.sh script out of the Docker container. Make sure the Docker container is running on your host and execute the Docker copy command:

docker cp RedSamuraiWLS:/u01/oracle/user_projects/domains/InfraDomain/bin/startWebLogic.sh /Users/andrejusbaranovskis/infra/shared

This copies the startWebLogic.sh file from the Docker container to your host system.

Search the startWebLogic.sh script content for the resource management config. By default it is commented out. Set this string for JAVA_OPTIONS - it enables WebLogic resource management and the G1 garbage collector:

JAVA_OPTIONS="-XX:+UnlockCommercialFeatures -XX:+ResourceManagement -XX:+UseG1GC ${SAVE_JAVA_OPTIONS}"

The startWebLogic.sh script contains a comment recommending to enable this option:


Once the JAVA_OPTIONS variable is updated, copy the startWebLogic.sh script back into the Docker container:

docker cp /Users/andrejusbaranovskis/infra/shared/startWebLogic.sh RedSamuraiWLS:/u01/

Enter the Docker container command prompt (in my case user 501 is the root user for the Docker container):

docker exec -u 501 -it RedSamuraiWLS bash

Change file permissions for startWebLogic.sh:

chmod 777 startWebLogic.sh

Enter the Docker container as the oracle user:

docker exec -it RedSamuraiWLS bash

Copy the startWebLogic.sh script from /u01 into the bin folder (overwriting the existing script file):

cp startWebLogic.sh /u01/oracle/user_projects/domains/InfraDomain/bin

Stop the Docker container and run docker commit to create a new image (which includes the change in startWebLogic.sh):

docker commit RedSamuraiWLS abaranovskis/redsamurai-wls:v2

The Docker image is created with the delta change only, which saves space. Run the docker images command to verify that the new image was created successfully:


Run docker push to upload the new image version to the Docker repository. The upload is fast, because only the delta of changes is uploaded:

docker push abaranovskis/redsamurai-wls:v2

You should see the new image version uploaded in the Docker repository:


To run the container online, we can log in to the Digital Ocean console and execute the docker run command (I'm using a container memory limit of -m 4g (4 GB)) - it will pull and run the new image:


Once the Docker container is running, execute the top command in the Digital Ocean console to monitor memory consumption. The Java process memory consumption should not grow if there is no user activity on the WebLogic server:

Oracle ADF on Docker Container

Fri, 2017-11-03 10:56
Want to run Oracle ADF on Docker? This is possible, and I will explain how. If you are new to Docker, it may take a significant amount of time to get started with all the different bits and pieces. I will try to explain all the essential steps, so that you get up to speed quickly.

First of all you need an accessible DB; check my previous post explaining how to run Oracle DB on Docker - Oracle Database Docker Image in Docker Cloud (Digital Ocean). The DB is required to install the RCU schema for a WebLogic installation with JRF files.

I have built my own Oracle Fusion Middleware Docker image using Oracle Docker images - Oracle Fusion Middleware Infrastructure on Docker.

The first step is to build the Oracle JDK (Server JRE) image; this is a prerequisite for building the Oracle Fusion Middleware Docker image. Read through the instructions documented on the Oracle Fusion Middleware Infrastructure on Docker GitHub page. Navigate to the Oracle Java folder (download the Oracle Docker files from the GitHub link mentioned above) and copy the JDK installation file there:


Run this command to create the JDK Docker image:

./build.sh

Command output:


Double-check that the image was created successfully by running the docker images command:


Let's move on to the Oracle FMW image creation. Navigate to the Oracle FMW folder and copy the FMW infrastructure installation file there (I'm installing 12.2.1.3):


Move one folder up and run this command to build the Oracle FMW image:

./buildDockerImage.sh -s -v 12.2.1.3

The -s flag skips checksum verification for the installation file. You should run the command from this folder:


You will see long log output for this command:


It installs WLS into the Docker image:


Run the docker images command to verify that the image was created successfully:


In the next step we will create the FMW domain and extend it with ADF support. Before that, we need to make sure the DB details are set correctly, to be able to install the RCU schema. Oracle provides an infraDomain file with DB and WLS properties; make sure to set the correct DB details. If the properties are not correct, RCU creation will fail:


Execute the docker run command to start up the WLS Docker container. During the first startup it will create and extend the WLS domain with ADF support:

docker run -d -p 7001:7001 --name RedSamuraiWLS --env-file ./infraDomain.env.list oracle/fmw-infrastructure:12.2.1.3

The -d flag means the container runs in detached mode, so we get back to the command prompt. A port and a name are specified, along with the environment properties file. Make sure to reference the FMW image created in the step above. Once control returns to the prompt, run this docker command to check the container status (the -a flag shows all containers):

docker ps -a

The container should be in the running state. The first startup takes longer, because it needs to set up and extend the WLS domain:


Once the domain is extended, you will see WebLogic starting:


Finally, WebLogic should be in the Running state:


Run the docker ps -a command again to verify the container state - it should be up and running:


Once the WLS machine is up, you can navigate to Enterprise Manager through the URL from outside the Docker container, for example from your host. Log in to EM and you will see the Admin server is up, but the Managed Server is down. There is a way to start the Managed Server too, but if you want to run ADF apps in a DEV environment, realistically the Admin server is more than enough for deployment:


Simply delete the Managed Server and cluster (this can be done from EM), keeping only the Admin Server:


I have deployed a sample ADF application:


This application is based on ADF BC; a data source is defined too:


The ADF application runs from WebLogic on Docker:


Now let's see how to push the newly created container to the Docker registry.

First we need to create a new Docker image from the Docker container. This can be done with the docker commit command (pointing to the container ID and specifying the Docker repository name and tag):

docker commit da03e52b42a2 abaranovskis/redsamurai-wls:v1

Run the docker images command to verify the new image was created successfully. Next run docker login to authenticate with the Docker repository, then docker push to write the image to the repository:

docker push abaranovskis/redsamurai-wls:v1

Command execution sequence:


The pushed image should appear in the Docker repository:


Once the image is in the online Docker repository, we can start up an online Docker container, so that WLS is accessible online. This can be done through the command line or the Docker Cloud UI. You can create a new container by referencing the image from the Docker repository:


Our WLS Docker container with ADF support runs on Digital Ocean:


Logs are accessible from the Docker Cloud UI, and you can see the server status:

Oracle Database Docker Image in Docker Cloud (Digital Ocean)

Fri, 2017-10-27 17:41
Red Samurai is using Docker for our internal development environment. We are running Oracle Database and WebLogic server (with ADF support) on Docker Cloud. It is much easier to manage Docker containers than to maintain our own server.

In this post I will describe how to set up Oracle Database in Docker Cloud. Of course you could run the Docker container locally on your machine, but the main power of Docker comes with the option to run the container in the Cloud.

If you are new to Docker and Docker Cloud, it is very easy to get lost - there are so many blogs and articles that it is hard to know where to start. I would recommend starting with the Get Started, Part 1: Orientation and setup tutorial. You will learn most of the important Docker commands and push a Docker image to the registry. Next check this tutorial - Create your first service. It describes how to create a Cloud node through the Docker Cloud interface. Keep in mind - Docker Cloud itself doesn't run Docker containers; it provides connectors to various Cloud providers - Azure, Amazon, Digital Ocean, etc. I did some research, and it looks like Digital Ocean is the best option to run a Docker container - the pricing model is straightforward, and the registration and management processes are simple and clear.

The first step is to create a Docker node in Docker Cloud (you must be connected to a Cloud provider). Here is the info for our Digital Ocean node instance:


The node creation process is straightforward. Once the node is created, it becomes part of a node cluster:


The Docker Cloud node creation process triggers droplet (Cloud instance) creation in Digital Ocean. This takes around 5 minutes in total.

Digital Ocean provides a very nice UI to manage and monitor the droplet:


Graphs to monitor droplet performance:


Digital Ocean provides an option to run the administration console directly in the browser, without even using a separate SSH connection (very convenient for quick administration).

I found the easiest way to create the Oracle DB Docker container in the Digital Ocean droplet is to execute the docker run command directly in the droplet administration console. I'm using the official Oracle Database Enterprise Edition Docker image.

With the docker run command, I can pull and run the Docker image (referencing the official Oracle DB Docker registry), where --detach=true means the container runs without blocking the console:

docker run --detach=true --name RedSamuraiDB -p 1521:1521 -p 5500:5500 -e ORACLE_SID=RedSamuraiDB -e ORACLE_PDB=ORCLPDB1 store/oracle/database-enterprise:12.2.0.1

The command is executed from the Digital Ocean droplet console:


The image is downloaded directly from the Docker Store into the Digital Ocean droplet:


With the docker run command, the image is not only downloaded but the container is also started. We can execute docker ps -l to see if the container started successfully:


Finally we need to reset the default password (Oradoc_db1) set for the Oracle DB Docker container. This can be done by logging into sqlplus from the Digital Ocean droplet console. First enter the Docker container prompt by executing (using the Docker container name):

docker exec -it RedSamuraiDB bash

Then log into sqlplus as SYS and execute:

ALTER USER SYS IDENTIFIED BY newpass;
ALTER USER SYSTEM IDENTIFIED BY newpass;


The DB is accessible from outside:


This shows the power of Docker - an Oracle DB set up in a few minutes.

Takeaways from OOW'17

Sun, 2017-10-08 09:28
Let's summarise the important points from the OOW'17 conference. This year's conference was a breakthrough for Oracle JET - most of the Oracle Cloud UIs were implemented with Oracle JET. Oracle ADF is used for Oracle SaaS (Fusion Middleware) back-office applications, same as before. There is often a misunderstanding in the JET vs. ADF comparison. Keep in mind - JET is only a UI layer and can be compared with ADF Faces only, while ADF is much more than UI - ADF BC Model, ADF Task Flows, and ADF Bindings.

1. Use JET for front-end apps with relatively low complexity but strong requirements for performance and scalability. The back-end can be implemented with ADF BC REST. Typically it takes longer to implement a JET UI compared to ADF Faces (no drag-and-drop support, unless using Visual Builder Cloud Service for JET)

2. Use ADF Faces for large, complex back-office applications. Some degree of mixing between ADF Faces and JET is possible. The back-end should be done in ADF BC

ADF is a stable technology, and at OOW Oracle focuses on advertising new things. The recommended focus for ADF developers was around JCS, DevCS, JET, and VBCS:


There was one session about moving an existing ADF app into the cloud with JCS, and a session about ADF BC REST - no new features, mainly showing a customer system implementation with current features.

JET open source contribution was announced, along with new JET offline support - the persistence toolkit (to be available through NPM in the next couple of weeks):


Oracle explained why handling the offline use case in JET is important:


The persistence toolkit will run in JET hybrid, web, and desktop apps (yes, you can run JET as a desktop app with a third-party Electron plugin):


Here you can read more detailed info about the persistence toolkit design principles:


JET supports the Fetch API (used by the persistence toolkit):


An architecture explanation for the JET persistence toolkit:


Another hot topic related to JET - Web Components. Slots are supported, which allows building advanced Web Components with dedicated facets (in ADF Faces terminology):


OOW'17 was packed with chatbot demos. Unfortunately there is no trial access for the chatbot cloud yet. There was a good session about chatbots from Lyudmil Pelov - he explained how the Oracle chatbot service and its language processing work:


I was impressed with the Oracle Visual Builder Cloud Service progress this year. I really liked this service, and I think it will become a JET IDE development tool. It is not focused on business users only, but is targeted at JET developers too. It comes with an embedded DB, or you can call external services. Next versions will allow direct JET code development, combined with drag and drop:


Develop JET in VBCS using drag and drop where possible and code additional logic in JS/HTML:


VBCS is no longer targeted at business users only - they target it at coders (us) too. That's news I like:



RDK UI is now available for ADF, JET, and MAF. The ADF RDK was updated to a newer version. The ADF and JET RDK UI look is identical.

New ADF RDK UI:


JET RDK UI:


Machine Learning was another big thing at OOW and JavaOne. Nice to see that the neural network topics I studied at university are becoming hype now:


JVM garbage collection tuning: it is better for objects to die young. In ADF this means it is better to use Backing Bean Scope where possible - a short-scope bean is destroyed after the request, which allows the VM to reclaim memory:

ADF BC REST Service from ADF Library JAR

Sun, 2017-09-17 11:39
I had an assignment to enable ADF BC REST for an existing ADF application developed with multiple ADF libraries. ADF BC REST was supposed to be enabled for one of the ADF libraries and then accessed through the Master application. ADF BC REST samples usually show how to enable REST inside the Master application itself. I was curious whether it would work to enable ADF BC REST and package it into an ADF library consumed by the Master application. It worked, and I would like to share the sample app.

The sample app is based on two JDEV projects - the library and master applications. The library application contains the ADF BC REST implementation for the Employees VO:


The ADF Library is generated out of the ViewController project; ADF BC REST is packaged through a dependency:


The Model project (enabled with ADF BC support) in the Master application imports the ADF library:


Run the Master application by running the REST project. There is an improvement in JDEV 12.2.1.3 compared to 12.2.1.2 - re-deployment works even if a dependent ADF library was changed (a very useful fix that saves a lot of time during development):


REST response is received:


Download sample application - adfbcrest.zip.

Oracle JET List Implementation based on WorkBetter Alta UI

Mon, 2017-09-11 14:04
Over the weekend I was playing with the JET WorkBetter sample app. I thought it would be nice to extract the People list UI into my own app, so that this UI pattern is easy to reuse in our own JET applications. I achieved my goal and would like to share the WorkBetter Alta UI list sample with you.

This is how my own JET list UI looks, based on the sample app:


The header block is implemented in a separate div containing the title and a toolbar for the buttons. The list content is rendered from a template located in its own div:


The template renders the JET list component. The list in turn is based on an item template:


The item template defines the list row content and structure through a set of JET and custom CSS styles. Data attributes are referenced directly from the template structure:


For this sample I'm using a basic set of mock data defined in a JS module, but obviously it could fetch data from a REST service too:


A set of custom styles (created based on the WorkBetter example) is located in the override.css file within the JET application structure:


Sample JET list application can be downloaded from GitHub - jetlistapp.

To build this application I was using JET CLI commands. I found especially useful the options to strip the application down to sources only and then restore the JET environment - this helps with sharing code.

Here is the sequence of JET CLI commands I was using:

ojet create jetlistapp --template=navdrawer (creates JET application)

ojet serve (runs JET application)

ojet strip (removes all files except your sources)

ojet restore (brings back JET environment, after application was stripped)

ojet serve (you can run application again, after it was restored)

Oracle JET Simple Table with Fake JSON Server

Thu, 2017-09-07 14:13
What if you want to build a JET UI, but no back-end REST service is ready yet? You could use static JSON data inside the JET app and load it from a JSON file. This works, but the data retrieval logic will need to change once the REST service is ready. This is not productive. Ideally you would have a fake REST service to simulate REST calls from the JET UI until the real REST back-end is ready.

This can be achieved with JSON Server. It provides a full fake REST API to be used by a front-end client. It is extremely easy to set up JSON Server with NPM. Run npm install to set it up:

npm install -g json-server


Create a simple JSON file with the data structure - this will be served by JSON Server:
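A hypothetical db.json, just to show the shape (the entity and attributes are illustrative - each top-level key becomes a REST endpoint, here /employees):

{
  "employees": [
    { "id": 1, "name": "John King", "job": "President" },
    { "id": 2, "name": "Neena Kochhar", "job": "VP" }
  ]
}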


Run JSON Server with this command, pointing to the JSON file:

json-server --watch db.json


You should see the REST endpoint URL displayed in the log. Use this URL in the JET application to define a collection:


A JET CollectionTableDataSource can be created on top of the collection backed by REST:
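A minimal sketch, assuming the hypothetical db.json above and json-server's default port 3000 (model and variable names are illustrative):

// Model and collection backed by the fake REST endpoint
var Employee = oj.Model.extend({
    urlRoot: 'http://localhost:3000/employees',
    idAttribute: 'id'
});
var employees = new oj.Collection(null, {
    url: 'http://localhost:3000/employees',
    model: new Employee()
});
// Data source consumed by the JET table component
self.datasource = new oj.CollectionTableDataSource(employees);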


The JET table UI component renders data directly from the CollectionTableDataSource:


The table UI is rendered based on the REST call:


To verify the REST call, go to the developer console and check the network requests. You should see a request executed against the fake JSON Server:


The response is visible too:


Download JET sample application with JSON file for JSON Server from my GitHub repository - jetsimplecrud.

ADF 12c Table CRUD Fix for Auto Focus

Sun, 2017-09-03 05:57
I had a post about improving user data entry for an ADF table with auto focus for a new row - Improving ADF UI Table CRUD Functionality with Auto Focus. If you follow the comments thread for that post, you will see the described approach doesn't work exactly as it should in ADF 12c (focus is set for the new row but is lost after tab navigation - it should move focus to another column). Thanks to the community, we have a simple fix for this issue; read the OTN Forum thread - Set Focus on CreateINsert row in ADF Table. I hope Oracle will fix this functionality in the next ADF versions (currently they say it is expected behaviour, and I don't agree). But for now, the fix does the job.

The sample app was updated to 12.2.1.3 and contains the fixes described on the OTN Forum - ADFTableFocusApp_v2.zip. Focus is set on the first column of the new row to avoid an extra mouse click:


After Tab is pressed, focus moves to the next column:


The fix simulates row selection through the ADF JS API. After focus is set on the first column in the current row, we execute row selection based on the active row key. This allows ADF Faces to move focus to the next column on Tab, because it knows the currently selected row on the client side:


Make sure to set the active row key in the bean method, in the same place where the focus ID is calculated (otherwise there will be a JS error):

ADF Client Side Validation with JavaScript

Wed, 2017-08-30 09:25
In my previous post I explained how to use a JS client-side formatter for ADF Faces input components - ADF Goes Client Side - UI Performance Boost with JavaScript. The same principle can be applied to client-side validation. Most likely you are not going to implement complex validation rules on the client side, but for simple checks it is perfect. There is no roundtrip to the server, which improves the user experience.

The client-side validation check runs instantly when the value is changed, and there is no need to set Auto Submit = true to send the value to the server. This rule checks that the value is in a certain range; similar simple checks can be implemented on the client side (I'm sure you will find similar rules in your project):


If the value is in range, the client-side formatter is applied as expected:


The custom JSF converter is registered in faces-config.xml and can be added to the input field either manually or through the wizard:


The converter is referenced by the ID defined in faces-config.xml:


Definition in faces-config.xml:


The custom converter class defines the range values; these properties are consumed in the JS logic:


The range check is done in JS. If validation fails, we display a JSF error message. It all happens on the client side, without a request to the server:
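A hedged sketch of such a range check in a Trinidad-style client converter (the bounds are assumed to come from the properties passed by the converter class; names are illustrative):

RangeConverter.prototype.getAsObject = function (value, label) {
  var parsed = parseFloat(value);
  // Raise a standard ADF error message if the value is out of range
  if (isNaN(parsed) || parsed < this._minValue || parsed > this._maxValue) {
    var msg = new TrFacesMessage(
        "Validation failed",
        label + " must be between " + this._minValue + " and " + this._maxValue,
        TrFacesMessage.SEVERITY_ERROR);
    throw new TrConverterException(msg);
  }
  return parsed;
};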


Download sample application with client side validation logic - ADFFormattingApp_v2.zip.

Oracle Java Cloud Service - Scaling and Cluster Setup for ADF

Wed, 2017-08-16 12:10
The last couple of weeks I was busy preparing for my OOW'17 session about estimating Java Cloud Service performance for an ADF application. I was running stress tests against various JCS instance configurations to create a performance estimation methodology. I will describe the methodology at OOW; here I will list the key steps required to scale up a JCS instance.

Let's assume you are running a single cloud node with 1 CPU and 7.5 GB RAM. This node contains the WLS admin and managed servers:


To scale up the cloud node, simply invoke the Scale Up command from the menu. Select a new compute shape and confirm the scale up operation:


Scale up completed:


Now go to the WebLogic console and update the Managed Server startup parameters to adjust the heap size to a higher value:
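For example (the values are illustrative and should match your compute shape), the server start Arguments field could contain:

-Xms4096m -Xmx4096m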


Managed server memory can be increased only up to a certain amount. If more resources are needed, at some point you will have to create multiple managed servers and connect them into a cluster. This requires defining a Load Balancer instance (to have a single entry point to the cluster):


Each cluster node must run on a dedicated cloud node. If the cluster is based on two managed servers, there must be two cloud nodes:


Managed servers from the cloud nodes can be connected into a single cluster; this can be done in the WebLogic console:


Once the cluster is defined, deployment becomes easy - you can deploy the ADF application to the cluster and it will be propagated to all nodes:


The traffic director running in the load balancer instance is automatically configured to route traffic to the cluster nodes:

My Blog Samples Download Repository (For Samples Before 2014 January)

Sat, 2017-08-12 08:44
I have been asked where to download my blog sample applications posted before January 2014. If you try to download such a sample, you will get a "sample not found" error. That's because Google discontinued support for the Google Code repository.

However, you can still download all my samples posted before January 2014 from the Google Code archive. Go to the archive URL - Google Code Archive for jdevsamples - and browse all the old samples by date:


All new samples (after January 2014) are hosted on Google Drive or GitHub and are accessible directly from the blog by URL.

Oracle JET Busy Context API to Control Asynchronous REST Calls

Sat, 2017-07-29 10:59
I have received feedback from users working with JET UIs: it is sometimes not obvious that an action button was pressed, so users tend to press the same button again very quickly, which leads to parallel REST calls executing at the same time. In JET a REST call executes asynchronously, which makes the user believe the action completed instantly when the button was pressed. However, the REST call may still be running in the background while the user tries to call the same service again. In most cases this behaviour is fine, but there are use cases where we want to block the action button until the REST response is received (while the request executes, the button is disabled - this gives the user visual feedback that the action is still executing). JET provides the Busy Context API to handle asynchronous REST calls in a synchronous way.

I will describe how to apply the Busy Context API in your JET application. Take a look at my sample app (JET + ADF BC REST) available on GitHub - JETCRUD.

When you run the sample app, go to the Customers tab and navigate to the edit screen. There you will find the Save button, which is enabled:


The Save button calls the saveCustomer() JS function. The JET Busy Context is established in this method before making the REST call, and is attached to the Save button. If there are no busy states in the context, the isReady() function returns true and we can register a busy state. After the busy state is registered, the REST call can be made. If the saveCustomer() JS function is called again before the REST call completes, isReady() returns false and no REST call is made. When the busy state is created, we update an observable variable, which changes the disabled property of the button:
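Put together, a hedged sketch of this flow (function, variable, and model names are illustrative; the Busy Context calls follow the JET API of that time):

self.disableSave = ko.observable(false);

self.saveCustomer = function () {
    var busyContext = oj.Context.getPageContext().getBusyContext();
    // Ignore the click if a previous save is still in progress
    if (!busyContext.isReady()) {
        return;
    }
    // Register a busy state; addBusyState returns a resolve function
    var resolve = busyContext.addBusyState({description: 'saving customer'});
    self.disableSave(true);

    // 'customer' is the oj.Model instance being edited (illustrative)
    customer.save(null, {
        success: function () {
            resolve(); // REST call done - clear the busy state
        },
        error: function () {
            resolve(); // clear it on failure too, or the button stays disabled
        }
    });

    // Re-enable the button once the context reports ready again
    busyContext.whenReady().then(function () {
        self.disableSave(false);
    });
};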


Data in one of the fields is changed and the user presses the Save button to execute the REST call - the button becomes disabled:


The button stays disabled until the REST call response is received. Of course, when the REST service is fast you will not even notice it. But if the REST service call takes a second or so, you will see the disabled button while the action is busy. After the REST response is received, and if there are no errors, the success callback is executed. We call the resolve() function there, which removes the busy state:


We use the promise returned by the whenReady() function to react once the context is ready again. This step updates the button's visual state back to enabled:


The Save button becomes enabled:


An observable variable is bound to the UI button's disabled property. This is how the visual state is controlled from JS:


Don't forget to call resolve() in the error callback too; otherwise the button will stay disabled if the REST call fails:

ADF Goes Client Side - UI Performance Boost with JavaScript

Sat, 2017-07-22 10:48
If you would like to boost ADF UI performance, you should look into the client-side validation and formatting options available in ADF UI. Today I will describe how to implement a client-side converter that formats a number value on the client side, without making a request to the server. The same approach can be used to implement client-side validators. You can raise an error message and it will be attached to the UI field just like any standard ADF error message. While this approach has long been documented in the Oracle ADF developer guide - How To Create Client Side Converter - it is not well known and not often used.

The client-side converter is attached to the ADF UI field through a JSF tag pointing to the custom converter ID (make sure autoSubmit=false is set - we don't want a request to the server on value change):


The custom converter is defined in the Faces configuration file and points to the custom converter class:


The converter class is responsible for loading the JavaScript file where the number formatting logic is implemented. We also have an option to pass initialization parameters:


Example of the client-side converter logic (to format numbers) in JS:
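A hedged reconstruction of what such a converter can look like, using the Trinidad client API the guide describes (the formatting logic and names are illustrative):

function NumberFormatConverter() {
}

NumberFormatConverter.prototype = new TrConverter();

// Format the model value for display - runs entirely on the client
NumberFormatConverter.prototype.getAsString = function (value, label) {
  return Number(value).toFixed(2);
};

// Parse the user-entered string back to a number; throw a converter
// exception to show a standard ADF error attached to the field
NumberFormatConverter.prototype.getAsObject = function (value, label) {
  var parsed = parseFloat(value);
  if (isNaN(parsed)) {
    var msg = new TrFacesMessage(
        "Conversion failed",
        label + ": not a valid number",
        TrFacesMessage.SEVERITY_ERROR);
    throw new TrConverterException(msg);
  }
  return parsed;
};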


Formatting happens on the client; no request is made to the server. The user enters a value and navigates out of the field - the value is formatted:


If the fractional part is incorrect, the error raised from the converter is displayed like any other error in ADF - attached to the field:


If the value is invalid, an error is displayed too; this simple validation error comes from the JS converter. The input is processed on the client, with no call to the server:


A server-side formatter in ADF BC is still required. When data is fetched from the DB, the ADF BC server-side formatter is applied to transform the data into the correct format (this happens at fetch time and doesn't affect end-user performance):


The custom number formatter implemented in ADF BC (read more about it in Generic BigDecimal Formatter in ADF 12.2.1.1):


Download sample application - ADFFormattingApp.zip.

ADF BC - Create View Object From Query with Custom Implementation Class

Sun, 2017-07-16 12:18
I had a request to explain how to create a dynamic ADF BC VO from a SQL statement and set a custom VO implementation class for the newly created VO instance. The custom VO implementation class extends the ADF BC ViewObjectImpl and overrides a super method:


There is a method createViewObjectFromQueryStmt; in previous ADF versions this method had two parameters - the VO instance name and the SQL statement. In current ADF 12c there is a second signature of the same method, with an option to specify the VO implementation class name. A dynamic VO from SQL with a VO implementation class:
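A hedged sketch of the call from an AM Impl method (the names are illustrative, and the exact argument order of the three-argument overload is an assumption - verify it against the 12c ApplicationModuleImpl Javadoc):

// Dynamic VO from SQL with a custom impl class (extends ViewObjectImpl);
// NOTE: the argument order below is an assumption - check the Javadoc
ViewObject vo = createViewObjectFromQueryStmt(
        "EmployeesDynVO",                 // VO instance name
        "model.CustomViewObjectImpl",     // custom VO implementation class
        "SELECT EMPLOYEE_ID, FIRST_NAME FROM EMPLOYEES");
vo.executeQuery();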


ADF BC custom methods can be tested with the ADF BC tester:


The overridden method from the custom VO implementation class is called:


Download sample application - ADFVOFromSQLApp.zip.

ADF 12c BC Proxy User DB Connection and Save Point Error

Mon, 2017-07-10 14:35
If you are modernising an Oracle Forms system, there is a high chance you need to rely on a DB proxy connection. Read more about it in my previous post for ADF 11g - Extending Application Module for ADF BC Proxy User DB Connection. It works the same way in ADF 12c, but there is an issue with DB error handling when the DB proxy connection is on. The DB error is propagated to ADF but is substituted by a savepoint error (as a result, the user does not see the original error from the DB). It seems to be related to the JDBC driver in 12c. The workaround is to override the ADF SQL builder class and disable savepoint error propagation (there might be better ways to work around it).

The proxy connection is established from the prepareSession method in a generic AM Impl class:
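A hedged sketch of how such a prepareSession override can look (how the JDBC connection is obtained and where the user name comes from are assumptions - adapt to your security setup):

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Properties;
import oracle.adf.share.ADFContext;
import oracle.jbo.JboException;
import oracle.jbo.Session;
import oracle.jbo.server.ApplicationModuleImpl;
import oracle.jdbc.OracleConnection;

public class CustomAppModuleImpl extends ApplicationModuleImpl {
    @Override
    protected void prepareSession(Session session) {
        super.prepareSession(session);
        try {
            // Obtain the JDBC connection behind the current DB transaction
            PreparedStatement st =
                getDBTransaction().createPreparedStatement("commit", 0);
            OracleConnection conn = (OracleConnection) st.getConnection();
            st.close();
            // Open a proxy session for the current application user
            Properties props = new Properties();
            props.put(OracleConnection.PROXY_USER_NAME,
                ADFContext.getCurrent().getSecurityContext().getUserName());
            conn.openProxySession(OracleConnection.PROXYTYPE_USER_NAME, props);
        } catch (SQLException e) {
            throw new JboException(e.getMessage());
        }
    }
}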


If I change the salary value to a negative number and save the data, a DB constraint error fires (negative values are not allowed). Unfortunately, the end user does not see that error - he gets a message about a failed savepoint:


Workaround - we can disable savepoint error propagation. Override the SQL Builder class and add a try/catch block in the rollbackToSavepoint method. If an error happens, do nothing:


You must register the SQL Builder class with the AM. Add the jbo.SQLBuilderClass property in bc4j.xcfg, pointing to the class:


You should be able to see DB errors after this change is applied:


However, there is one drawback of this workaround to keep in mind. When data is posted to the DB, ADF executes a lock statement. If the update fails, ADF would normally roll back to the savepoint and the lock would be removed. But not in the DB proxy case - the rollback to the savepoint fails, which means the lock stays:


If the user fixes the data and tries to save again, a lock error is returned:


Error during lock:


To bypass the lock issue, you should enable DB pooling for the AM instance. In this case, after each request the DB connection is returned to the pool and the lock is released automatically:


Download sample application - AMExtendApp_v3.zip.

Working with Location and Permissions in JET Hybrid

Thu, 2017-07-06 00:35
What if you want to access mobile device location data from a JET Hybrid application? This can be achieved with the Cordova Geolocation plugin. But you want it done nicely, making sure the application is granted permission to access location information. Use the Cordova Permissions plugin for that.

You can add a Cordova plugin to a JET app by executing this command (with the plugin name appended):

cordova plugin add <plugin-name>

If this command doesn't work for any reason, you can add the plugin information directly into the config.xml file (check Geertjan's post about the same - Plugging into Devices with Oracle JET on Cordova (Part 1)):



In the JS function, before calling the location API, we call the permissions API to check whether the app has already been granted permission to read location data. In the hasPermission callback, in case of success, the location data is accessed. If there is no permission, a permission request is sent. If the request is satisfied, the location is accessed (and permission is granted at the same time):
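A hedged sketch of that flow (the callback shapes follow the Cordova geolocation and Android permissions plugins of that time; function names are illustrative):

function getLocation() {
    var permissions = cordova.plugins.permissions;
    // Check first - if permission is already granted, read the position
    permissions.hasPermission(permissions.ACCESS_FINE_LOCATION, function (status) {
        if (status.hasPermission) {
            readPosition();
        } else {
            // Ask the user; if granted, read the position
            permissions.requestPermission(permissions.ACCESS_FINE_LOCATION,
                function (s) { if (s.hasPermission) { readPosition(); } },
                function () { console.log('Permission not granted'); });
        }
    }, null);
}

function readPosition() {
    navigator.geolocation.getCurrentPosition(function (position) {
        console.log(position.coords.latitude + ', ' + position.coords.longitude);
    }, function (error) {
        console.log('Location error: ' + error.message);
    });
}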


The location data is retrieved through a callback:


This is how it works. On the very first location access, when permission is not yet granted, we request it through the permissions API:


When permission is granted, the location is displayed:


Download sample application from GitHub repository - rslocationapp.
