Andrejus Baranovski

Blog about Oracle technology

Shredding and Querying with Oracle Offline Persistence in JET

Sat, 2018-02-17 14:44
I think offline functionality is set to become a major trend. It's great that Oracle already provides a solution for offline - the Oracle Offline Persistence Toolkit. This is my second post related to offline support; read the previous post - Oracle Offline Persistence Toolkit - Simple GET Response Example with JET. There I tested and explained with a sample app how handling a simple GET response offline works. Today I would like to go one step further and check how to filter offline data - shredding and querying offline.

The sample app fetches a list of employees with the Get Employees button. It shows online/offline status - see the icon in the top right corner. Here we are online and the GET response has been cached by the persistence toolkit:


We can test offline behaviour easily through Chrome Developer Tools - turn on Offline mode. By the way, take a look at the Initiator field for the GET request - it comes from the Oracle Offline Persistence Toolkit. As I mentioned in my previous post, once the persistence toolkit is enabled, all REST calls go through the toolkit - this is how it is able to cache response data:


While offline, click the Get Employees button - you should see data returned from the cache. Did you notice - the icon in the top right corner changed to indicate we are offline:


Ok, now we will see how the shredding mechanism works (read more about it on GitHub). While offline, we can search a subset of the cached data. Search By Name does exactly that - it gets the entry for Lex from the cache:


Switch back online and call the same action again, but with a different name - the REST call will be invoked against the back-end server as expected. Again, this is transparent to the JET developer: no need to worry whether the app is online or offline, the same REST request is made in both cases:


Let's take a quick look at the implementation (the complete example is available in my GitHub repository).

The online/offline status icon is controlled by an observable variable:


It is very simple to determine the online/offline state. We need to add event listeners for the online/offline events and reset the observable variable accordingly:
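The original code screenshot is not preserved; here is a minimal sketch of the listener wiring. In the browser the listeners would attach to `window` and flip a Knockout observable - below, a plain `EventTarget` and a variable stand in so the idea is self-contained (all names are mine, not from the sample app):

```javascript
// `connectionTarget` stands in for `window`, `isOnline` for ko.observable(true)
const connectionTarget = new EventTarget();

let isOnline = true;

// reset the flag whenever the connection state changes
connectionTarget.addEventListener('online', () => { isOnline = true; });
connectionTarget.addEventListener('offline', () => { isOnline = false; });

// simulate going offline and back online
connectionTarget.dispatchEvent(new Event('offline'));
console.log(isOnline); // false
connectionTarget.dispatchEvent(new Event('online'));
console.log(isOnline); // true
```

In the real app the UI icon binding simply reads the observable, so the icon updates as soon as either event fires.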


The persistence toolkit supports Simple and Oracle shredder/query handlers. I'm using ADF BC REST for the back-end, so my choice is oracleRestJsonShredding and oracleRestQueryHandler. The Oracle shredder understands the REST structure returned by ADF BC REST. The Oracle query handler supports ADF BC REST filtering parameters for offline filtering - this allows using the same query format both online and offline. I was happy to read that the Oracle query handler explicitly supports ADF BC REST - queryHandlers:
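Since the code screenshot is missing, here is a configuration sketch of the registration, based on the toolkit readme - treat the store name, key attribute and scope as illustrative placeholders, not values from the sample app:

```javascript
// AMD-style registration sketch: wire the Oracle shredder and the Oracle
// query handler into the default response proxy for the /Employees endpoint
define(['persist/persistenceManager', 'persist/defaultResponseProxy',
        'persist/oracleRestJsonShredding', 'persist/queryHandlers'],
  function (persistenceManager, defaultResponseProxy, oracleRestJsonShredding, queryHandlers) {

    persistenceManager.init().then(function () {
      persistenceManager.register({ scope: '/Employees' }).then(function (registration) {
        var responseProxy = defaultResponseProxy.getResponseProxy({
          jsonProcessor: {
            shredder: oracleRestJsonShredding.getShredder('emp', 'EmployeeId'),
            unshredder: oracleRestJsonShredding.getUnshredder()
          },
          queryHandler: queryHandlers.getOracleRestQueryHandler('emp')
        });
        registration.addEventListener('fetch', responseProxy.getFetchEventListener());
      });
    });
  });
```

With this in place, a `?q=FirstName=Lex`-style ADF BC REST query is answered by the query handler from the shredded local store when offline.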


Same REST call with filtering is executed online and offline:

Oracle JET Composite Components - Manual for JET Coder

Wed, 2018-02-07 07:20
JET Composite Components are useful not only for building UI widgets, but also for grouping and simplifying JET code. In this post, I will show how to wrap a JET table into a composite component and use all the essential features: properties, methods, events and slots.

Sample app code is available on GitHub. The JET table is wrapped into a composite component, which comes with a slot for toolbar buttons:


What is the benefit of wrapping a component such as a JET table into your own composite? To name a few:

1. Code encapsulation. Complex functionality that requires multiple lines of HTML and JS code resides in the composite component
2. Maintenance and migration. It is easier to apply JET-specific changes in a single place
3. Faster development. There are fewer steps to repeat and less code to write when using the shorter definition of the wrapper composite component

The sample application implements a table-redsam component for the table UI you can see above. Here is a component usage example - very short and clean:
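The usage screenshot did not survive; a hypothetical sketch of what such a tag looks like is below - the attribute and binding names are illustrative, not copied from the sample app:

```html
<!-- consumer page: short, declarative definition of a full table -->
<table-redsam id="empTable"
              rest-endpoint="[[restEndpoint]]"
              key-attributes="EmployeeId"
              page-size="10"
              columns-array="[[columnArray]]">
  <oj-button slot="toolbar">Call Method</oj-button>
</table-redsam>
```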


All the properties specific to a given table are initialised in the module. The developer should provide a REST endpoint, key values, pagination size and column structure. The rest happens inside the composite component and is hidden from the developer who wants to implement a table:


Properties

We should take a look at the array type property. Such a property allows passing an array into the component - either an array of data to be displayed or an array of metadata to help with component rendering. In our case we pass an array of metadata, which helps to render table columns. The array type property is based on two attributes - Header Text and Field. Properties are defined in the composite component JSON file:
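The JSON screenshot is missing; here is a sketch of what such a property section could look like - the property names are illustrative, only the shape follows the JET composite metadata format:

```json
{
  "properties": {
    "restEndpoint": { "type": "string" },
    "keyAttributes": { "type": "string" },
    "pageSize": { "type": "number", "value": 10 },
    "columnsArray": {
      "type": "Array<object>",
      "description": "Column metadata - headerText and field per column"
    }
  }
}
```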


Properties are retrieved inside the component and assigned to local variables:


This is the table implementation inside the component - columns are initialised from the component property:


Slots

A slot defines a placeholder where the developer using the composite component can add additional elements. The slot is defined in the component JSON definition file:


To define a slot, the JET slot component should be placed inside the composite. You can control the layout and the location where the slot will be rendered:


In our case, we use the slot for table toolbar buttons. These buttons are added later, when the developer uses the composite. To place a button into the slot, put it inside the composite component tag and assign the defined slot name to the button. This renders the button in the slot:
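Both sides of the slot contract can be sketched as follows (slot name and layout are illustrative, not taken from the sample app):

```html
<!-- inside the composite template: oj-slot renders whatever the consumer
     places into the slot named "toolbar" -->
<div class="oj-toolbar">
  <oj-slot name="toolbar"></oj-slot>
</div>

<!-- on the consumer side: the slot attribute routes the button into place -->
<table-redsam id="empTable">
  <oj-button slot="toolbar">Delete</oj-button>
</table-redsam>
```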


Methods

A method defined in the composite component can be called from outside. In the example below, I call a JS function from a toolbar slot button:


The function gets the composite by ID and calls the exposed method:


The method should be declared in the composite JSON definition:
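The declaration screenshot is lost; a minimal sketch in the JET composite metadata format (the method name is hypothetical):

```json
{
  "methods": {
    "customMethod": {
      "description": "Exposed to callers outside the composite; implemented in the composite viewModel"
    }
  }
}
```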


Method is implemented inside composite JS module:


Events

Events allow implementing external listeners. Basically, this allows overriding composite logic in external functions. The event is declared in the composite JSON definition:


The composite tag contains an event property mapped to an external JS function, which is called when the event fires inside the composite:


The function code in the module - it prints the current row selection key:


Table is defined with listener property inside composite:


The listener inside the composite fires the event, which is distributed outside and handled by the method defined in the composite tag's on-handle-selection property:


Let's see how it works. The Call Method button invokes the method inside the composite:


Table row selection first triggers the listener inside the composite, which then fires the event - the external listener is invoked too:


I think this covers pretty much all of the essential functionality provided by JET composite components. I hope you will find it useful in your development.

Say Hello to Red Samurai Contextual Chatbot with TensorFlow Deep Neural Network Learning

Sun, 2018-02-04 02:33
We are building our own enterprise chatbot. The chatbot helps enterprise users to run various tasks - invoice processing, inventory review, insurance case review, order processing - and it will be compatible with various customer applications. The chatbot is based on TensorFlow machine learning for user input processing. Machine learning helps to identify user intent; our custom algorithm sets the conversation context and returns the response. Context gives control over the sequence of conversations under one topic, allowing the chatbot to keep a meaningful discussion based on user questions/answers. The UI part is implemented in two different versions - JET and ADF - to support integration with both ADF and JET applications.

Below is a trace of conversations with the chatbot:


The user statement "Ok, I would like to submit payment now" sets the transaction context. If the word payment is entered within the transaction context, the payment processing response is returned. Otherwise, with no context set, the word payment doesn't return any response. A greeting statement resets the context.

Intents are defined in a JSON structure. The list of intents is defined with patterns and tags. When the user types text, TensorFlow machine learning helps to identify the pattern and returns probabilities for matching tags. The tag with the highest probability is selected - or, if a context was set, the tag from the context. The response for an intent is returned randomly, based on the provided list. An intent can be associated with a context, which helps to group multiple related intents:
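The intents file itself is not shown in this dump; a sketch mirroring the format used in the tutorial this post follows (tags, patterns and responses here are illustrative):

```json
{
  "intents": [
    { "tag": "greeting",
      "patterns": ["Hi", "Hello", "Good day"],
      "responses": ["Hello, how can I help?"] },
    { "tag": "transaction",
      "patterns": ["Ok, I would like to submit payment now"],
      "responses": ["Sure, let's process the transaction"],
      "context_set": "transaction" },
    { "tag": "payment",
      "patterns": ["payment"],
      "responses": ["Processing your payment"],
      "context_filter": "transaction" }
  ]
}
```

Here `context_set` activates a context when the intent matches, and `context_filter` makes an intent reply only while that context is active.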


The contextual chatbot is implemented based on an excellent tutorial - Contextual Chatbots with Tensorflow. Probably this is one of the best tutorials for a chatbot based on TensorFlow. Our chatbot code closely follows the ideas and code described there. You could run the same on your TensorFlow environment - the code is available on GitHub. You should run the model notebook first and then the response Python notebook.

The model notebook trains a neural network to recognize intent patterns. We load the JSON file with intents into TensorFlow:


The list of intent patterns is prepared so it is suitable to feed the neural network. Patterns are translated into stemmed words:
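The notebook code is not preserved here; below is a framework-free sketch of the same preprocessing idea - tokenize, stem, and turn each sentence into a fixed-length bag-of-words vector (the tutorial uses NLTK's LancasterStemmer; the toy suffix stemmer below is my stand-in):

```python
def stem(word):
    # toy stemmer standing in for LancasterStemmer
    word = word.lower()
    for suffix in ('ing', 'ed', 's'):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def tokenize(sentence):
    return [t.strip('?.,!') for t in sentence.split() if t.strip('?.,!')]

patterns = ["I would like to submit payment", "Submitting payments now"]

# vocabulary of unique stemmed words across all patterns
vocab = sorted({stem(t) for p in patterns for t in tokenize(p)})

def bag_of_words(sentence, vocab):
    # 1 where the vocabulary word occurs in the sentence, else 0
    stems = {stem(t) for t in tokenize(sentence)}
    return [1 if w in stems else 0 for w in vocab]

print(bag_of_words("submit payment", vocab))  # -> [0, 0, 0, 1, 1, 0, 0, 0]
```

These fixed-length vectors are what the network's input layer consumes - one position per vocabulary word, regardless of sentence length.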


The learning part is done with the TensorFlow deep learning library TFLearn. This library makes TensorFlow simpler to use for machine learning by providing a higher-level API. In particular, for our chatbot we are using a Deep Neural Network model - DNN:


Once training is complete and the model is created, we can save it for future reuse. This keeps the model outside of the chatbot response processing logic and makes it easier to re-train the model on a new set of intents when required:


In response module, we load saved model back:


The response function acts as the entry point to our chatbot. It gets user input and calls the classify function. The classification function, based on the learned model, returns a list of suggested tags for identified intents. The algorithm locates the intent by its tag and returns a random reply from the associated list of replies. If a context filter is defined for the intent, the reply is returned only if that context was set previously:
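The notebook code is not in this dump, so here is a runnable sketch of that response logic with the classifier stubbed out (the real version calls the trained model's predict; intents, tags and replies below are illustrative):

```python
import random

intents = [
    {"tag": "greeting", "responses": ["Hello!"]},
    {"tag": "transaction", "responses": ["Let's process it"], "context_set": "transaction"},
    {"tag": "payment", "responses": ["Payment submitted"], "context_filter": "transaction"},
]

context = {}  # per-user conversation context

def classify(sentence):
    # stub for the model: returns [(tag, probability)] sorted by probability
    if "payment now" in sentence:
        return [("transaction", 0.9)]
    if "payment" in sentence:
        return [("payment", 0.8)]
    return [("greeting", 0.7)]

def response(sentence, user_id="default"):
    for tag, _prob in classify(sentence):
        for intent in intents:
            if intent["tag"] != tag:
                continue
            if "context_set" in intent:
                context[user_id] = intent["context_set"]
            # context-filtered intents reply only while their context is active
            if "context_filter" in intent and context.get(user_id) != intent["context_filter"]:
                return None
            return random.choice(intent["responses"])
    return None

print(response("Ok, I would like to submit payment now"))  # sets transaction context
print(response("payment"))  # allowed now, because the context is set
```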


Stay tuned for more blog posts on this topic.

Avoid Blind SQL Call from ADF Task Flow Method

Mon, 2018-01-29 15:42
Keep an eye on ADF Task Flow Method Call activities where methods from ADF Bindings are called. JDEV 12c sets deferred refresh for ADF binding iterators related to TF Method Call activities, and this causes blind SQL to be executed. Blind SQL - a query without bind variables.

Let me explain the use case, so that it will be more clear what I'm talking about.

A common example - a TF initialization method call where data is prepared. Typically this involves VO execution with bind variables:


Such a method call could invoke a binding operation either directly (pay attention - the bind variable value is set):


Or through Java bean method using API:


My example renders a basic UI form in the fragment, after the TF method call has been invoked:


If you log the SQL queries executed during form rendering, you will see two queries instead of the expected one. The first query is executed without bind variables, while the second gets the correct bind variable assigned:


What is the cause of the first query without bind variables? It turns out the iterator (with Refresh = deferred) from the page definition mapped to the TF method call is causing it. The iterator is initialized at the wrong time, when the bind variable is not yet assigned, and this causes the blind SQL call:


The workaround is to set Refresh = never:
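In the page definition this is a single attribute change on the iterator; the sketch below uses illustrative names - only the Refresh attribute is the point:

```xml
<!-- page definition iterator mapped to the TF method call -->
<iterator Binds="EmployeesView" RangeSize="25"
          DataControl="AppModuleDataControl"
          id="EmployeesViewIterator"
          Refresh="never"/>
```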


With Refresh = never, only one query is executed, as expected, with the bind variable assigned:


This may look minor, but trust me - with complex queries such a fix can be a great help for performance tuning. Avoid executing SQL queries without bind variables.

Download sample application - ADFTFCallBindingApp.zip.

Oracle Offline Persistence Toolkit - Simple GET Response Example with JET

Sun, 2018-01-28 11:56
We have a new tool from Oracle which can help to simplify offline logic implementation for JS apps. In this post I will describe how to use the Oracle Offline Persistence Toolkit with Oracle JET. However, Offline Persistence is not constrained to JET usage only - the toolkit is available on NPM and can be integrated with other JS solutions.

I should emphasise - the offline toolkit's primary role is to enable mobile hybrid apps to work offline. In my opinion, toolkit usage doesn't stop there. It can enable the user to continue working when the internet connection is available but the back-end server goes down. Technically the user would remain online in this case, but in practice the application would be broken - no response from the back-end for REST calls. The offline persistence toolkit can help in such cases - the user can continue working with the local cache until the back-end is back up.

If you want to learn how the offline toolkit works and how to use its API, go to the GitHub page - check the readme, add it to your JET app and try to run/test it. Hands-on is the best way to learn something new.

I will share a few hints and a sample app.

As per the readme, first of all you should add the Offline Persistence Toolkit and PouchDB modules from NPM. Run these commands within the JET app directory:

1. npm install @oracle/offline-persistence-toolkit

2. npm install pouchdb pouchdb-find

Next you should follow four simple configuration steps to enable the JET app to access the offline toolkit API.

Step 1 (standard, when adding any additional module)

Add paths to the newly added modules in the main.js require block:
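The screenshot is missing; the entries look roughly like this (paths follow the toolkit readme - adjust folder names and versions to your install):

```javascript
// main.js require block - path entries for the toolkit and PouchDB
requirejs.config({
  paths: {
    'persist': 'js/libs/persist/debug',
    'pouchdb': 'js/libs/pouchdb/pouchdb-6.3.4'
    // ...existing JET paths stay as they are
  }
});
```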


Step 2 (standard, when adding any additional module)

Add paths to newly added modules in main-release-paths.js:


Step 3 (standard, when adding any additional module)

Added modules are not copied to the build directory automatically. We need to define the copying in oraclejet-build.js. Modules should go to the build directory. If you need to copy files from a given folder and its subfolders, use ** for src:


Build content is located in the web directory. The offline toolkit and PouchDB modules should be copied there:


Step 4

Initialize the window.PouchDB variable in main.js:


Configuration is complete - now we can use the Offline Persistence Toolkit API. Add the persistence store manager and other modules:


The simplest option is to rely on the default fetch listener from the offline toolkit. We need to register a store factory and map the endpoint we want to cache with the persistence manager. When the back-end is available, the call goes to the back-end and the response is cached. Next time, if the back-end is not available, data is fetched from the cache. The toolkit intercepts the HTTP(S) request and stores the response, if the endpoint was configured to be listened to:
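The registration screenshot is missing; a configuration sketch based on the toolkit readme follows - the scope URL is my example, everything else is the documented wiring:

```javascript
// register the PouchDB store factory and the default fetch listener
// for every request whose URL matches the /Employees endpoint
define(['persist/persistenceManager', 'persist/defaultResponseProxy',
        'persist/pouchDBPersistenceStoreFactory', 'persist/persistenceStoreManager'],
  function (persistenceManager, defaultResponseProxy,
            pouchDBPersistenceStoreFactory, persistenceStoreManager) {

    persistenceStoreManager.registerDefaultStoreFactory(pouchDBPersistenceStoreFactory);

    persistenceManager.init().then(function () {
      persistenceManager.register({ scope: '/Employees' })
        .then(function (registration) {
          var responseProxy = defaultResponseProxy.getResponseProxy();
          registration.addEventListener('fetch', responseProxy.getFetchEventListener());
        });
    });
  });
```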


I'm testing the offline toolkit with a simple Employees REST endpoint call from JET. The toolkit allows this call to execute successfully, even if there is no back-end or no connection (of course - only if the same call was executed at least once before):


The UI part is simple - displaying a list once data is fetched:


Data is fetched, we are online:


The offline toolkit works if the REST response doesn't include a Cache-Control header. Make sure there is no Cache-Control header set in the response:


ADF BC REST sets the Cache-Control header by default; you can remove it in the Filter class (defined in the ADF BC REST app):


Now I switched my connection offline and clicked the Get List button - JS calls REST and, instead of getting a network error, it executes successfully and returns data from the cache through the offline toolkit functionality:


Open the details for the network call and check the initiator. You will see that all calls mapped to the offline endpoint go through persistenceManager.js:


Let's double check - maybe we are being tricked somehow? Remove the offline toolkit registration API from the code and re-run the application:


As expected - a network error is received and fetch fails. This proves the offline toolkit works :)


The sample JET application with offline toolkit configuration is available on GitHub (run ojet restore and ojet serve).

Linear Regression - Machine Learning with TensorFlow and Oracle JET UI Explained

Tue, 2018-01-16 11:33
Machine learning is definitely a popular topic these days. Some have wrong assumptions about it - they think a machine can learn by itself and it's a kind of magic. The truth is there is no magic, but math behind it. A machine will learn the way the math model is defined for the learning process. In my opinion, the best solution is a combination of machine learning math and algorithms. Here I can relate to chatbots keeping conversational context - language processing can be done by machine learning with a neural network, while intent and context processing can be executed by programmable algorithms.

If you are starting to learn machine learning, there are two essential concepts to start with:

1. Regression
2. Classification

This post is focused on regression; in the next posts I will talk about classification.

Regression is a method which calculates the best fit of a curve to summarize data. It's up to you which type of curve to choose - you should assume which type will be most suitable (this can be determined by trial and error too) based on the given data set. The regression goal is to understand data points by discovering the curve that might have generated them.

In this example I will be using the simplest regression possible - linear. A line is described by the equation y = W*x + b, where b is optional and can be 0 (the line then crosses the (0, 0) point). For complex data sets, we might use polynomial equations and generate curves.

Here is the Python code which implements linear regression with the TensorFlow API (I have provided comments for all steps, so reading the code should be self-explanatory):
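The original TensorFlow code screenshot is not preserved; below is a framework-free sketch of the same idea - fit y = W*x + b by gradient descent on the squared-error cost (TensorFlow's optimizer performs the gradient step for you; here it is written out by hand):

```python
import random

random.seed(42)

# training data generated around a known line, y = 2x + 1, with noise
true_w, true_b = 2.0, 1.0
xs = [i / 100.0 for i in range(100)]
ys = [true_w * x + true_b + random.uniform(-0.05, 0.05) for x in xs]

W, b = 0.0, 0.0
learning_rate = 0.5

for epoch in range(2000):
    # gradients of cost = mean((W*x + b - y)^2) over the training data
    grad_w = sum(2 * (W * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (W * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    W -= learning_rate * grad_w
    b -= learning_rate * grad_b

cost = sum((W * x + b - y) ** 2 for x, y in zip(xs, ys))
print(round(W, 2), round(b, 2))  # close to 2.0 and 1.0
```

The epoch count and learning rate here play exactly the roles explored in the UI experiments below: too few epochs leave W far from the true slope, and too large a rate makes the optimizer overshoot.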


The key element in any kind of machine learning is cost. The higher the cost, the worse the learning output. In linear regression, cost is typically defined as the sum of errors. The error in predicting x is calculated as the squared difference between the actual value f(x) and the predicted value M(w, x). The cost is the sum of squared differences between the actual and predicted values.

As you can see in the code above, we define the cost function and ask TensorFlow to run the optimizer to find the optimal values for the model parameters. All the hard math happens in TensorFlow; our job is to prepare training data and choose the right learning approach with the correct equation.

Let's run the JET UI, which talks to TensorFlow through REST. Training data is randomly generated (always 100 points) during each training session.

- Training Epochs - number of learning iterations during a training session
- Learning rate - learning speed; a smaller rate means more careful learning
- W - learned model parameter for the equation y = W*x
- Cost - value which shows how successful the learning was; lower cost is better

1. We start from 1 training epoch and learning rate 0.001:


The learning result is not good - the red line is the result of linear regression, and it doesn't represent the best fit for the training data. The cost is pretty high too, which indicates that learning wasn't successful.

2. 10 training epochs and learning rate 0.001:


As we repeat multiple learning iterations within the same training session, the learning result gets better. The cost becomes smaller and the line calculated by linear regression fits better, but it is still not ideal.

3. 100 training epochs and learning rate 0.001:


Increasing the learning iterations helps - the cost improves significantly and the line fits much better. This means the outcome of learning the W parameter is pretty good.

4. 1000 training epochs and learning rate 0.001



Let's make the model learn even harder and repeat more times - the cost becomes even better.

5. 2000 training epochs and learning rate 0.001


We could increase the learning iterations further, but at some point it will not help. The learning process will start to suffer from overfitting. You can think of it as learning and repeating so many times that in the end you start forgetting things. The cost gets worse when learning iterations are repeated further.

6. 2000 training epochs and learning rate 0.0001


It should help to make the learning rate smaller, which results in more careful learning. This allows getting better learning results with a higher number of learning iterations. We get the best learning cost result here and the most optimal line. You may ask - what is the use of that line? It can help to predict y values which were not available in the training dataset.

7. 2000 training epochs and learning rate 0.01


On the contrary, if we increase the learning rate, the learning process will be faster - the optimizer will run faster. This results in decreased model output quality: the cost will be higher and the W parameter value will not produce as good a fit line as in the previous training run.

A few hints related to the Oracle JET UI. You can achieve very good data visualization with JET chart components. For example, I can control the marker type rendered for training data points:


The line which represents the learning result can be displayed as a reference line:


To display the reference line, I'm using the y-axis property which comes with the JET chart:


References:

- Example for Linear Regression with Python and TensorFlow - Gist
- JET UI example - GitHub
- Accessing TensorFlow model through REST - Machine Learning with Oracle JET and TensorFlow
- Book - Machine Learning with TensorFlow 

JET UI Shell with Dynamic Tabs Implementation

Wed, 2017-12-27 08:47
I will describe and provide sample code for how to implement dynamic tabs in Oracle JET. This is useful for module loading - the default JET template comes with static tabs, which works OK for apps with a smaller number of modules. When the menu structure is more complex, it makes sense to load a tab for each module dynamically, when the module is selected from the menu. This allows the user to focus on the modules he is working with, just like the tabs in your Web browser.


Dynamic tabs are only visual placeholders - I'm using the JET router to load modules and keeping tab selection in sync. There are multiple benefits to such an approach: module loading follows the standard JET implementation with the router, while visually it looks like modules are loaded in tabs. The overall implementation of dynamic UI Shell tabs with JET becomes quite simple when relying on the JET router.

In this post I will describe how to:

1. Add new tab and select module
2. Select existing tab from the menu/by clicking on the tab itself and select module
3. Remove tab and select next available module

The Dashboard module is the default one - it is loaded once the application is accessed, and we don't allow removing it. Other modules are accessible from the menu list:


Multiple menu items are opened, with tabs displayed below. Let's see how tab removal works. If I remove the Incidents tab:


The next tab to the right is selected as current:


If a tab which is not selected is removed - Customers - the current tab remains selected:


If there is no tab to the right when removing the current tab, About, the first tab on the left is selected:


Implementation part - the menu list is implemented in this example with the JET menu component (in a real application you would use a more complex, possibly hierarchical, menu structure):


The dynamic tabs solution is based on the JET tab bar component with the removable option (you can try it in the JET Cookbook - Add/Remove tab example):


The application defines an array of tabs and a variable to keep track of the current module:


Once a menu item is selected, we first check if it was opened already. If yes, we select it instead of opening the same tab twice. Otherwise, the tab array is updated with the new item and the JET router navigates to the module - this is how the module gets displayed, no additional magic:
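The screenshot of this logic is missing; a minimal sketch of the same idea follows - the structure and names are mine, and a plain function stands in for the JET router call:

```javascript
const tabs = [{ id: 'dashboard', label: 'Dashboard' }]; // default, non-removable
let currentTab = 'dashboard';

// stand-in for ojRouter.go(moduleId)
function routerGo(moduleId) { currentTab = moduleId; }

function openModule(id, label) {
  // open the tab only if the module is not open yet
  if (!tabs.some((tab) => tab.id === id)) {
    tabs.push({ id: id, label: label });
  }
  routerGo(id); // navigating also keeps tab selection in sync
}

openModule('incidents', 'Incidents');
openModule('customers', 'Customers');
openModule('incidents', 'Incidents'); // already open - just selected
console.log(tabs.length, currentTab); // 3 incidents
```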


When the user selects a tab, the current module is loaded through the JET router:


Tab removal is handled in the delete method. Here we calculate which tab should be focused after the current tab is removed. The item is removed from the array of tabs and the JET router state is updated:
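The delete calculation can be sketched like this (logic is mine, matching the behaviour described above: prefer the tab to the right, fall back to the left, and keep the current selection when a background tab is removed):

```javascript
function deleteTab(tabs, currentId, removeId) {
  const index = tabs.findIndex((tab) => tab.id === removeId);
  tabs.splice(index, 1);
  if (removeId !== currentId) {
    return currentId; // removing a background tab keeps the selection
  }
  // removed the selected tab: next on the right, else last on the left
  const next = tabs[index] || tabs[index - 1];
  return next ? next.id : null;
}

let tabs = [{ id: 'a' }, { id: 'b' }, { id: 'c' }];
console.log(deleteTab(tabs, 'b', 'b')); // c  (tab to the right selected)
console.log(deleteTab(tabs, 'c', 'a')); // c  (current stays selected)
```

The returned id is what would be passed to the JET router to update state after the removal.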


Sample JET code is available on my GitHub repository.

Recipe for JQuery Menu integration into ADF Faces

Thu, 2017-12-21 12:32
Maybe you are thinking of adding some fresh-looking element to your ADF app? I would propose a jQuery menu. The advantage of such a menu is that it is a purely client-side implementation, and it allows the user to navigate through items really fast. Besides fast navigation, it looks great too and can be easily customized.

In this post I will describe how to integrate a jQuery menu into ADF Faces and provide a sample ADF application. I'm using the jQuery menu from here - Animated Responsive Drop Down Menu. There is no need to implement a jQuery menu component ourselves - we can use one of the existing implementations.

I have recorded a short example, so you can see how convenient client-side menu navigation is. Each menu item loads an ADF region:


JQuery menu integrated into my ADF sample:


It can display menu items for multiple levels:


Integration into ADF Faces is very simple - you need to copy the JS and CSS files into the ADF application:


Define the JS and CSS resources in the ADF main page, and make sure to reference jQuery too:


The menu structure is defined in an ADF fragment using HTML tags. Once a menu item is selected, a call to ADF is made through a combination of client/server listeners (check the WBClient.menuNavigation method). In future posts I will describe how to construct a similar menu structure dynamically. If your application is secured, make sure to define ADF Security permissions for the TFs properly - any unprotected TF can be loaded from the menu:


The client listener method menuNavigation constructs params with the target TF address and calls the server listener, which in turn displays the ADF dynamic region:


Menu fragment with HTML tags is integrated into ADF page template:


Download sample application - ADFJQueryMenuApp.zip.

Machine Learning with Oracle JET and TensorFlow

Sat, 2017-12-16 07:18
Oracle JET works with any kind of REST service, for example one coming from TensorFlow (read more in my previous post - TensorFlow Linear Regression Model Access with Custom REST API using Flask). I have implemented a linear regression model with a gradient descent optimizer in TensorFlow (I will describe this in more detail in my next post - machine learning is all about mathematics) and I'm consuming it in a JET UI:


There is an option to define training steps (or data points) and the learning rate. As the outcome we get W and b values for the linear equation y = Wx + b. After training is executed (the so-called machine learning process), the W and b parameters are identified - this allows predicting the y value for any x. More about this in my next post; today I will focus on JET.

To execute training, we call the TensorFlow REST API endpoint and pass the number of training steps/points and the learning rate. I have created a JET helper module to perform the actual REST call, to separate logic from REST call invocation. A useful technique of a custom callback is used to call a method located in the main module from the helper module. We pass the callback method name to the REST call helper function:


The REST call is executed through jQuery and the success callback is assigned our method name (instead of processing results in the REST call helper method, we can do the processing in the main module):
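The helper code screenshot is missing; the custom-callback technique can be sketched like this (function names are mine, and the jQuery call is stubbed so the flow is self-contained):

```javascript
// helper module: owns the REST invocation, knows nothing about processing
function restCallHelper(steps, learningRate, callback) {
  // the real code does $.ajax({ url: ..., success: callback });
  // here the response is stubbed out
  const response = { W: 1.5, b: 0.3, steps: steps, rate: learningRate };
  callback(response);
}

// main module: the processing stays here, not in the helper
let model = null;
function processTraining(data) {
  model = { W: data.W, b: data.b };
}

restCallHelper(100, 0.001, processTraining);
console.log(model.W); // 1.5
```

The benefit of this split is that the helper can be reused for any endpoint - each caller just supplies its own processing function.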


This is where result processing happens - in the custom callback method. We prepare the x/y training dataset to be displayed in the chart. We get back the estimated W and b parameters, which allows calculating the estimated y based on the equation y = Wx + b for all x values used in the training set (to compare how well the model was trained):


One more nice component of Oracle JET - Form Layout. It makes it very easy to create simple forms: specify the maximum number of columns and you get a responsive UI for form components out of the box:


Make sure to add ojformlayout module:


To debug data returned by TensorFlow REST service, check Network history in browser developer tools:


By the way, it is completely up to you to decide the REST JSON data shape. With Python's jsonify from Flask you can construct JSON from arrays of data and publish your own REST endpoint anywhere in the Python script (you code in Python for TensorFlow):


Download the sample JET application (make sure to run ojet restore before ojet serve) from GitHub. The TensorFlow linear regression model used for this sample is available as a GitHub Gist.

TensorFlow Linear Regression Model Access with Custom REST API using Flask

Mon, 2017-12-11 13:28
In my previous post - TensorFlow - Getting Started with Docker Container and Jupyter Notebook - I described the basics of how to install and run TensorFlow using Docker. Today I will describe how to give access to a machine learning model from outside of TensorFlow with REST. This is particularly useful when building JS UIs on top of TensorFlow (for example with Oracle JET).

TensorFlow supports multiple languages, but the most common one is Python. I have implemented a linear regression model using Python and now would like to give access to this model from the outside. For this I'm going to use Flask, a micro-framework for Python which allows simple REST annotations directly in Python code.

To install Flask, enter the TensorFlow container:

docker exec -it RedSamuraiTensorFlowUI bash

Run Flask install:

pip install flask

Run Flask CORS install:

pip install -U flask-cors

I'm going to call REST from JS, which means TensorFlow should support CORS, otherwise the request will be blocked. No worries - we can import Flask CORS support.

REST is enabled for the TensorFlow model with Flask in these two lines of code:


As soon as Flask is imported and enabled, we can annotate a method with REST operations and an endpoint URL:


There is an option to check what kind of REST operation is executed and to read input parameters from the POST request. This is useful to control model learning steps, for example:


If we want to collect all x/y value pairs and return them in the REST response, we can do that by collecting the values into an array:


And construct the JSON response directly out of the array structure using Flask jsonify:


After we run the TensorFlow model in Jupyter, it will print the URL endpoint for the REST service. For the URL to be accessible outside the TensorFlow Docker container, make sure to run the model with host 0.0.0.0, as in the screenshot below:


Here is an example of a TensorFlow model REST call from Postman. A POST operation is executed with payload and response:


All REST calls are logged in TensorFlow:


Download the TensorFlow model enabled with REST from my GitHub.

JET Composite Component in ADF Faces UI - Deep Integration

Tue, 2017-12-05 12:21
The Oracle JET team doesn't recommend or support integrating JET into ADF Faces. This post is based on my own research and doesn't reflect best practices recommended by Oracle. If you want to try the same - do it at your own risk.

All that said, I still think finding ways of deeper JET integration into ADF Faces is important. The next step would be to implement an editable grid JET-based component and integrate it into ADF to improve the fast user data entry experience.

Today's post focuses on integrating a read-only JET composite component into ADF Faces. I would recommend reading my previous posts on similar topics - today I'm using methods described in these posts:

1. JET Composite Component - JET 4.1.0 Composite - List Item Action and Deferred Loading

2. JET and ADF integration - Improved JET Rendering in ADF

You can access source code for ADF and JET Composite Component application in my GitHub repository - jetadfcomposite.

Let's start with the UI. I have implemented an ADF application with regions. One of the regions contains a JET composite. An ADF Query sends its result into the JET composite. There is integration between the JET composite and ADF - when a link is clicked in the JET composite, the ADF form is refreshed and displays the current row corresponding to the selected item in the JET composite. The list on the left is rendered from a series of JET components; each component implements one list item:


As you can see, there are two types of calls:

1. ADF Query sends its result to the JET composite. ADF -> JET call
2. The JET composite forces the ADF form to display row data for the selected item. JET -> ADF call

Very important to mention - the JET composite gets data directly from ADF Bindings; there is no REST layer here. This simplifies the JET implementation in ADF Faces significantly.

What is the advantage of using a JET composite in ADF Faces? Answer - improved client-side performance. For example, this component allows expanding an item. Such an action in a pure ADF Faces component would produce a request to the server, while in JET it happens on the client, since the processing is done in JS:

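Conceptually, the expand action is nothing more than a client-side state flip - a minimal sketch in plain JavaScript (the names here are illustrative, not taken from the actual composite):

```javascript
// Illustrative sketch: expanding a list item only flips client-side
// state, so no server request is needed (a pure ADF Faces component
// would post back to the server for the same action).
function ListItem(data) {
  this.data = data;
  this.expanded = false; // collapsed by default
}

ListItem.prototype.toggle = function () {
  this.expanded = !this.expanded; // pure client-side processing
  return this.expanded;
};
```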

No call to the server is made when an item is expanded:


Out of the box, the JET Composite works well with ADF Faces geometry. In this example, the JET Composite is located inside an ADF Panel Splitter. When the Panel Splitter is resized, the JET Composite UI is nicely resized too, since it is responsive by default. Another advantage of using a JET Composite in an ADF Faces UI:


When the "Open" link is clicked in the JET Composite, a JS call is made and, through an ADF server listener, we update the current row in ADF to the corresponding data. This shows how we can send events from the JET Composite to ADF Faces:


Navigation to another region works too:


And coming back - JET content is displayed fine even after ADF Faces PPR was executed (a simple trick is required for this to work, see below). If we explore the page source, we will see that each JET Composite element is stamped in HTML within the ADF Faces HTML structure:


The great thing is - a JET Composite which runs in JET doesn't require any changes to run in ADF Faces. In my example, I only added a hidden ID field value to the JET Composite, to be able to pass it to ADF to set the current row later:


I should give a couple of hints regarding infrastructure. It is not convenient to copy JET Composite code directly into the ADF application. It is more convenient to wrap the JET code into a JAR and attach it to ADF that way. To achieve this, I would recommend creating an empty Web project in JDeveloper, copying the JET Composite code there (into the public_html folder) and building a JAR out of it:


Put all JET content into JAR:


If the Web project is located within the main ADF app, make sure to use Working Sets and filter it out, to avoid it being included in the EAR during the build process:


Now you can add JAR with JET into ADF app:


In order for JET HTML/JS resources to be accessible from the JAR file, make sure to add the required config to the main ADF application web.xml file. Add the ADF resources servlet, if it is not added already:


Add a servlet mapping - this will allow content to be loaded from the ADF JAR library:


To load resources such as JSON, CSS, etc. from the ADF JAR, add the ADF library filter and list all extensions to be loaded from the JAR:


Add FORWARD and REQUEST dispatcher filter mappings for the ADF library filter from above:

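Put together, the web.xml entries described above look roughly like this. These are the standard ADF library servlet/filter class names, but treat this as a sketch and double-check against your own web.xml (in particular, the filter parameter that whitelists extensions is omitted here):

```xml
<!-- ADF resources servlet - serves content packaged in ADF JAR libraries -->
<servlet>
  <servlet-name>adflibResources</servlet-name>
  <servlet-class>oracle.adf.library.webapp.ResourceServlet</servlet-class>
</servlet>
<servlet-mapping>
  <servlet-name>adflibResources</servlet-name>
  <url-pattern>/adflib/*</url-pattern>
</servlet-mapping>

<!-- ADF library filter - makes JAR-packaged resources loadable;
     the extension list config is not shown here -->
<filter>
  <filter-name>ADFLibraryFilter</filter-name>
  <filter-class>oracle.adf.library.webapp.LibraryFilter</filter-class>
</filter>
<filter-mapping>
  <filter-name>ADFLibraryFilter</filter-name>
  <url-pattern>/*</url-pattern>
  <dispatcher>FORWARD</dispatcher>
  <dispatcher>REQUEST</dispatcher>
</filter-mapping>
```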

As I mentioned above, the JET Composite is rendered directly with ADF Bindings data, without calling any REST service. This simplifies JET Composite implementation in ADF Faces. It is simply rendered through an ADF Faces iterator. JET Composite properties are assigned ADF Faces EL expressions to get data from ADF Bindings:


JET is not compatible with the ADF PPR request/response cycle. If JET content is included in an ADF PPR response, the content gets corrupted and is not displayed anymore. To overcome this, we re-draw the JET content if it was included in a PPR response. This doesn't reload JET modules, it simply re-draws the UI. In my example, ADF Query sends a PPR request to the area where the JET Composite renders its result. I have overridden the query listener:


Other places where PPR is generated for the JET Composite are the tab switch and the More link, which loads more results. All these actions are overridden to call methods in the bean:


The reDrawJet method is invoked, which calls a simple utility method to invoke the JS function that actually re-draws the JET UI:


The JET UI re-draw happens in a JS function, which cleans the Knockout.js nodes and re-applies the current JET model bindings:

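The re-draw logic can be sketched like this (in the real app, ko is the Knockout module loaded through RequireJS and node is the DOM element wrapping the JET Composite output; here the function takes both as parameters to keep the sketch self-contained):

```javascript
// Sketch of the re-draw step: clean the Knockout bindings on the JET
// container node, then re-apply the current view model. This does not
// reload JET modules - it only re-binds (re-draws) the UI.
function reDrawJetUI(ko, viewModel, node) {
  ko.cleanNode(node);                // drop bindings corrupted by PPR
  ko.applyBindings(viewModel, node); // re-apply bindings = re-draw UI
}
```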

The JET -> ADF call is made through a JET Composite event. This event is assigned a JS function implemented in the ADF Faces context. This allows calling JS located in ADF Faces without changing JET Composite code. I'm using a regular ADF server listener to initiate the JS -> server side call:

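The wiring can be sketched as below. In a real page the queue argument would be ADF Faces' AdfCustomEvent.queue and the button would be looked up through the ADF client framework; the component id, event type and payload names here are illustrative. Passing the queue function in as a parameter keeps the sketch framework-independent:

```javascript
// Illustrative sketch: a JS function on the ADF Faces page, assigned
// to the composite's event property. It takes the key from the
// composite event payload and queues a custom event on a (hidden)
// button that has a serverListener attached.
function handleOpenDetails(event, button, queue) {
  var rowKey = event.detail.key; // key stamped into the composite item
  // 'rowSelectEvent' must match the serverListener type in the page;
  // true = deliver immediately to the server
  queue(button, 'rowSelectEvent', { rowKey: rowKey }, true);
}
```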

The ADF server listener is attached to a generic button in ADF Faces:


The ADF server listener does its job and applies the received key to set the current row in ADF, which automatically triggers the ADF form to display the correct data:

JET 4.1.0 Composite - List Item Action and Deferred Loading

Mon, 2017-11-27 14:57
I was reviewing JET 4.1.0 features and decided to build a simple composite component. I would like to share some of the things I learned.

The composite component comes with a collapsible UI and an action link. It implements a list item, which can be rendered in any kind of parent UI container:


When item is expanded, more info is displayed:


Once the user clicks the Open link, a JS call is made and the key from the current item is printed in the background:


Let's take a look at the component metadata. I'm using several properties and one event. Through the event we can call a JS method outside of the composite component - this can be very useful:


How is this event used? Take a look at the composite HTML implementation - the click is handled by a JS method, and inside that function the event will be initialized:


One more interesting thing - I'm using JET Defer functionality to render the expandable HTML content only when an item is expanded. This minimizes client-side load through lazy loading - content is rendered when it is displayed:


The JS method which handles the click creates an event. This allows the event implementation method to execute outside the composite component. The syntax looks pretty similar to an ADF server listener call:


Naming of the event is important. If the event name is openDetails, make sure to use on-open-details as the property name for the event handler in the composite component interface. Here I define the external method to be invoked when the event is fired:

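The naming rule follows the usual DOM attribute convention: camelCase event name, then an on- prefix plus kebab-case. A tiny helper (written just for illustration) makes the mapping explicit:

```javascript
// Maps a composite event name to its event-handler attribute name,
// e.g. openDetails -> on-open-details.
function eventAttributeName(eventName) {
  return 'on-' + eventName.replace(/([A-Z])/g, function (match, ch) {
    return '-' + ch.toLowerCase();
  });
}
```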

Here is the code for event listener JS method in main view module:


Let's see how JET Defer works at runtime. This is the source from the running JET app - when the list item is rendered initially, there is no content for the expanded block, it only contains the JET Defer block:


After we expand it:


Content is rendered in the source. This is called lazy loading, and it improves performance, especially for large lists (or tabs, etc.) where a lot of content is rendered:


Now a few hints on how to create a JET Composite with OJET tooling. Run a command similar to this:

sudo ojet create component list-itemrs

This will create a JET Composite with the given name. To be able to run the JET Composite from the index page, add applyBindings to the main.js require block:


Don't forget to add JET composite module and actual composite loader:


Finally add your own JET Composite tag to index.html:


Sample code can be accessed in GitHub repository (run ojet restore) - list-itemrs.

TensorFlow - Getting Started with Docker Container and Jupyter Notebook

Thu, 2017-11-23 10:38
I'm studying Machine Learning and would like to share some intro experience working with TensorFlow. To get started with TensorFlow you need to install it; the easiest way (at least for me) was to run TensorFlow using Docker. Read the installation instructions - Installing TensorFlow.

Once the TensorFlow Docker image is installed, I suggest creating the container in detached mode (--detach=true) and providing a port for the Jupyter UI. Make sure to give the Docker container a meaningful name:

docker run --detach=true --name RedSamuraiTensorFlowUI -it -p 8888:8888 gcr.io/tensorflow/tensorflow

Make sure to start and stop the container using Docker start/stop commands - don't create and run a new container each time (that way you would lose your work, since a new container would be created each time):

docker start RedSamuraiTensorFlowUI (docker stop RedSamuraiTensorFlowUI)

Once the container is running in detached mode, you can access its logs by executing the docker logs command and specifying the container name:

docker logs -f RedSamuraiTensorFlowUI

At this point you should see output in the Docker container log - copy the URL to the Jupyter UI with the token and paste it into the browser (for example: http://localhost:8888/?token=d0f617a4c719c40ea39a3732447d67fd40ff2028bb335823):


This will give you access to the Jupyter UI. It is possible to run TensorFlow Python scripts directly through the command line in the Docker environment, but it is more convenient to do the same through the UI:


UI gives option to create new Terminal session:


Terminal allows to run Python code using command line:


Instead of using command line, more convenient is to create new notebook:


The notebook environment allows you to type in Python code and execute math calculations. In the example below I multiply two arrays (1x5, 2x6, 3x7, 4x8) in Python code through the TensorFlow library. The result is printed through the TensorFlow session object right below, and a prompt for the next command is displayed - very convenient:

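For reference, the math in that notebook cell is a simple element-wise multiply (the notebook itself does this in Python through TensorFlow); the same arithmetic in plain JavaScript:

```javascript
// Element-wise multiply, mirroring the notebook example:
// [1,2,3,4] * [5,6,7,8] -> [5, 12, 21, 32]
function multiply(a, b) {
  return a.map(function (x, i) { return x * b[i]; });
}
```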

Jupyter UI allows to track running notebooks and terminals:


Whatever action you do in the Jupyter UI can be tracked in the Docker container log. The Jupyter UI is a client-side JS application:


To double check Docker config, I have TensorFlow Docker image:


And the Docker container, which can be started/stopped by name (see the commands listed above), without creating a new Docker container on every restart:

How to Setup Node.js and Oracle JET on Oracle Linux

Sun, 2017-11-19 08:35
What if you want to develop Oracle JET on Oracle Linux? Certainly this is possible - both Node.js and Oracle JET run on Oracle Linux or any other Linux distribution. If you follow the Oracle JET Setup Guide, you will see Node.js listed as a prerequisite. But it may not be entirely straightforward to install Node.js on Oracle Linux. Below I will guide you through the steps.

Run this command to install the development tools needed to build native add-ons:

yum install -y gcc-c++ make


Enable Node.js yum repository:

curl -sL https://rpm.nodesource.com/setup_8.x | sudo -E bash -


Install Node.js:

sudo yum install nodejs


Verify that node and npm were installed successfully:


Next you can follow the steps described in the Oracle JET setup guide - all straightforward. Install Oracle JET:

sudo npm -g install @oracle/ojet-cli


Create new JET application with OJET utility:

sudo ojet create jetwinapp --template=navdrawer


Run application with OJET utility:

sudo ojet serve


JET application runs on Oracle Linux:

ADF Performance on Docker - Lighting Fast

Fri, 2017-11-17 02:32
ADF performance depends on server processing power. Sometimes ADF is blamed for poor performance, but in most cases the real issue is poor server hardware, bad programming style or slow response from the DB. The goal of this post is to show how fast an ADF request can execute and to give a couple of suggestions on how to minimize ADF request time. This applies to ADF applications running on any environment, not only Docker. I'm using an ADF Alta UI based list application with an edit fragment.

Rule number one - enable response compression. This transfers less data, so the response completes faster - shorter content download time. See the screenshot below - a JS file is compressed to 87 KB from the original 411 KB. Initial page load in ADF generates around 3 MB of content (if this is the very first access and static content is not yet cached on the client side). With compression, the initial load of 3 MB becomes around 300 - 400 KB. That's a big difference. In this example the ADF page opens in 1.2 seconds (on par with client-side JS applications, where static content is also downloaded on first access):


You can enable content response compression in the WebLogic console (it will be applied to all deployed Web apps). Go to the domain configuration, Web Applications section:


Select the checkbox to enable GZIP compression and provide a list of content types to be compressed:


That's it - content compression is set.

When I navigate to the edit fragment, the request executes in 305 ms, thanks to the fast Docker engine (running on Digital Ocean - Oracle ADF on Docker Container) and content response compression: 3.44 KB transferred for 14.49 KB of original content:


Let's try the Save operation. I changed the Hire Date attribute and then pressed the Save button. This triggers a Commit operation in ADF, pushes data to ADF BC and then executes a DML statement with a commit in the DB. All these steps complete in 113 ms.


Don't believe anyone who says ADF is slow. As you can see, an ADF request is fundamentally very fast - but of course it can become slow if you add a lot of data fetching and processing logic on top (blame yourself). A client-side JS application would not run faster if it called a backend REST service to save data. The only advantage of a client-side JS application in this case would be that it executes the backend REST call asynchronously, while ADF executes requests synchronously. However, it all depends - sometimes asynchronous calls are not suitable for the business logic either.

How come the ADF BC call to the DB completes so fast? For that we need to check the Data Source Connection Delay Time on WLS. In the Docker (Digital Ocean) environment it is remarkably short (that's very good): 66 ms. Check the same on your server (go to Data Source monitoring in the WLS console) - a longer delay time means slower response from the DB and slower ADF performance:


Navigation back to the list runs in 356 ms, with 197.96 KB of content compressed to 10.47 KB. This is very fast - a 350 ms response time is something the user will hardly notice (almost equal to processing on the client side):


To optimize ADF performance, make sure you are using ChangeEventPolicy = NONE for iterators in Page Definitions:

ADF Performance Story - This Time Developer Was Wrong

Fri, 2017-11-10 12:19
ADF is fast. If an ADF application is slow, most likely this is related to development mistakes. I would like to tell you one story, based on my ADF tuning experience. Problem description: the ADF application runs fast in DEV, when the DB size is small. The same application runs slow in TEST/PROD, when the DB size is large. Question - what is slow? Answer - slow means forms are loading slowly. OK, let's get to the story.

The developer decides to fetch data from the DB and iterate over the rows to process them. This is already bad practice, because it is much more effective to process large sets of rows directly in the DB, without fetching them to the middle tier. But let's assume this is a valid use case and we really need to fetch the rows. The developer implements fetching using the getAllRowsInRange method:


VO data is loaded in the UI and displayed in a table. The method that iterates through the rows is called from a button - this logic is not executed on initial load. The iterator is set with Range Size = 10:


This means the first 10 rows are fetched on form load, and the form will open fast regardless of DB size:


This is OK, but while testing the row fetching functionality, the developer finds a "bug" - instead of fetching all rows, the custom method returns only 10 rows. So the developer decides to implement a fix - change Range Size to -1:


Now all rows are fetched in the custom method and the developer is happy. But there is a small side effect - the table starts to display all rows. Not good, another fix is needed:


Auto Height Rows = 10 is set to prevent the table from displaying too many rows. But all rows will still be fetched, because Range Size = -1:


All these fixes are wrong. The getAllRowsInRange method is not supposed to be used to iterate through all rows - it returns only the currently fetched rows. Such an implementation will obviously slow down form load: it fetches all rows from the DB, and if the DB size is large, it will slow down significantly.

If all you need is to iterate through rows, make sure you don't affect the data which is displayed/fetched for the UI. Keep Range Size positive:


In the custom method, iterate through the rows by creating a row set iterator:

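The difference between the two approaches can be sketched in plain JavaScript. This is a conceptual model only, not the ADF Java API - in ADF you would call createRowSetIterator on the view object, as in the sample app:

```javascript
// Conceptual model: the UI-facing range vs. a secondary row set iterator.
function makeRowSet(rows, rangeSize) {
  return {
    // what the UI (and getAllRowsInRange) sees: only the current range
    getAllRowsInRange: function () {
      return rows.slice(0, rangeSize);
    },
    // what background processing should use: a secondary iterator that
    // walks every row without touching the UI range
    createRowSetIterator: function () {
      var i = -1;
      return {
        hasNext: function () { return i + 1 < rows.length; },
        next: function () { i += 1; return rows[i]; }
      };
    }
  };
}
```

With Range Size = 10, the UI still fetches only 10 rows and the form opens fast, while the iterator visits the full set only when the custom method actually runs - exactly the separation the "fixes" above destroyed.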

Download sample application - ADFRangeSizeApp.zip.

Essential WebLogic Tuning to Run on Docker and Avoid OOM

Sun, 2017-11-05 09:43
Read my previous post about how to run ADF on Docker - Oracle ADF on Docker Container. The Docker WebLogic image is based on the official Oracle Docker image for FMW infrastructure - OracleFMWInfrastructure. A WebLogic container created from this image runs, but not for long - eventually the JVM process eats up all memory and an OOM (out of memory) error is thrown. This is a known issue with JVMs running in Docker containers - Running a JVM in a Container Without Getting Killed. Good news - we can switch on WebLogic memory management functionality to prevent the OOM error while running in a Docker container. This WebLogic functionality is turned on with a special flag - -XX:+ResourceManagement. To set this flag, we need to update the startWebLogic.sh script, but we probably don't want to rebuild the Docker container. Read below how to achieve this.

First we need to get the startWebLogic.sh script out of the Docker container. Make sure the Docker container on your host is running and execute the Docker copy command:

docker cp RedSamuraiWLS:/u01/oracle/user_projects/domains/InfraDomain/bin/startWebLogic.sh /Users/andrejusbaranovskis/infra/shared

This copies the startWebLogic.sh file from the Docker container to your host system.

Search the startWebLogic.sh script for the resource management config. By default it is commented out. Set this string for JAVA_OPTIONS - it enables WebLogic resource management and the G1GC garbage collector:

JAVA_OPTIONS="-XX:+UnlockCommercialFeatures -XX:+ResourceManagement -XX:+UseG1GC ${SAVE_JAVA_OPTIONS}"

The startWebLogic.sh script contains a comment recommending to enable this option:


Once the JAVA_OPTIONS variable is updated, copy the startWebLogic.sh script back into the Docker container:

docker cp /Users/andrejusbaranovskis/infra/shared/startWebLogic.sh RedSamuraiWLS:/u01/

Enter the Docker container command prompt (in my case user 501 is the root user for the Docker container):

docker exec -u 501 -it RedSamuraiWLS bash

Change file permissions for startWebLogic.sh:

chmod 777 startWebLogic.sh

Enter into Docker container as oracle user:

docker exec -it RedSamuraiWLS bash

Copy the startWebLogic.sh script from /u01 into the bin folder (overwriting the existing script file):

cp startWebLogic.sh /u01/oracle/user_projects/domains/InfraDomain/bin

Stop the Docker container and run docker commit to create a new image (which includes the change in startWebLogic.sh):

docker commit RedSamuraiWLS abaranovskis/redsamurai-wls:v2

The Docker image is created with the delta change only, which saves space. Run the docker images command to verify the new image was created successfully:


Run docker push to upload the new image version into the Docker repository. The upload happens fast, because only the delta of changes is uploaded:

docker push abaranovskis/redsamurai-wls:v2

You should see new image version uploaded into Docker repository:


To run the container online, we can log into the Digital Ocean console and execute the docker run command (I'm using a container memory limit of -m 4g (4 GB)) - it will pull and run the new image:


Once the Docker container is running, execute the top command in the Digital Ocean console to monitor memory consumption. The Java process memory consumption should not grow if there is no user activity on the WebLogic server:

Oracle ADF on Docker Container

Fri, 2017-11-03 10:56
Want to run Oracle ADF on Docker? This is possible - I will explain how. If you are new to Docker, it may take a significant amount of time to get started with all the different bits and pieces. I will try to explain all the essential steps, so that you get up to speed quickly.

First of all you need an accessible DB; check my previous post explaining how to run Oracle DB on Docker - Oracle Database Docker Image in Docker Cloud (Digital Ocean). The DB is required to install the RCU schema for a WebLogic installation with JRF files.

I have built my own Oracle Fusion Middleware Docker image using Oracle Docker images - Oracle Fusion Middleware Infrastructure on Docker.

The first step is to build the Oracle JDK (Server JRE) image - this is a prerequisite for building the Oracle Fusion Middleware Docker image. Read through the instructions documented on the Oracle Fusion Middleware Infrastructure on Docker GitHub page. Navigate to the Oracle Java folder (download the Oracle Docker files from the GitHub link mentioned above) and copy the JDK installation file there:


Run command to create JDK Docker image:

./build.sh

Command output:


Double check to verify if image was created successfully by running docker images command:


Let's move on to Oracle FMW image creation. Navigate to Oracle FMW folder and copy FMW infrastructure installation file (I'm installing 12.2.1.3):


Move one folder up and run the command:

./buildDockerImage.sh -s -v 12.2.1.3

This builds the Oracle FMW image. I use the -s flag to skip checksum verification for the installation file. You should run the command from this folder:


You will see long output in the log for this command:


It installs WLS into Docker image:


Run docker images command to verify if image was created successfully:


In the next step, we will create the FMW domain and extend it with ADF support. But before that we need to make sure the DB details are set correctly, so the RCU schema can be installed. Oracle provides an infraDomain file with DB and WLS properties - make sure to set the correct DB details. If the properties are not correct, RCU creation will fail:


Execute the docker run command to start the WLS Docker container. During the first startup it will create and extend the WLS domain with ADF support:

docker run -d -p 7001:7001 --name RedSamuraiWLS --env-file ./infraDomain.env.list oracle/fmw-infrastructure:12.2.1.3

The -d flag means the container runs in detached mode and we get back to the command prompt. The port and name are specified along with the environment properties file. Make sure to reference the FMW image created in the step above. Once control returns to the prompt, run this docker command to check the status of the container (the -a flag means show all containers):

docker ps -a

The container should be in the running state. The first startup takes longer, because it needs to set up and extend the WLS domain:


Once domain is extended, you will see WebLogic starting:


Finally WebLogic should be in Running state:


Run again docker ps -a command to verify container state, it should be up and running:


Once the WLS machine is up, you can navigate to Enterprise Manager through the URL from outside the Docker container, for example from your host. Log into EM and you will see the Admin server is up, but the Managed Server is down. There is a way to start the Managed Server too, but if you want to run ADF apps in a DEV environment, realistically the Admin server is more than enough, including for deployment:


Simply delete the Managed Server and cluster (this can be done from EM), keeping only the Admin Server:


I have deployed sample ADF application:


This application is based on ADF BC, data source is defined too:


ADF application runs from WebLogic on Docker:


Now let's see how to push the newly created container to a Docker registry.

First we need to create a new Docker image from the Docker container. This can be done with the docker commit command (pointing to the container ID and specifying the Docker repository name and tag):

docker commit da03e52b42a2 abaranovskis/redsamurai-wls:v1

Run the docker images command to verify the new image was created successfully. Next run docker login to authenticate with the Docker repository. Then run docker push to write the image to the Docker repository:

docker push abaranovskis/redsamurai-wls:v1

Commands execution sequence:


Pushed image should appear in docker repository:


Once the image is in the online Docker repository, we can start an online Docker container, so that WLS is accessible online. This can be done through the command line or through the Docker Cloud UI. You can create a new container by referencing the image from the Docker repository:


Our WLS docker container with ADF support runs on Digital Ocean:


Logs are accessible from Docker Cloud UI and you can see server status:

Oracle Database Docker Image in Docker Cloud (Digital Ocean)

Fri, 2017-10-27 17:41
Red Samurai is using Docker for our internal development environment. We are running Oracle Database and WebLogic server (with ADF support) on Docker Cloud. It is much easier to manage Docker containers than to maintain our own server.

In this post I will describe how to set up Oracle Database in Docker Cloud. Of course you could run the Docker container locally on your machine, but the main power of Docker is the option to run the container in the Cloud.

If you are new to Docker and Docker Cloud, it is very easy to get lost - there are so many various blogs and articles, it's hard to know where to start. I would recommend starting with the Get Started, Part 1: Orientation and setup tutorial. You will learn most of the important Docker commands and push a Docker image to the registry. Next check this tutorial - Create your first service. It describes how to create a Cloud Node through the Docker Cloud interface. Keep in mind - Docker Cloud itself doesn't run the Docker container; it provides connectors to various Cloud providers for Docker - Azure, Amazon, Digital Ocean, etc. I did some research, and it looks like Digital Ocean is the best option to run a Docker container - the pricing model is straightforward, and the registration and management processes are simple and clear.

The first step should be to create a Docker Node in Docker Cloud (you must be connected to a Cloud provider). Here is the info for our Digital Ocean node instance:


Node creation process is straightforward. Once node is created, it becomes part of Node cluster:


Docker Cloud node creation process triggers droplet (Cloud instance) creation in Digital Ocean. This process takes around 5 minutes in total.

Digital Ocean provides very nice UI to manage and monitor droplet:


Graphs to monitor droplet performance:


Digital Ocean provides option to run administration console directly in the browser, without even using separate SSH connection (this is very convenient for quick administration).

I found the easiest way to create the Oracle DB Docker container in a Digital Ocean droplet is to execute the docker run command directly in the droplet administration console. I'm using the official Oracle Database Enterprise Edition Docker image.

With the docker run command, I can pull and run the Docker image (referencing the official Oracle DB Docker registry), where detach=true means the container will run without blocking the console:

docker run --detach=true --name RedSamuraiDB -p 1521:1521 -p 5500:5500 -e ORACLE_SID=RedSamuraiDB -e ORACLE_PDB=ORCLPDB1 store/oracle/database-enterprise:12.2.0.1

Command is executed from Digital Ocean droplet console:


Image is downloaded directly from Docker Store into Digital Ocean droplet:


With the docker run command, the image is not only downloaded - the container is started too. We can execute docker ps -l to see if the container started successfully:


Finally we need to reset the default password (Oradoc_db1) set for the Oracle DB Docker container. This can be done by logging into sqlplus from the Digital Ocean droplet console. First we need to enter the Docker container prompt by executing (use the Docker container name):

docker exec -it RedSamuraiDB bash

Execute:

ALTER USER SYS IDENTIFIED BY newpass;
ALTER USER SYSTEM IDENTIFIED BY newpass;


DB is accessible from outside:


This shows the power of Docker - Oracle DB set up in a few minutes.

Takeaways from OOW'17

Sun, 2017-10-08 09:28
Let's summarise the important points from the OOW'17 conference. This year the conference was a breakthrough for Oracle JET - most Oracle Cloud UIs were implemented with Oracle JET. Oracle ADF is used for Oracle SaaS (Fusion Middleware) back office applications, same as before. There is often a misunderstanding in the JET vs. ADF comparison. Keep in mind - JET is only a UI layer and can be compared with ADF Faces only, while ADF is much more than UI - ADF BC Model, ADF Task Flows and ADF Bindings.

1. Use JET for front-end apps of relatively simple complexity but with strong requirements for performance and scalability. The back-end can be implemented with ADF BC REST. Typically it takes longer to implement a JET UI compared to ADF Faces (no drag and drop support, unless using Visual Builder Cloud Service for JET)

2. Use ADF Faces for complex and large back-office applications. Some degree of mixing between ADF Faces and JET is possible. The back-end should be done in ADF BC

ADF is a stable technology, and at OOW Oracle focuses on advertising new things. The recommended focus for ADF developers was around JCS, DevCS, JET and VBCS:


There was one session about how to move an existing ADF app into the cloud with JCS, and a session about ADF BC REST - no new features, mainly showing a customer system implementation with current features.

JET open source contribution was announced, along with new JET offline support - the persistence toolkit (it will be available through NPM in the next couple of weeks):


Oracle explained why handling offline use case in JET is important:


The persistence toolkit will run in JET hybrid, web and desktop apps (yes, you can run JET as a desktop app, with a third-party Electron plugin):


Here you can read more detailed info about the persistence toolkit design principles:


JET supports Fetch API (used by persistence toolkit):


Architecture explanation for JET persistence toolkit:


Another hot topic related to JET - Web Components. Slots are supported - this allows building advanced Web Components with dedicated facets (in ADF Faces terminology):


OOW'17 was packed with chatbot demos. Unfortunately there is no trial access for the chatbot cloud yet. There was a good session about chatbots from Lyudmil Pelov - he explained how the Oracle chatbot service and its language processing work:


I was impressed with Oracle Visual Builder Cloud Service progress this year. I really liked this service and I think it will become a JET IDE development tool. It is not focused on business users only, but is targeted at JET developers too. It comes with an embedded DB, or you can call external services. Next versions will allow direct JET code development, combined with drag and drop:


Develop JET in VBCS using drag and drop where possible and code additional logic in JS/HTML:


VBCS is no longer targeted at business users only - they target it at coders (us) too. That's news I like:



The RDK UI is available now for ADF, JET and MAF. The ADF RDK was updated to a newer version. The ADF and JET RDK UI look is identical.

New ADF RDK UI:


JET RDK UI:


Machine Learning was another big thing at OOW and JavaOne. It's nice to see that the neural network topics I studied at university are becoming the hype now:


JVM garbage collection tuning: it is better for objects to die young. In ADF this means it is better to use Backing Bean Scope where possible, because a short-scope bean is destroyed after the request, which allows the VM to reclaim memory:
