Fusion Middleware

A Fishbowl Success Story: The Benefits of Consolidating Disparate CAD Databases

A large medical device manufacturer wanted to fully integrate their R&D, engineering, and manufacturing organizations. This would enable a more efficient, capable, and robust product development system that would speed the flow of new, innovative products without compromising quality.

One key obstacle was the amount of data scattered across the globe in various PDM and PLM systems and network folders. This data needed to be organized and consolidated into a unified system with quality processes that would achieve FDA certification, enabling individuals to access accurate data from any location at any time. From a CAD data perspective alone, there were hundreds of thousands of SolidWorks files spread across 7+ locations around the world in 4+ PDM/PLM systems, plus assorted network file folders.

The company partnered with Fishbowl to migrate their SolidWorks PDM, PLM, and CAD data into a single global Windchill PDMLink system. A key criterion in choosing Fishbowl was its LinkExport and LinkLoader family of products: LinkExport automates data extraction from PDMWorks and Enterprise PDM, and LinkLoader automates bulk loading into Windchill.

THE PLAN

The migration plan called for a separate migration for each location. Each production migration could be completed over a weekend to minimize business impact (e.g., users would check files into PDMWorks or their legacy system on Friday and check them out of Windchill on Monday). This approach spread out the work and lowered risk, since each location also needed to comply with quality audits as part of its test and production migration passes.

RESULTS

Fishbowl successfully executed 7 migrations totaling more than 100,000 files: 60,000+ files came from five separate Enterprise PDM and PDMWorks systems, and another 40,000+ came from network file folders. All data was bulk loaded into a single Windchill PDMLink instance, and each migration was completed over a weekend, so disruption was minimal. The project ROI was realized in less than 6 months, and the increased efficiency and innovation have produced substantial corporate gains.

 

Contact Rick Passolt for more information on LinkLoader and LinkExport.
Webinar: Automate and Expedite PTC Windchill Bulk Loading

 

Date: August 17th, 2017

Time: 1:00-2:00pm CST

Speaker: Rick Passolt – Senior Account Executive

Register

The post A Fishbowl Success Story: The Benefits of Consolidating Disparate CAD Databases appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Global Financial Services Company Leverages Oracle WebCenter Content for Compound Document Management to Support Underwriting Memo Application

For this week’s case study, our featured customer, a global financial services company, came to Fishbowl looking to replace the system they used to compose financial underwriting documents. The existing system was 15 years old, and its vendor had since been sold, leaving the product to languish. Because the tool was no longer updated, it had become increasingly difficult to use and maintain in a fast-paced environment. The client therefore engaged Fishbowl to create a custom underwriting memo application on Oracle WebCenter Content.

The client and the Fishbowl Solutions product development team worked together to build, test, and deploy a modern solution on Oracle WebCenter Content. The collaboration paid off: WebCenter’s content management capabilities and user interface elements reduced credit memo application processing time by 25%.

 

BUSINESS DRIVERS
  • Reduce underwriting process time to enable faster transactions
  • Replace inefficient and archaic system for composing financial underwriting documents
  • Integrate and assemble all content needed for the underwriting process for users of the current credit application software
  • Ensure content needed for underwriting memo application is securely managed yet highly available
SOLUTION SUMMARY
  • Fishbowl configured Oracle WebCenter Content to manage all content needed for underwriting memo application
  • Integrated Fishbowl’s Compound Document Assembly within company’s credit underwriting system
  • Underwriting memo presented as chapters which include risk factors, business description, operating risk, etc.
  • Compound Document Assembly collates documents and includes non-text elements such as spreadsheets
  • Users can check in/check out the documents and their sections directly from underwriting memo application
  • Users can edit a section of the underwriting memo while another user edits a different section
  • Document structures can be viewed as tabs allowing users to quickly and easily navigate from one report to another
  • Users receive notifications related to any work within system
  • All changes tracked within underwriting memo and versions stored in Oracle WebCenter
CUSTOMER BENEFITS
  • Content management capabilities and user interface elements reduced credit memo application processing time by 25%
  • Content publishing time greatly reduced providing quicker reviews and increased collaboration for underwriting team
  • Documents can be collated and printed for reporting purposes

The post Global Financial Services Company Leverages Oracle WebCenter Content for Compound Document Management to Support Underwriting Memo Application appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Upgrading to Oracle WebCenter Content or Portal 12c: If not now, when?

Fishbowl Solutions will be kicking off a webinar series starting next Thursday, August 3rd. Our first webinar topic will be “5 Key Reasons to Upgrade to Oracle WebCenter Content or Portal 12c”. Why did we pick this topic, and why is this topic relevant now? Those are both good questions, especially if you are a well-informed WebCenter customer and you know that 12c was released almost 2 years ago.

To answer those questions, let me start by noting that Fishbowl Solutions has performed many WebCenter upgrades over the years. While each one differed in size and scope, we have seen common reasons and themes emerge around what drove customers to start their upgrades when they did.

Why upgrade to WebCenter 12c Now?
  • Get Current with Support and Maintenance
    • Premier and Extended support for 10g customers has ended. Most of the customers we talk to know this, but they might not know that they can upgrade directly from 10g to 12c. When you consider that Premier support for WebCenter Content and Portal 11g ends in December 2018, it makes sense to go directly to 12c instead of 11g. You can review Oracle’s Support Policies for Fusion Middleware here.
  • Explore Cloud Options for Content Management
    • With the release of 12c, Oracle introduced ways to integrate and share content between Oracle WebCenter on premise and the Oracle Content and Experience Cloud. This provided an easy way for organizations to share and collaborate on documents. If your organization is still deciding on your roadmap for content management – on premise, hybrid, cloud first – 12c provides the capabilities to explore use cases for the cloud while maintaining your content on premise.
  • Content and System Consolidation
    • Some legacy WebCenter customers come to the realization that they have too many instances of the system in place, along with disparate and duplicate content being managed. Rather than auditing each individual system to fix metadata issues, security groups, and so on, they find that an upgrade rectifies many of these problems and lets them retire content that no longer needs to be managed or retained.
  • Growing List of Environment & Technology Dependencies
    • Perhaps your organization wants to move to the latest version of Oracle Database, but you can’t because your legacy WebCenter system relies on an older version. Unless you upgrade WebCenter, your organization as a whole may be held back from using the newest versions of associated or dependent technologies.
  • User Expectations – Better User Experience
    • WebCenter Content and Portal 12c provide a better experience for both users and administrators. Organizations that want everyone to benefit from the improved interfaces start by considering who the system’s actual users are, then build an experience designed for each of those user personas. So while the upgrade to 12c alone would improve the overall experience, organizations use it as an opportunity to design the best experience possible and ensure widespread adoption and use.

We will discuss each of these in more detail during the webinar next Thursday. You can find more information and register for the webinar here.

We hope you can join us.

 

The post Upgrading to Oracle WebCenter Content or Portal 12c: If not now, when? appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Accessing Pivotal Cloud Foundry droplet file system when "cf ssh" isn't enabled

Pas Apicella - Thu, 2017-07-27 00:03
In order to view your application layout, you can simply use "cf ssh" to log into the container and view the files created as part of the droplet. The problem is that "cf ssh" isn't always enabled by the Ops team, so what is your alternative in Cloud Foundry?

You can use "cf curl" to invoke an endpoint using the application GUID as shown in the steps below.

** cf ssh demo **

pasapicella@pas-macbook:~/temp/droplets$ cf ssh pas-swagger-demo
vcap@ef9e4e93-0df9-47a7-5351-dccf:~$ ls -lartF
total 16
-rw-r--r-- 1 vcap vcap  675 Apr  9  2014 .profile
-rw-r--r-- 1 vcap vcap 3637 Apr  9  2014 .bashrc
-rw-r--r-- 1 vcap vcap  220 Apr  9  2014 .bash_logout
drwxr-xr-x 2 vcap vcap    6 Jun 14 03:32 deps/
drwxr-xr-x 1 vcap root   72 Jun 14 03:32 app/
-rw-r--r-- 1 vcap vcap 1087 Jun 14 03:32 staging_info.yml
drwxr-xr-x 2 vcap vcap    6 Jun 14 03:32 logs/
drwx------ 1 vcap vcap   76 Jun 14 03:32 ./
drwxr-xr-x 1 root root   18 Jul 26 23:45 ../
drwxr-xr-x 4 vcap vcap   92 Jul 26 23:48 tmp/
vcap@ef9e4e93-0df9-47a7-5351-dccf:~$

** Steps **

1. Download droplet as follows

Format:

   cf curl /v2/apps/`cf app {appname} --guid`/droplet/download > droplet.tar.gz

Example:

pasapicella@pas-macbook:~/temp/droplets$ cf curl /v2/apps/`cf app pas-swagger-demo --guid`/droplet/download > droplet.tar.gz

To determine the app name, you can either use the Applications Manager UI or run "cf apps".
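As a side note, the backticks in the format above perform shell command substitution: the inner `cf app {appname} --guid` runs first, and its GUID output is spliced into the URL. A cf-free sketch of the same mechanism (the GUID value here is made up for illustration):

```shell
# Command substitution: the inner command's output replaces the backticked
# expression. A stand-in echo is used so cf itself is not required.
GUID=`echo "e1a2b3c4-d5f6-7890-abcd-ef1234567890"`
echo "/v2/apps/$GUID/droplet/download"
# prints /v2/apps/e1a2b3c4-d5f6-7890-abcd-ef1234567890/droplet/download
```

In the real command, `cf app pas-swagger-demo --guid` prints the application GUID, so the path passed to `cf curl` resolves the same way.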


2. This will take some time due to the size of the droplet, but when it completes, verify you have the file on your file system

pasapicella@pas-macbook:~/temp/droplets$ ls -la
total 150736
drwxr-xr-x   3 pasapicella  staff       102 Jul 27 14:20 .
drwxr-xr-x  23 pasapicella  staff       782 Jul 27 14:19 ..
-rw-r--r--   1 pasapicella  staff  77173173 Jul 27 14:23 droplet.tar.gz

3. Run gunzip followed by tar -xvf, and you will then have a replica of your application droplet's file system as it looks in CF
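For reference, the two unpacking commands are simply (run in the directory holding the download from step 1):

```shell
# Unpack the droplet downloaded in step 1.
gunzip droplet.tar.gz   # decompresses to droplet.tar
tar -xvf droplet.tar    # extracts app/, deps/, logs/, tmp/, staging_info.yml
```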

pasapicella@pas-macbook:~/temp/droplets$ ls -lartF
total 313408
drwxr-xr-x   2 pasapicella  staff         68 Jun 14 13:32 deps/
drwxr-xr-x   6 pasapicella  staff        204 Jun 14 13:32 app/
drwxr-xr-x   2 pasapicella  staff         68 Jun 14 13:32 tmp/
-rw-r--r--   1 pasapicella  staff       1087 Jun 14 13:32 staging_info.yml
drwxr-xr-x   2 pasapicella  staff         68 Jun 14 13:32 logs/
drwxr-xr-x  23 pasapicella  staff        782 Jul 27 14:19 ../
-rw-r--r--   1 pasapicella  staff  160460800 Jul 27 14:23 droplet.tar
drwxr-xr-x   8 pasapicella  staff        272 Jul 27 14:25 ./


You really only want to do this to see how your application was staged on the file system, since the buildpack may have changed or added files based on what you deployed. This is not a debugging technique; rather, it is a way to view what your application's file system looks like and to inspect file contents in case the buildpack modified them.

Categories: Fusion Middleware

Pivotal Cloud Foundry Isolation Segments Applications demo

Pas Apicella - Mon, 2017-07-24 00:16
PCF Isolation Segments Tile allows operators to isolate deployment workloads into dedicated resource pools called isolation segments. You can read more about how to install the Tile at the following location.

https://docs.pivotal.io/pivotalcf/1-11/opsguide/installing-pcf-is.html

In this demo I will show how you can configure your ORGS/spaces to use an isolation segment and then finally show that it is indeed using the isolated Diego Cells assigned to the segment tile at install time.

1. Determine the isolation segment name as shown in the Ops Manager Tile



2. Ensure you have CF CLI version 6.26 or higher, as shown below

pasapicella@pas-macbook:~$ cf --version
cf version 6.28.0+9e024bdbd.2017-06-27

3. First, you need to register an Isolation Segment with the cloud controller.

Note: you will need a privileged user, such as admin, to perform this task.

pasapicella@pas-macbook:~$ cf create-isolation-segment segment1
Creating isolation segment segment1 as admin...
OK

4. After registering the Isolation Segment, you can associate it with an Org

pasapicella@pas-macbook:~$ cf enable-org-isolation pas-org segment1
Enabling isolation segment segment1 for org pas-org as admin...
OK

5. List isolation segments against targeted ORGS as follows

pasapicella@pas-macbook:~$ cf isolation-segments
Getting isolation segments as admin...
OK

name       orgs
shared
segment1   pas-org

6. After associating an Isolation Segment with an Org, you then set it on a Space

- First target the ORG you wish to use

pasapicella@pas-macbook:~$ cf target -o pas-org
api endpoint:   https://api.yyyy.pcfdemo.yyy
api version:    2.82.0
user:           admin
org:            pas-org
No space targeted, use 'cf target -s SPACE'

- Now set the isolation segment on your desired space

pasapicella@pas-macbook:~$ cf set-space-isolation-segment isolation-segment-space segment1
Updating isolation segment of space isolation-segment-space in org pas-org as admin...
OK

In order to move running applications to this isolation segment, they must be restarted.

7. Log in as a user of the ORG/Space now, rather than as the admin user. We will use a non-privileged user, with access to the ORG/Space we targeted above, to push our app. The output below simply shows we have switched to a different user.

pasapicella@pas-macbook:~$ cf target
api endpoint:   https://api.yyyy.pcfdemo.yyy
api version:    2.82.0
user:           pas
org:            pas-org
space:          isolation-segment-space

8. Push an application to the space "isolation-segment-space"

pasapicella@pas-macbook:~/piv-projects/PivotalSpringBootJPA$ cf push -f manifest-inmemory-db.yml
Using manifest file manifest-inmemory-db.yml

Creating app pas-albums in org pas-org / space isolation-segment-space as pas...
OK

....

0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

...


     state     since                    cpu      memory           disk           details
#0   running   2017-07-24 02:33:38 PM   225.3%   330.1M of 512M   162.8M of 1G

9. Identify the IP address of the Diego Cell the app is running on

pasapicella@pas-macbook:~/piv-projects/PivotalSpringBootJPA$ cf curl /v2/apps/$(cf app pas-albums --guid)/stats | jq '.[].stats.host'
"10.10.10.71"

With this information you can verify the IP addresses of the Diego Cells in your Isolation Segment using the Pivotal Ops Manager tile tabs. From the images below, it's clear 10.10.10.71 is a Diego Cell IP address in our isolation segment.







Categories: Fusion Middleware

Protecting Financial Data with Oracle WebCenter and Adobe LiveCycle

For over 15 years, Oracle WebCenter has been used by organizations to store, manage, and retain their high-value content. During that time, Fishbowl has helped customers leverage the system to solve many common and unique content management problems. We want to share some of those success stories with you, in the hope that they will help you form new ideas on how to further leverage WebCenter in your organization. Starting today, we will be publishing an “Oracle WebCenter case study of the week”. These case studies will highlight the ways customers are using WebCenter to solve their business problems and drive new process efficiencies.

This week’s customer case study details a global manufacturer of aluminum rolled products. The company came to Fishbowl in search of a solution that would make payroll information much more available to employees and financial officers, while also securing that information. Fishbowl utilized Oracle WebCenter Imaging & Capture and Adobe LiveCycle to satisfy this content management use case, and also helped the customer save around $75,000.

Business Drivers
  • Reduce costly distribution processes involving printing and mailing over 30,000 pages of reports per year.
  • Make access to payroll information much more readily available to employees and financial auditors.
  • Ensure payroll data stored in Oracle WebCenter is highly secure.
Solution Summary
  • Fishbowl implemented WebCenter Capture and Imaging to scan and manage over a dozen types of payroll-related reports, including payroll closing, direct deposits, W-4s, and garnishments.
  • Imaged documents are output to a directory where security policies are applied using Adobe LiveCycle’s Information Rights Management module, further preventing unauthorized document access.
  • Documents with security applied are uploaded and stored in the existing Oracle WebCenter Content instance, available for viewing by authenticated users.
Oracle WebCenter and Adobe LiveCycle

Document flow from capture with WebCenter to securing content with Adobe Information Rights Management.

Customer Benefits
  • Reduced estimated yearly cost of $75,000 to print and mail over 30,000 payroll-related documents.
  • Ensured that sensitive employee data cannot be seen by unauthorized users.
  • Created a much more accessible and simpler payroll processing system to manage and retain the company’s 16,000+ documents.

 

The post Protecting Financial Data with Oracle WebCenter and Adobe LiveCycle appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

The Future of Content Management: Oracle Content & Experience Cloud

What is the Content and Experience Cloud?

Content and Experience Cloud (CEC) is Oracle’s cloud platform solution for content management and delivery. It brings together Oracle’s Documents Cloud Service (Content) and Oracle’s Sites Cloud Service (Experience) to make a centrally managed platform for your business to contribute, collaborate, and share its content. It sets out to solve many of the headaches associated with content management solutions of the past and present, including:

  • Poor user experience
  • Security concerns
  • Limited access to content and collaboration

This parallels the motto Oracle has used in marketing its Documents Cloud Service: “Simple, Secure, Everywhere”.

In this post, I’m going to detail how Content and Experience Cloud meets each of these challenges, describing some of the features available. I’ll also give an overview of some of the custom development work I’ve done in the past few weeks, and what kind of enterprise applications could be developed using similar approaches.

Solving the Problems with Traditional Content Management Systems – Including Oracle WebCenter

User Experience – Low user adoption and poor user experience have been major challenges facing legacy content management systems. Oracle Content & Experience Cloud aims to remedy some of these problems in a number of ways.

  • Mobile, tablet, and desktop access:
    • Oracle adopted a mobile-forward design pattern for CEC interfaces to adjust for devices that can be used anywhere.
    • View, edit, and manage files from any of these devices with the applications Oracle has provided. All desktop and mobile application downloads can be found together on the “Download Apps” page of your CEC service interface, while mobile apps can also be found on both major mobile app markets (Android App and iPhone App).
  • Share files or folders simply, with the ability to assign access levels to limit what can be done to the content.
  • Conversations can be started about folders, files, or a separate topic altogether.
    • Annotations can be made on specific parts of a document.
    • Documents can be attached to conversations.
    • Conversations can be accessed from the web, desktop, and mobile apps.
  • Integrations exist out of the box with programs like Microsoft Word and Excel for syncing documents or spreadsheets to the cloud. A UI overlay appears in the program, visually confirming the document as it syncs to the cloud, and expands to provide actions like viewing content access history and versions, starting or viewing the document’s conversation, or sharing the document with other members or with anyone via generated public links. Additional actions also exist in the file menus, allowing users to manage nearly everything about their documents without leaving the editor.

Security – A concern of many businesses considering cloud content management is the safety of their files. Oracle secures files through a multi-layered approach.

  • Access to the CEC service requires a username and password managed by a service administrator.
  • Files are encrypted in storage and in transit to the cloud via SSL.
  • Content owners have access control to the content and folders, which can be customized for different tiers of access. Users who are given access to a file in a folder will not have access to the other files that exist within the folder.
  • Service admins have the option to configure virus scans on files upon upload to the cloud. Infected files will be quarantined from the system.
  • Passcodes can be set for mobile devices accessing the cloud. Any files downloaded from the cloud will additionally require authentication to the CEC app in order to be decrypted.
  • Websites can have security applied to control:
    • Individual user/group membership to the published site.
    • Who can see the site when it is (un)published.
    • Who can see or interact with secured content on the site.
  • CEC also includes access to analytics, auditing, and automatic backups.

Access to Content, and Collaboration – Productivity can suffer when content is difficult to access, or hard to find. Content and Experience Cloud provides availability to content anywhere, with streamlined methods of sharing and collaboration.

  • The CEC interface gives users the ability to rapidly collaborate internally or externally by sharing content with other members, or creating public links to folders or files.
  • Mobile, tablet, and desktop access out of the box allows users to view and manage content on the go.
  • Content can be worked on without internet access, and can be synced to the cloud once you regain connectivity.
  • Workflow and review processes allow content to be published easily and efficiently.
  • Conversations allow users to comment on files, folders, or digital assets (including the ability to highlight and annotate specific areas of text, and attach files to your comments).
Customizing Your Experience

Oracle provides several expanding development resources that can be used to customize sites on CEC. The modular structure of the site components, and the use of modern web libraries and frameworks like RequireJS, KnockoutJS, and Mustache templating, help streamline site development and create a richer, more responsive experience for the end user. I’ve developed a couple of proof-of-concept examples that can serve as stepping stones to custom enterprise components, whether static or dynamically accessing files housed in the cloud service.

Custom Component #1: Update Static Content without Coding

Using some of Oracle’s development documentation as a base, the first component I created demonstrates the ability to update static page content through custom settings without touching the code. It utilizes the SitesSDK, which provides a set of functions to integrate custom components with the Content and Experience Cloud. These functions are particularly helpful in providing storage and retrieval of custom settings used to configure components on the page.

For example, when the component is first set on the page, it will load the default settings values, and render them to the template. While editing the site, you can access the settings in the dropdown menu located on the top right of the component.

Custom settings were defined for each of the titles and descriptions of the tile elements. Simply update the input text for any of these fields in the form and press Enter, and the values update immediately on the component within the page. When I am happy with the changes, I can click “Save” and “Publish”, and those settings persist on the published site for everyone until they need to be changed again. Anyone with permission to edit the site can update these values in a matter of seconds and publish the changes without any outages. You can see that updating the “Title 1” field to “My Title” and the “Text 1” field to “My Description” updates the first tile within the component.

To demonstrate another use of custom settings, I’ve integrated a file picker that allows the user to navigate files stored in the cloud and select an image to display in the component on the page. Data returned by the SitesSDK can also give us information about the image, which may be useful depending on the demands of your component. The image and its details display immediately on the component, so the site editor gets a preview of the updated component before publishing it for everyone to see.

Custom settings provide a great way to manage elements of a page on your site that occasionally need manual changes, and don’t necessarily need to rely on pulling content dynamically from the cloud or another source. It gives site managers flexibility to make changes on the fly, and keep the site fresh and current for its audience.

Custom Component #2: Browser for Cloud Content

The second component I created utilizes Oracle’s Content Management API to build a content browser which displays previews, information, and actions on content living in the cloud. The API provides multiple endpoints to allow viewing, creating, modifying, and deleting folders and files. It can also retrieve information on users in the system. Oracle is working to extend the number and functionality of these endpoints in future releases.

In the above screenshot, you can see the documents view from the CEC interface, and the files that live in the “images” folder. Below is the screenshot of the custom component which grabs all of this information, and renders it to the site. The data returned in the responses make it possible to call for thumbnails of images and documents, as well as build actions like “View” and “Download” to open the full file on the CEC interface, or download the file respectively. This functionality can be used to create components that grab content dynamically and display it to your site as it is contributed to the cloud.

With an enterprise-level account, content administrators will have the ability to define their own structured content with access to Content Types, Content Items, Content Layouts, and Digital Assets. This allows the design of content specific to your business, and opens the door to develop components like a news feed which filters and displays only news content items in a widget on the page, or a search form which can return content filtered on any number of criteria.

Conclusion & Looking to the Future: Integrating with On-Premise and other Back-Office Applications

Content and Experience Cloud provides an ideal platform for content management in the cloud. It aggregates content, digital assets, conversations, and sites to a single location, where power users can delegate access to the people who need it, anywhere. Surface your content to sites on the cloud using custom components to build an interface that works for your business. Make updates quickly to provide always-current information without modifying site code, or taking the system offline. Oracle continues to improve and expand on the API endpoints and other development materials with future releases.

I will be working to integrate some of Fishbowl Solutions’ SPA taskflows into custom components for display on CEC Sites similar to what I’ve shown in the previous section, except the taskflow code will be hooked into an existing on-premise WebCenter Content instance to serve back content housed in a locally managed database rather than the Document Cloud Service. This will provide options to businesses looking to transition to the cloud service for benefits like site servers being hosted on the cloud, simple site/component management, and near-instant publishing, while still maintaining all the same content on-prem.

Another integration planned for future development is with the AuraPlayer service. AuraPlayer provides the ability to wrap existing Oracle Forms/EBS systems as web services, which can eventually be surfaced on a Content and Experience Cloud site as a modern, mobile-friendly, responsive UI. With CEC already accessible from tablet and mobile devices, it stands out as a strong platform candidate.

The post The Future of Content Management: Oracle Content & Experience Cloud appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Dark Eyes

Greg Pavlik - Sat, 2017-07-08 12:06
Dark Eyes
(A traditional Russian folk song translated by Greg Pavlik)
The eyes so black, passionate eyes
Fiery eyes, so beautiful!
How I love you - how I fear you!
You know I saw your eyes at the witching hour!

I cannot explain your deep, deep darkness!
I see a mourning within you for my soul,
I see the flame of triumph within you:
it consumes my sorrowing heart.
It does not make me bitter, it does not make me sad,
This my fate, my consolation:
All that is good is a gift from God,
And I sacrifice it all for those fiery eyes!

Original Russian by the Ukrainian poet and writer Yevhen Hrebinka.
1.
Очи чёрные, очи страстные,
Очи жгучие и прекрасные!
Как люблю я вас, как боюсь я вас!
Знать, увидел вас я в недобрый час!
2.
Ох, недаром вы глубины темней!
Вижу траур в вас по душе моей,
Вижу пламя в вас я победное:
Сожжено на нём сердце бедное.
3.
Но не грустен я, не печален я,
Утешительна мне судьба моя:
Всё, что лучшего в жизни Бог дал нам,
В жертву отдал я огневым глазам!

The river floes break in spring... take 2

Greg Pavlik - Sun, 2017-06-18 16:19
Alexander Blok
The river floes break in spring...
March 1902
translation by Greg Pavlik 



The river floes break in spring,
And for the dead I feel no sorrow -
Toward new summits I am rising,
Forgetting crevasses of past striving,
I see the blue horizon of tomorrow.

What regret, in fire and smoke,
the lament of the cross,
With each hour, with each stroke -
Or instead - the heavens’ gift stoked,
from the bush burnt without loss!

Original:

Весна в реке ломает льдины,
И милых мертвых мне не жаль:
Преодолев мои вершины,
Забыл я зимние теснины
И вижу голубую даль.

Что сожалеть в дыму пожара,
Что сокрушаться у креста,
Когда всечасно жду удара
Или божественного дара
Из Моисеева куста!
 
Март 1902
Translator's note: I updated this after some reflection. The original translation used the allegorical imagery that would have been common in patristic writing and hence Russian Orthodoxy. For example, I used the image of Aaron's rod in lieu of the word "cross", which appears in Russian (креста). The rod of Aaron was commonly understood to be a type of the cross in traditional readings of Old Testament Scriptures. Similarly, the final line of Blok's poem, "Из Моисеева куста", literally translates to "from Moses's bush". In my original translation, I rendered the final line "from the bush of Moses, the Mother of God". Since at least the 4th century, the burning bush was interpreted as a type of Mary, the Theotokos (or God-bearer), in the patristic literature (see, for example, Gregory of Nyssa, The Life of Moses). In Russian iconography there is even an icon type of the Mother of God called the Unburnt Bush. While the use of "rod" and "Mother of God" allowed me to maintain the rhyme pattern (rod/God in place of креста/куста) of the original poem, it created an awkward rhythm in the poem, especially in the final line. It also added explicit allusions to patristic images that are not explicitly present in the original poem, perhaps fundamentally altering the author's intention. A neat experiment, but also one that I think ultimately failed.

The new translation returns to a more literal rendering without allegory: "креста" means simply "cross", and that is how the poem now reads. The final line has been shortened from my original translation, though it is still not fully literal: "Из Моисеева куста" is now rendered as "from the bush burnt without loss" rather than the literal "from Moses's bush" or the more awkward original "from the Bush of Moses, the Mother of God". The new translation, I believe, captures the original meaning more closely and manages to maintain at least the rhyme pattern of the original (now cross/loss in place of креста/куста). Overall, this is far from a perfect translation, but I think it is an improvement.
One final comment about Blok himself, which perhaps illustrates why I am still conflicted about the changes to the final line: Blok was a master of the Symbolist movement in Russian poetry, and he worked unconventional rhythms and rhymes into his verse. On that score, I feel somewhat more at liberty to ignore the meter of the original and attempt to express something of a musical quality in English. However, Blok was also deeply influenced by the great philosopher Vladimir Soloviev, a proponent of Sophiology in the Russian intellectual tradition. This led him to write many of his early poetic compositions about the Fair Lady, Sophia, the embodiment of Wisdom. It is with this in mind that I feel some regret at removing the reference to the Mother of God, a human embodiment/enhypostatization of Divine Wisdom.

Swagger UI with Spring Boot 1.5.x

Pas Apicella - Tue, 2017-06-13 23:00
I recently created this demo / blog entry on using HTTPIE with Spring Boot Rest Repositories as shown below.

http://theblasfrompas.blogspot.com.au/2017/05/using-httpie-with-spring-boot-rest.html

I decided to take that same example and add Swagger UI to the RESTful endpoints. The full source code is here.

https://github.com/papicella/httpie-springboot

In short, all you need are the following Maven dependencies. I found everything works much more cleanly if you use the same version of both dependencies.
  
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.6.1</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.6.1</version>
</dependency>

Finally, a configuration class that enables Swagger is required, as follows:
  
package pivotal.io.boot.httpie.demo;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.service.ApiInfo;
import springfox.documentation.service.Contact;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;
import springfox.documentation.swagger2.annotations.EnableSwagger2;

import static springfox.documentation.builders.PathSelectors.regex;

@Configuration
@EnableSwagger2
public class SwaggerConfig
{
    @Bean
    public Docket swaggerSpringMvcPlugin() {
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.basePackage("pivotal.io.boot.httpie.demo"))
                .paths(regex("/api/employee/emps.*"))
                .build()
                .apiInfo(metaData());
    }

    private ApiInfo metaData() {
        ApiInfo apiInfo = new ApiInfo(
                "Spring Boot Employee REST API",
                "Spring Boot Employee REST API",
                "1.0",
                "Terms of service",
                new Contact("Pas Apicella", "https://www.blogger.com/profile/09389663166398991762", "papicella@pivotal.io"),
                "Apache License Version 2.0",
                "https://www.apache.org/licenses/LICENSE-2.0");
        return apiInfo;
    }
}

The GitHub repo also includes a Pivotal Cloud Foundry manifest.yml file to make it easy to deploy to Pivotal Cloud Foundry. The example uses a static hostname, but it can easily be changed to use a random route or a different hostname.

applications:
- name: pas-swagger-demo
  memory: 1G
  instances: 1
  hostname: pas-swagger-demo
  path: ./target/httpie-springboot-0.0.1-SNAPSHOT.jar
  env:
    JAVA_OPTS: -Djava.security.egd=file:///dev/urandom

Then it's a simple "cf push":

$ cf push

pasapicella@pas-macbook:~/piv-projects/httpie-springboot$ cf push
Using manifest file /Users/pasapicella/piv-projects/httpie-springboot/manifest.yml

Creating app pas-swagger-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Creating route pas-swagger-demo.cfapps.io...
OK

..

Showing health and status for app pas-swagger-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 1G x 1 instances
urls: pas-swagger-demo.cfapps.io
last uploaded: Wed Jun 14 03:32:31 UTC 2017
stack: cflinuxfs2
buildpack: container-certificate-trust-store=2.0.0_RELEASE java-buildpack=v3.15-offline-https://github.com/cloudfoundry/java-buildpack.git#a3a9e61 java-main java-opts open-jdk-like-jre=1.8.0_121 open-jdk-like-memory-calculator=2.0.2_RELEASE spring-auto-reconfigur...

     state     since                    cpu      memory         disk           details
#0   running   2017-06-14 01:33:40 PM   291.5%   510.9M of 1G   154.9M of 1G


The application is running on Pivotal Web Services as follows:

http://pas-swagger-demo.cfapps.io/swagger-ui.html



Categories: Fusion Middleware

Using HTTPIE with Spring Boot Rest Repositories

Pas Apicella - Sun, 2017-05-21 22:58
I recently got introduced to HTTPIE as a command-line alternative to cURL for testing RESTful API endpoints created using @RestController annotated classes. For more information on HTTPIE, follow this link.

Before we test this out, let's create a very basic Spring Boot application with classes/interfaces to verify HTTPIE. The following assumes you have a Spring Boot application already created, with Maven dependencies as follows to enable JPA, Rest Repositories, H2, and web support.

Note: We are using Spring Boot 1.5.3 here
  
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.5.3.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <java.version>1.8</java.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-entitymanager</artifactId>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>

1. Create classes/interfaces as follows

Employee.java
  
package pivotal.io.boot.httpie.demo;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Employee
{
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String firstName;
    private String lastName;
    private String job;

    public Employee()
    {
    }

    public Employee(String firstName, String lastName, String job) {
        this.firstName = firstName;
        this.lastName = lastName;
        this.job = job;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getFirstName() {
        return firstName;
    }

    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }

    public String getLastName() {
        return lastName;
    }

    public void setLastName(String lastName) {
        this.lastName = lastName;
    }

    public String getJob() {
        return job;
    }

    public void setJob(String job) {
        this.job = job;
    }

    @Override
    public String toString() {
        return "Employee{" +
                "id=" + id +
                ", firstName='" + firstName + '\'' +
                ", lastName='" + lastName + '\'' +
                ", job='" + job + '\'' +
                '}';
    }
}

EmployeeRepository.java
  
package pivotal.io.boot.httpie.demo;

import org.springframework.data.jpa.repository.JpaRepository;

public interface EmployeeRepository extends JpaRepository<Employee, Long> {
}

EmployeeRest.java
  
package pivotal.io.boot.httpie.demo;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;

import java.util.List;

@RestController
@RequestMapping("/api/employee")
public class EmployeeRest
{
    private static Log logger = LogFactory.getLog(EmployeeRest.class);

    @Autowired
    private EmployeeRepository employeeRepository;

    @GetMapping("/emps")
    public List<Employee> allEmployees()
    {
        return employeeRepository.findAll();
    }

    @GetMapping("/emp/{employeeId}")
    public Employee findEmployee(@PathVariable Long employeeId)
    {
        return employeeRepository.findOne(employeeId);
    }

    @PostMapping("/emps")
    public Employee createEmployee(@RequestBody Employee employee)
    {
        return employeeRepository.save(employee);
    }

    @DeleteMapping("/emps/{employeeId}")
    public void deleteEmployee(@PathVariable Long employeeId)
    {
        Employee emp = employeeRepository.findOne(employeeId);
        employeeRepository.delete(emp);
        logger.info("Employee with id " + employeeId + " deleted...");
    }
}

2. Run the Spring Boot application, which will listen on localhost:8080


  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::        (v1.5.3.RELEASE)

2017-05-22 13:39:22.910  INFO 8875 --- [           main] p.i.b.h.d.HttpieSpringbootApplication    : Starting HttpieSpringbootApplication on pas-macbook with PID 8875 (/Users/pasapicella/pivotal/DemoProjects/spring-starter/pivotal/httpie-springboot/target/classes started by pasapicella in /Users/pasapicella/pivotal/DemoProjects/spring-starter/pivotal/httpie-springboot)

...

2017-05-22 13:39:25.948  INFO 8875 --- [           main] s.b.c.e.t.TomcatEmbeddedServletContainer : Tomcat started on port(s): 8080 (http)
2017-05-22 13:39:25.952  INFO 8875 --- [           main] p.i.b.h.d.HttpieSpringbootApplication    : Started HttpieSpringbootApplication in 3.282 seconds (JVM running for 3.676)

Now we can test the endpoints with HTTPIE.

3. Here are some examples with output

** All Employees **

pasapicella@pas-macbook:~$ http http://localhost:8080/api/employee/emps
HTTP/1.1 200
Content-Type: application/json;charset=UTF-8
Date: Mon, 22 May 2017 01:26:43 GMT
Transfer-Encoding: chunked

[
    {
        "firstName": "pas",
        "id": 1,
        "job": "CEO",
        "lastName": "Apicella"
    },
    {
        "firstName": "lucia",
        "id": 2,
        "job": "CIO",
        "lastName": "Apicella"
    },
    {
        "firstName": "lucas",
        "id": 3,
        "job": "MANAGER",
        "lastName": "Apicella"
    },
    {
        "firstName": "siena",
        "id": 4,
        "job": "CLERK",
        "lastName": "Apicella"
    }
]

** Find Employee by {employeeId} **

pasapicella@pas-macbook:~$ http http://localhost:8080/api/employee/emp/1
HTTP/1.1 200
Content-Type: application/json;charset=UTF-8
Date: Mon, 22 May 2017 01:31:32 GMT
Transfer-Encoding: chunked

{
    "firstName": "pas",
    "id": 1,
    "job": "CEO",
    "lastName": "Apicella"
}

** POST new employee **

pasapicella@pas-macbook:~$ http POST http://localhost:8080/api/employee/emps firstName=john lastName=black job=CLERK
HTTP/1.1 200
Content-Type: application/json;charset=UTF-8
Date: Mon, 22 May 2017 02:32:34 GMT
Transfer-Encoding: chunked

{
    "firstName": "john",
    "id": 5,
    "job": "CLERK",
    "lastName": "black"
}

** POST with updated employee object **

pasapicella@pas-macbook:~$ http POST http://localhost:8080/api/employee/emps id:=5 firstName=john lastName=black job=CLEANER
HTTP/1.1 200
Content-Type: application/json;charset=UTF-8
Date: Mon, 22 May 2017 02:36:06 GMT
Transfer-Encoding: chunked

{
    "firstName": "john",
    "id": 5,
    "job": "CLEANER",
    "lastName": "black"
}

** Delete employee with {employeeId} 5 **

pasapicella@pas-macbook:~$ http DELETE http://localhost:8080/api/employee/emps/5
HTTP/1.1 200
Content-Length: 0
Date: Mon, 22 May 2017 02:36:56 GMT

Categories: Fusion Middleware

A Sneak Peek at Oracle’s Chatbot Cloud Service and 5 Key Factors Necessary for Bot ROI

In early May, I flew out to Oracle HQ in San Francisco for an early look at their yet-to-be-released Oracle Intelligent Bots Service.  The training left me ecstatic that the technology to quickly build great chatbots is finally here. However, the question remains: can chatbots provide real value for your business?

What is a chatbot?

A chatbot is a program that simulates a conversation partner over a messaging app. It can integrate with any kind of messaging client, such as Facebook, WeChat, WhatsApp, Slack, Skype, or you could even build your own client. If you’ve been following our blog, you may have already seen the chatbot (Atlas) we built as part of our annual hackathon.

Here is an example conversation I had with Atlas recently:

Chatbot Conversations

Chatbots use Natural Language Processing and Machine Learning algorithms to take what the user said and match it up against pre-defined conversations. Understanding how chatbots recognize phrases can help determine what conversations a user could have with a bot. Here is some chatbot terminology:

  • An intent is something the user wants, and the bot maps this to an action. For example, the user might want to say some form of “Hi” to the bot, and we would want the bot to respond with a random greeting. A chatbot generally has up to 2,000 intents.
  • Utterances are examples of different phrases that represent an intent. An intent might have 10-15 utterances. The bot will be able to match statements similar to those utterances to the intent, but what a user says doesn’t have to exactly match an utterance. This is where the language processing algorithms are used.
  • Entities are key variables the bot can parse from the intent.

Suppose we are building an HR chatbot that can help users reset passwords. The goal is for our bot to understand that the user needs a password reset link, and then send the correct link to the user. Our intent could be called Password Reset. Since the user could have accounts for different services, we would need to create an entity called AccountType for our bot to parse from what the user said. AccountType could map to “Gitlab”, “WebCenter”, or “OpenAir”.

As a rough design, we could start with:

  • Intent: Password Reset
  • Utterances:
    • I’d like to reset my password.
    • How do I change my password for Gitlab?
    • I forgot my WebCenter pw, can you help?
    • Please assist me in receiving a new password.
    • Forgot my passcode for OpenAir.
    • Give me another password.
  • Entity: AccountType (Gitlab, WebCenter, OpenAir)

Intents like this one will need to be set up for a bot to know what to do when a user says something. If a user asks the bot a question it doesn’t have an intent for, it won’t know what to do and the user will get frustrated. Our bot still won’t know how to order a pizza, but it could help with password resets.
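To make the terminology concrete, here is a toy sketch of how utterances could be matched to an intent and an AccountType entity parsed out. This is illustrative only: it uses simple word overlap in place of the real Natural Language Processing and Machine Learning algorithms a platform like Oracle's provides, and the intent/utterance table is the hypothetical design above.

```java
import java.util.*;

// Illustrative only: a toy keyword-overlap matcher, not Oracle's NLP engine.
public class IntentMatcher {
    // Hypothetical intent -> utterances table, following the design above.
    static final Map<String, List<String>> INTENTS = Map.of(
        "PasswordReset", List.of(
            "i'd like to reset my password",
            "how do i change my password for gitlab",
            "i forgot my webcenter pw can you help",
            "forgot my passcode for openair"),
        "Greeting", List.of("hi", "hello there", "good morning"));

    // Values the AccountType entity can take.
    static final Set<String> ACCOUNT_TYPES = Set.of("gitlab", "webcenter", "openair");

    // Score each utterance by the number of words it shares with the user's text;
    // the intent owning the best-scoring utterance wins.
    static String bestIntent(String text) {
        Set<String> words = tokens(text);
        String best = "unresolved";
        int bestScore = 0;
        for (Map.Entry<String, List<String>> e : INTENTS.entrySet()) {
            for (String utterance : e.getValue()) {
                Set<String> shared = new HashSet<>(tokens(utterance));
                shared.retainAll(words);
                if (shared.size() > bestScore) {
                    bestScore = shared.size();
                    best = e.getKey();
                }
            }
        }
        return best;
    }

    // Entity extraction: pick out a known AccountType value, if present.
    static Optional<String> accountType(String text) {
        return tokens(text).stream().filter(ACCOUNT_TYPES::contains).findFirst();
    }

    private static Set<String> tokens(String s) {
        return new HashSet<>(Arrays.asList(s.toLowerCase().split("\\W+")));
    }

    public static void main(String[] args) {
        System.out.println(bestIntent("How do I reset my Gitlab password?")); // PasswordReset
        System.out.println(accountType("How do I reset my Gitlab password?").orElse("none")); // gitlab
    }
}
```

Note that the user's question matches the PasswordReset intent without exactly matching any utterance, which is the behavior described above; a real engine generalizes far better than word overlap.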

Key Factor #1: Chatbots should have a purpose

A chatbot can only answer questions it is designed to answer. If I was building an HR Help chatbot, it probably would not be able to order a pizza, rent a car for you, or check the weather. It could, for example, reset passwords, report harassment, set up a new hire, and search for policies. Once the requirements are set, developers can build, design, and test to ensure the bot has those capabilities.

This makes it important to set expectations with the user on what types of questions they can ask, without handing the user a list of questions. Introducing a bot along with its purpose will help with this. For example, we could have the HR Help bot, the Travel Planning bot, or the Sales Rep Info bot. If we introduced the Fishbowl Ask-Me-Anything bot, users would start asking it a lot of questions we didn’t plan for it to answer.

Conversations can be more complicated than a simple back and forth, or question and answer. The capability is there (Oracle’s solution gives developers full control over a Conversational State Machine), but I have yet to explore the full capabilities.
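As a generic, hypothetical sketch of what a multi-turn conversational state machine looks like (this is not Oracle's actual API, and the reset URL is a placeholder), the password-reset flow could be modeled as a small set of states with transitions driven by user messages:

```java
import java.util.*;

// Generic illustration of a conversational state machine -- not Oracle's API.
public class PasswordResetFlow {
    enum State { START, AWAIT_ACCOUNT_TYPE, DONE }

    private State state = State.START;

    // Feed one user message in; get the bot's next reply back.
    public String handle(String message) {
        switch (state) {
            case START:
                // First turn: we know the intent, but still need the entity.
                state = State.AWAIT_ACCOUNT_TYPE;
                return "Which account is this for: Gitlab, WebCenter, or OpenAir?";
            case AWAIT_ACCOUNT_TYPE:
                String m = message.trim().toLowerCase();
                if (Set.of("gitlab", "webcenter", "openair").contains(m)) {
                    state = State.DONE;
                    // Placeholder URL -- a real bot would look this up per service.
                    return "Here is the " + m + " password reset link: https://example.com/" + m + "/reset";
                }
                return "Sorry, I only know Gitlab, WebCenter, and OpenAir. Which one?";
            default:
                return "Anything else I can help with?";
        }
    }

    public static void main(String[] args) {
        PasswordResetFlow flow = new PasswordResetFlow();
        System.out.println(flow.handle("I forgot my password"));
        System.out.println(flow.handle("WebCenter"));
    }
}
```

The point of the state machine is that the bot can ask follow-up questions when an entity is missing, rather than forcing the user to pack everything into one sentence.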

Once a purpose and a set of intents are identified, a chatbot could be a useful tool to engage customers or employees.

Key Factor #2: Design Architecture

Bots are great for interacting with different services. Oracle Intelligent Bot Service is designed to make it easy for developers to make REST API calls and database lookups between parsing what the user says and returning a response.

Here are a few things to think about when designing a bot’s architecture:

  • Integrations: What services will the bot interact with?
  • Security: Are users typing their bank account number over Facebook chat?
  • Human interaction: How will the bot flip users over to a human to help when they get frustrated?
  • Infrastructure: What will be on premise and what will be in the cloud?
  • Performance: How will network requests be minimized?

Key Factor #3: Analytics

Analytics can be used to improve the bot’s capability over time and understand the impact on the company. Some companies may already have metrics around help desk call volume or customer conversion rates, and it would be interesting to compare that data from before and after a bot’s release.

Beyond that, bot analytics will be able to show the performance of the bot. Analytics could show the top questions a bot is asked but can’t answer, how many questions it answers successfully each day, and what questions it mistook for something else. Oracle’s chatbot solution will have some capabilities built in, and the platform is so flexible it will be possible to gather any data about a bot.

Key Factor #4: Bot Building Best Practices

There is a lot to do when it comes to building a bot: setting up the infrastructure, connecting all the services, and filling out all the utterances. There are also some best practices to keep in mind.

The bot should sound like a human. Personality can play a big role in giving users a better interaction.

As users become more familiar with chatbots, there will also be a set of questions they expect every bot to be able to answer. This list might start with:

  • Hi.
  • What do you do?
  • Are you human?
  • Help!
  • Tell me a joke.
  • How are you?

When the bot is going to run a query or API that may take a while, it is important to warn the user in advance and echo that the bot understood what the user wanted. Some apps will also support “is typing” statuses, which is another great way to show the bot is thinking.

Key Factor #5: Testing

Users have high expectations for the intelligence level of a chatbot. They expect the Machine Learning algorithms to work well, and the bot to seem smart. If the bot doesn’t meet their expectations on the first try, they are unlikely to use the bot in the future.

Testing and tuning utterances can make the difference in making a bot seem smart. The bot should be able to accurately map what a user says to the correct intent. Oracle’s chatbot solution has some nice testing capabilities around utterances and intents, for making sure what the user says maps correctly.

Chatbots are, in the end, another piece of software, so it is important to do performance and user testing on them as well.

Conclusion

Chatbots are a great way to tie in a single user interface to a large variety of services, or automate repetitive conversations. There are plenty of business use cases that would benefit from a chatbot, but the ROI depends on thorough requirements gathering and using analytics to optimize the bot. That being said, companies that have already started down the path – like this Accounting Firm in Minneapolis – are seeing benefits from bots automating manual processes leading to a reduction in operating costs by 25 to 40%. Savings like this will vary across use case and industry, but overall the automation gains from a bot are there regardless of what the bot is being used for. We would love to discuss your ideas on how a chatbot could help your business. Leave a comment or contact us with any questions.

The post A Sneak Peek at Oracle’s Chatbot Cloud Service and 5 Key Factors Necessary for Bot ROI appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Unboxing the Future of Fishbowl’s On-premise Enterprise Search Offering: Mindbreeze InSpire

Back on April 3rd, Fishbowl announced that we had formed a partner relationship with Mindbreeze to bring their industry leading enterprise search solutions to Fishbowl customers. We will offer their Mindbreeze InSpire search appliance to customers looking for an on-premise solution to search internal file shares, databases, document management systems and other enterprise repositories.

Since that announcement, we have been busy learning more about Mindbreeze InSpire, including sending some members of our development team to partner technical training in Linz, Austria. We have also procured our own InSpire search appliance so that we can begin development of connectors for Oracle WebCenter Content and PTC Windchill, and we will begin using InSpire as the search system for our own internal content.

Fishbowl’s Mindbreeze InSpire appliance arrived last week, and we wanted to share a few pics of the unboxing and racking process. We are very excited about the value that Mindbreeze InSpire will bring to customers, including eliminating the time wasted searching for, and in many cases not finding, high-value information. Consider these stats:

  • 25% of employees’ time is spent looking for information – AIIM
  • 50% of people need to search 5 or more sources – AIIM
  • 38% of time is spent unsuccessfully searching and recreating content – IDC

Stay tuned for more information on Fishbowl’s software and services for Mindbreeze InSpire. Demos of the system are available today, so contact us below or leave a comment here if you would like to see it in action.


The post Unboxing the Future of Fishbowl’s On-premise Enterprise Search Offering: Mindbreeze InSpire appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

How to Configure Microsoft IIS with Oracle WebCenter

I was setting up an Oracle WebCenter 12c suite in a local development environment on Windows Server 2012 R2 with Microsoft SQL Server. Instead of using OHS (Oracle HTTP Server), I wanted to try using Microsoft IIS (Internet Information Services) to handle the forwarding of sub-sites to the specified ports.  Since the Oracle applications run on specific ports (e.g. 16200 for Content Server), a browser request to the domain on the default ports (80 and 443) won’t reach the content server – for example: www.mydomain.com/cs vs. www.mydomain.com:16200/cs. I chose IIS because it is a feature already built into Windows Server, and thus one less application to manage.

That being said, IIS and OHS perform in the same manner but are set up and configured differently based on requirements.  Oracle provides documentation about using the Oracle plug-in for Microsoft IIS, but the content on the Oracle site is quite outdated.  The page first references IIS 6.0, which was released with Windows Server 2003 in April 2003 and whose support ended on July 14, 2015. Lower on the page, there are steps for IIS on Windows Server 2012 R2, which got me started.  In the rest of this post, I will review the steps I took to get all functionality working, as well as the limitations/flaws I encountered.

Step 1: Install IIS on the Server

The first part was to install IIS on the server.  In Server 2012, open the Server Manager and select Add Roles and Features.  From there select the option to add the IIS components.

Step 2: Select Default Web Site

Once IIS has been installed, open it and select the Default Web Site.  If you right-click and select Edit Bindings, you can see the default site is bound to port 80, which is what we want since port 80 is the default HTTP port for web applications.

Step 3: Select Application Pools

Following the instructions from Oracle, download the plug-in and place it in a folder near the root of the desired drive.  For this blog, I have it in C:\IISProxy\.  For each server (Content Server, Portal, etc.) you need to perform configuration in IIS.  Open IIS and navigate to the Application Pools section.  Select Add Application Pool and create a pool with a specific name for each server.  There must be separate application pools for the specific port forwarding to work correctly.

Step 4: Configure Properties

Once the pool is created, open Windows Explorer and create a folder inside IISProxy called “CS.”  Copy all the plug-in files into the CS folder.  Now open the iisproxy.ini file and configure the properties to match your environment.  Make sure to set the Debug parameter to suit your environment.
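As a point of reference, a minimal iisproxy.ini for the Content Server might look like the following. The hostname, port, and log path here are placeholder assumptions for a local setup; the full parameter list is in Oracle's plug-in documentation.

```
# Hypothetical example values -- point these at your own WebLogic managed server
WebLogicHost=wcc.mydomain.com
WebLogicPort=16200
WlForwardPath=/cs
Debug=OFF
WlLogFile=C:\IISProxy\CS\iisproxy.log
```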

Step 5: Select the Created Application Pool

Open IIS and select the Default Web Site option.  Right-click and select Add Application.  Add the alias name and select the application pool created above.  Set the physical path to the folder created above and make sure the connection is set up for pass-through authentication.

Step 6: Set Up Handler Mappings

Once OK has been selected, the application should now be displayed in the tree on the left.  The next step is to set up handler mappings for how IIS will handle incoming requests.  Click on the “cs” application you just created; on the main display there should be a Handler Mappings icon. Double-click the icon.  This is where we will set up the routing of static files vs. content server requests. On the right side, click the “Add Script Map” icon.  Add the request path “*” and add the folder path to the iisproxy.dll.  Open the request restrictions and verify the “Invoke handler…” checkbox is unchecked.  Open the Access tab and select the Script radio button.  Click OK and verify the mapping has been applied.


Step 7: Map Static Files

Next, we will set up the mapping for static files.  Click “Add Module Mapping”.  Add “*” for the request path, “StaticFileModule,DefaultDocumentModule,DirectoryListingModule” for the module, and give it a name.  Open request restrictions and select the file-or-folder radio option.  Navigate to the Access tab and select the Read radio button.  Click OK and verify the mapping was applied.


Step 8: Verify Mapping Execution

After the mappings have been set up, we need to verify they are executed in the correct order.  Do this by going back to the Handler Mappings screen and clicking “View Ordered List”.

Step 9: Restart the IIS Server

After these steps are completed, restart the IIS server.  To do this, open a command prompt as an administrator and type “iisreset”.  Once restarted, you should now be able to view the content server on port 80.  If you have other redirects you would like to perform, you can repeat the same steps above with a different name (e.g. Portal, Inbound Refinery, Console, Enterprise Manager, etc.).

With Oracle’s tutorial out of date and missing key steps, it was difficult to determine how to set everything up.  With some trial and error and investigation, the 9 steps above should help you quickly set up IIS with the WebCenter suite on a Windows environment so that specific port numbers are not needed.  Obviously, as with any technology decision, application evaluations should take place to determine whether IIS or OHS is a better fit. Good luck, and leave a comment if you have any questions or need further clarification.

The post How to Configure Microsoft IIS with Oracle WebCenter appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Fishbowl Innovation: Cloud 2 Cloud Content Migrations for Oracle Content and Experience Cloud (Oracle Documents Cloud Service) and Other Cloud Storage Providers

Migrating content between systems takes a lot of time, and without methods to bulk load or schedule, the process may proceed incrementally, with users moving content whenever they have time. But what if they don’t have time? What if they had been using an unauthorized cloud storage system, or the system they were using is being decommissioned by the company? How can they quickly move high-value content to another system?

For example, let’s say a business division within a company is using Microsoft OneDrive as a collaboration and document sharing system. Then, that company decides that OneDrive is no longer an accepted or company preferred file sync and share system, and employees should use Oracle Documents Cloud Service. For the division using OneDrive, such a declaration could cause some delays in any of their processes that rely on that content. An example of this could be the inability to collaborate with 3rd-parties – such as a law firm – on documents being reviewed by both parties. The process of downloading copies of the content and uploading them to another system could take a significant amount of time, but until that content gets moved over to, in this case Oracle Documents Cloud Service, critical processes could be severely delayed.

The hackathon event team for this solution set out to provide a web-based interface enabling single-item and batch migrations between systems, with content either moved (removed from the source system) or simply copied.  As more and more of these easy-to-use document sharing solutions enter an organization, such a tool could be quite beneficial to ensure content can be easily accessed and shared across systems.

One important point to note about such a solution, and the use of cloud storage systems across an organization, is that governance and acceptable use policies for cloud storage/enterprise file sync and share systems need to be clearly defined. Although the solution developed by Fishbowl could help an organization migrate content to the standardized cloud storage solution, companies need to be proactive with monitoring employee use as they may try and utilize other systems. This can pose security risks – both from a sharing of confidential information perspective and opening up new avenues for cyber attacks. To combat this, solutions like Oracle’s Cloud Access Security Broker (CASB) could be leveraged to provide visibility into the cloud systems being used, and provide security monitoring and threat detection across your entire cloud technology stack.

The screenshot below shows the simple user interface to select available systems and begin migrating content. Fishbowl has customers using this cloud migration tool today, so if you are interested in learning how it could help you expedite your cloud to cloud content migrations, contact us now – info@fishbowlsolutions.com or 952-465-3400.

Here are the technologies the team used to develop Cloud 2 Cloud. If you would like more information on the technical specifics of the solutions, please leave a comment and we will get back to you ASAP.

  • REST API
  • Mustache – web template system for mobile and web applications
  • Debugging tools Fiddler and Postman


Cloud 2 Cloud Migration.

The post Fishbowl Innovation: Cloud 2 Cloud Content Migrations for Oracle Content and Experience Cloud (Oracle Documents Cloud Service) and Other Cloud Storage Providers appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Fishbowl Innovation: ATLAS – Intelligent Chatbot and Virtual Assistant for Oracle Applications

If you haven’t heard about chatbots yet, you will soon. Today’s leading technology companies – Apple, Google, Amazon, Microsoft, Oracle – all have bot development strategies underway this year. Why? To answer that question, let’s first define the two types of bots being used today. The first is called a “virtual assistant”, which is effectively Apple’s Siri or Amazon’s Alexa. Virtual assistants can help you find or remember things, or even buy things (Dom the pizza bot). These bots are powered by machine learning, which means they get smarter over time as more people use them and their artificial intelligence learns what people are asking them to do.

The other type of bot is a messaging bot, which has become quite prevalent within Facebook Messenger. This type of bot can be used for customer service to answer simple questions such as “what are your store hours?” The responses returned by the bot have all been programmed, so if it is asked a question outside of its pre-defined responses, it won’t be able to interact with the user.
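The “everything is pre-programmed” nature of a messaging bot can be sketched in a few lines. The rules and store hours below are hypothetical examples, not any real bot’s responses:

```java
import java.util.Locale;
import java.util.Map;

// Sketch of a rule-based messaging bot: every response is pre-programmed,
// so any question outside the rules falls through to a fallback message.
// The rules and answers here are hypothetical.
public class FaqBot {
    private static final Map<String, String> RULES = Map.of(
            "store hours", "We are open 9am-5pm, Monday through Friday.",
            "return policy", "Items can be returned within 30 days with a receipt.");

    public static String reply(String question) {
        String q = question.toLowerCase(Locale.ROOT);
        for (Map.Entry<String, String> rule : RULES.entrySet()) {
            if (q.contains(rule.getKey())) {
                return rule.getValue();
            }
        }
        // No pre-defined response matched: the bot cannot go off-script.
        return "Sorry, I can't help with that. A human will follow up.";
    }

    public static void main(String[] args) {
        System.out.println(reply("What are your store hours?"));
    }
}
```

Contrast this with a virtual assistant, where the mapping from question to answer is learned rather than hand-written.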

So, back to the question, why are bots so popular? Because bots, in many cases, can provide answers to questions or facilitate a purchase faster than a human can. Consider this stat by Deloitte – a minute of work for a bot is equal to 15 minutes of work for a human. Additionally, because messaging apps are ubiquitous (1.3 billion people use them), companies have developed bots to engage and market to users 24 x 7. To look at this from a business perspective, consider the following use cases:

  • Requesting pricing and availability of a product
    • During a sales meeting, you could type the following into a messaging service on your phone or laptop: “what is the pricing and availability of product widget ABC?” The bot would then query the ERP system for this product and return your answer – “product widget ABC is $299 and can ship today.”
  • Logging your billable hours into a project management system
  • Providing quick answers to simple questions such as “how many PTO days do I have left”?
  • Resetting your password
  • Asking for specific content to be delivered to you, such as a product brochure, price list, or instruction manual
  • Ordering new business cards, which was the example that Larry Ellison shared at Oracle OpenWorld 2016

With each of the examples above, the time savings of not having to log onto a system and perform multiple clicks to facilitate such requests could be huge – especially for employees on the go, such as sales staff, who need information quickly. All the examples above are also opportunities to reduce the number of calls and service requests to your help desk. According to this press release from Kore Inc., about 20% of IT help desk calls are still related to password resets, an inefficiency that can cost businesses between $15 and $20 per call.
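To make the password-reset figure concrete, here is a back-of-the-envelope calculation. The call volume is an assumed figure for illustration; only the ~20% share and the $15–$20 per-call range come from the press release cited above:

```java
// Illustrative arithmetic only: the 10,000 calls/year volume is assumed;
// the 20% password-reset share and $15-$20 per-call cost come from the
// Kore Inc. press release cited in the text ($17.50 splits that range).
public class HelpDeskSavings {
    public static double annualResetCost(int callsPerYear, double resetShare, double costPerCall) {
        return callsPerYear * resetShare * costPerCall;
    }

    public static void main(String[] args) {
        System.out.printf("Potential annual cost of password-reset calls: $%.2f%n",
                annualResetCost(10_000, 0.20, 17.50));
    }
}
```

Under those assumptions, a bot that handles even half of the reset calls would pay for a meaningful chunk of its own development.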

The chatbot that was developed during Fishbowl’s hackathon was positioned as a personal assistant in the cloud for document management. The team showed how Atlas could be used with a team collaboration system like Slack, and integrated with Oracle WebCenter to retrieve documents based on simple input from the user – for example, “find a document – invoice 123456”, followed by “filter by user jsim”. Here are the technologies the team used to develop and integrate Atlas:

Here are some screenshots of this use case, including the bot running within Fishbowl’s demo instance of Oracle WebCenter Portal to show an example of employee self-service. Contact us today for more information on ATLAS and intelligent chatbots – info@fishbowlsolutions.com or 952-465-3400. If you would like more technical information on how Atlas was built and our approach to developing intelligent chatbots for Oracle applications, leave us a comment and we will respond directly.

 

ATLAS returning results for document named Invoice 123456.

 

Atlas filtering results returned by author jsim.

 

ATLAS performing employee self-service actions in WebCenter Portal.
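As a rough sketch of the kind of input handling shown in the screenshots above, here is how chat input like “find a document – invoice 123456” followed by “filter by user jsim” might be turned into a search intent. This is an illustration under assumed command formats, not Atlas’s actual implementation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch (not Atlas's actual code) of parsing chat commands
// like "find a document - invoice 123456" and "filter by user jsim" into
// a structured intent before querying a content repository.
public class ChatIntentParser {
    private static final Pattern FIND =
            Pattern.compile("find a document\\s*[-–]\\s*(.+)", Pattern.CASE_INSENSITIVE);
    private static final Pattern FILTER =
            Pattern.compile("filter by user\\s+(\\S+)", Pattern.CASE_INSENSITIVE);

    public static String parse(String input) {
        Matcher find = FIND.matcher(input.trim());
        if (find.matches()) {
            return "SEARCH:" + find.group(1).trim();   // e.g. SEARCH:invoice 123456
        }
        Matcher filter = FILTER.matcher(input.trim());
        if (filter.matches()) {
            return "FILTER_AUTHOR:" + filter.group(1); // e.g. FILTER_AUTHOR:jsim
        }
        return "UNKNOWN"; // unrecognized input would prompt the user for clarification
    }

    public static void main(String[] args) {
        System.out.println(parse("find a document - invoice 123456"));
        System.out.println(parse("filter by user jsim"));
    }
}
```

A production bot would hand the parsed intent to a search API and carry the previous result set forward so that the filter command refines it.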

The post Fishbowl Innovation: ATLAS – Intelligent Chatbot and Virtual Assistant for Oracle Applications appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Fishbowl Innovation: Controlled Document Management in the Cloud

Fishbowl Solutions has been delivering enterprise content management (ECM) solutions based on Oracle WebCenter for over 17 years. WebCenter is the only ECM platform we have built a consulting practice around and developed software solutions for. Its comprehensive capabilities have satisfied numerous customer use cases, including employee portals, contract management, and quality control. That being said, we understand customers have other use cases for storing and managing their high-value content, and more recently that includes document storage in the cloud.

To satisfy use cases where companies manage the majority of their content with on-premise solutions like WebCenter, but may need a simple, cloud-based solution to manage specific documents that are part of a controlled process (contracts, policies and procedures, etc.), Fishbowl developed a proof of concept (POC) for lightweight ECM in the cloud. This solution would provide a low barrier to entry for customers wanting content management capabilities through a simplified user interface that includes a dashboard, a document list, and profile-driven metadata fields. The other obvious benefit this solution would provide is a much lower overall cost, due to a cloud-based subscription model and less need for development resources and system administrators.
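The profile-driven metadata idea can be sketched as follows. The field names and the validation rule are hypothetical illustrations, not the POC’s actual model:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of profile-driven metadata: a document profile declares
// which metadata fields are required, and a document's metadata is validated
// against that profile before upload. Field names here are illustrative only.
public class ProfileDemo {

    static boolean isValid(List<String> requiredFields, Map<String, String> metadata) {
        // Every field the profile requires must be present and non-blank.
        return requiredFields.stream()
                .allMatch(f -> !metadata.getOrDefault(f, "").trim().isEmpty());
    }

    public static void main(String[] args) {
        List<String> contractProfile = List.of("owner", "effectiveDate");
        Map<String, String> metadata = Map.of("owner", "jsim", "effectiveDate", "2017-04-07");
        System.out.println(isValid(contractProfile, metadata)); // both required fields present
    }
}
```

The payoff of the profile approach is that the UI can render only the fields relevant to a document type, which is what keeps the interface simple.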

From a development and technology perspective, the team working on this POC discussed how workflow, revisioning, and security/permissions would all need to be included to make this a viable solution. Here are some of the technologies they leveraged to develop the solution:

The following are some screenshots of the solution as it appears running on the Google Cloud Platform, but the flexibility of the technologies used to develop the solution means it could integrate with other cloud platforms like Oracle Content and Experience Cloud. Contact us today if you would like more information – info@fishbowlsolutions.com or 952-465-3400. If you are interested in learning more and discussing the technologies involved in the development, please leave a comment and we will get some dialogue going there.

 

The post Fishbowl Innovation: Controlled Document Management in the Cloud appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Hackathon Weekend Recap: Oracle Chatbots, Cloud Content Migrations, and Controlled Document Management in the Cloud

Wow! It’s always amazing what can be accomplished in a weekend when you give people limitless supplies of calories and caffeine, as well as a hard deadline. Couple that with a lot of competitive fire and the drive to innovate, and you just might produce THREE new software solutions and spark new product ideas. That’s what transpired during the weekend of April 7th as Fishbowl Solutions held its annual hackathon event. Over 250 hours were spent across three teams as they architected, designed, and developed their solutions. Teams then had to present the business use case for their solution and demo it to the Fishbowl employees who did not participate in the hackathon. The non-participants then voted for the winner. What follows is a recap of the solutions developed, in order of where they placed after the voting.

Controlled Document Management in the Cloud

Team Members: Andy Weaver, Lauren Beatty, Nate Scharber, Brent Genereaux, Amy Mellinger
Solution Summary: The goal for this team was to develop a controlled document management solution in the cloud. Essentially, the team wanted to produce a lightweight, cloud-based version of Fishbowl’s flagship controlled document management solution called ControlCenter. Their demo showed how easy it would be to provision users for the cloud instance, and then the simple steps users would take to begin uploading, reviewing and managing documents in the cloud.

For more information on this solution including some business use cases and screenshots, read this blog post — Fishbowl Innovation: Controlled Document Management in the Cloud

Team Controlled Document Management in the Cloud

ATLAS – Intelligent Chatbot

Team Members: John Sim, Danny Lesage, Amanda Jovanovic, Matt Hornung, Sean Deal
Solution Summary: This team was all about bots. What’s a bot? Well, it’s software that can run automated tasks over an Internet connection. Fishbowl’s resident Oracle UX expert, John Sim, is from the United Kingdom, and while visiting he shared how Dom the pizza bot enables customers to order pizzas from Domino’s using Facebook Messenger. Sadly, Dom can currently only facilitate such requests in the United Kingdom, but this provided a great example of a bot for personal use (and made everyone hungry for pizza). However, Fishbowl isn’t in the business of “chat commerce” for food, so the team set out to develop a chatbot that could help users find content stored in Oracle WebCenter.

For more information on this solution including some business use cases and screenshots, read this blog post — Fishbowl Innovation: ATLAS – Intelligent Chatbot and Virtual Assistant for Oracle Applications

Team ATLAS: Intelligent Chatbot

Cloud 2 Cloud Content Migrations

Team Members: Tim Gruidl, Jake Ferm, Dan Haugen, Tom Johnson
Solution Summary: The premise of this solution was the proliferation of cloud storage/file sync and share systems within an organization, and how many steps it takes to migrate or copy content between them – for example, moving or copying content from Microsoft OneDrive to Oracle Documents Cloud Service.
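The copy-versus-move distinction described above can be sketched with a common storage interface. The interface and in-memory stores below are illustrative assumptions standing in for providers like OneDrive or Oracle Documents Cloud Service, not the hackathon team’s actual code:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the idea behind Cloud 2 Cloud: each storage provider is wrapped
// in a common interface so content can be copied or moved between any two
// systems. In-memory stores stand in for real cloud providers here.
public class Cloud2CloudSketch {
    interface CloudStore {
        byte[] download(String path);
        void upload(String path, byte[] content);
        void delete(String path);
    }

    static class InMemoryStore implements CloudStore {
        private final Map<String, byte[]> files = new HashMap<>();
        public byte[] download(String path) { return files.get(path); }
        public void upload(String path, byte[] content) { files.put(path, content); }
        public void delete(String path) { files.remove(path); }
    }

    // Copy leaves the source in place; move is a copy followed by a delete.
    static void copy(CloudStore from, CloudStore to, String path) {
        to.upload(path, from.download(path));
    }

    static void move(CloudStore from, CloudStore to, String path) {
        copy(from, to, path);
        from.delete(path);
    }

    public static void main(String[] args) {
        CloudStore source = new InMemoryStore();
        CloudStore target = new InMemoryStore();
        source.upload("/specs/widget.docx", "hello".getBytes());
        move(source, target, "/specs/widget.docx");
        System.out.println(source.download("/specs/widget.docx") == null); // source removed
    }
}
```

Wrapping each provider’s REST API behind one interface like this is what makes single-item and batch operations between arbitrary systems possible from a single UI.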

For more information on this solution including some business use cases and screenshots, read this blog post — Fishbowl Innovation: Cloud to Cloud Content Migrations for Oracle Content and Experience Cloud (Oracle Documents Cloud Service) and Other Cloud Storage Providers

Team Cloud to Cloud Content Migrations

 

This was Fishbowl’s sixth annual hackathon event, and the bar rises every year with the innovative solutions that get created. Here are some more pictures from this year’s event.

 

The post Hackathon Weekend Recap: Oracle Chatbots, Cloud Content Migrations, and Controlled Document Management in the Cloud appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Binding a Spring Cloud Task to a Pivotal Cloud Foundry Database Service

Pas Apicella - Mon, 2017-05-01 22:54
I previously blogged about how to create and deploy a Spring Cloud Task to Pivotal Cloud Foundry (PCF) as shown below.

http://theblasfrompas.blogspot.com.au/2017/03/run-spring-cloud-task-from-pivotal.html

Taking that same example, I have used Spring Cloud Connectors to persist the log output to a database table, to avoid looking through log files to view the output. A few things have to change to make this happen, as detailed below.

1. We need to change the manifest.yml to include a MySQL service instance as shown below

applications:
- name: springcloudtask-date
  memory: 750M
  instances: 1
  no-route: true
  health-check-type: none
  path: ./target/springcloudtasktodaysdate-0.0.1-SNAPSHOT.jar
  services:
    - pmysql-test
  env:
    JAVA_OPTS: -Djava.security.egd=file:///dev/urandom

2. Alter the project dependencies to include the Spring Data JPA libraries needed to persist the log output to a table. Spring Cloud Connectors will automatically pick up the bound MySQL instance and connect to it for us when we push the application to PCF

https://github.com/papicella/SpringCloudTaskTodaysDate
  
<dependencies>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-task</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-entitymanager</artifactId>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>

3. An Entity class, a Spring Data JPA repository interface, and a JPA TaskConfigurer have been created for persisting the log output, as shown in the code below.

TaskRunOutput.java
  
package pas.au.pivotal.pa.sct.demo;

import javax.persistence.*;

@Entity
@Table(name = "TASKRUNOUTPUT")
public class TaskRunOutput {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;

    private String output;

    public TaskRunOutput() {
    }

    public TaskRunOutput(String output) {
        this.output = output;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getOutput() {
        return output;
    }

    public void setOutput(String output) {
        this.output = output;
    }

    @Override
    public String toString() {
        return "TaskRunOutput{" +
                "id=" + id +
                ", output='" + output + '\'' +
                '}';
    }
}

TaskRunRepository.java

package pas.au.pivotal.pa.sct.demo;

import org.springframework.data.jpa.repository.JpaRepository;

public interface TaskRunRepository extends JpaRepository<TaskRunOutput, Long> {
}

JpaTaskConfigurer.java
  
package pas.au.pivotal.pa.sct.demo.configuration;

import java.text.SimpleDateFormat;
import java.util.Date;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import pas.au.pivotal.pa.sct.demo.TaskRunOutput;
import pas.au.pivotal.pa.sct.demo.TaskRunRepository;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.listener.annotation.BeforeTask;
import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class JpaTaskConfigurer extends DefaultTaskConfigurer {

    private static final Log logger = LogFactory.getLog(JpaTaskConfigurer.class);

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    private TaskRunRepository taskRunRepository;

    @Override
    public PlatformTransactionManager getTransactionManager() {
        if (this.transactionManager == null) {
            this.transactionManager = new JpaTransactionManager();
        }

        return this.transactionManager;
    }

    @BeforeTask
    public void init(TaskExecution taskExecution) {
        String execDate = new SimpleDateFormat().format(new Date());
        taskRunRepository.save(new TaskRunOutput("Executed at " + execDate));
        logger.info("Executed at : " + execDate);
    }
}

4. Now, as per the previous blog, execute the task and verify it completes without error. The "Tasks" tab, shown in the screenshot below, confirms this

Note: You would need to push the application to Pivotal Cloud Foundry before you can execute it, which is shown in the original blog entry


5. Now, if you follow the blog entry below, you can deploy a web-based interface for the Pivotal MySQL instance to view the table and its output

http://theblasfrompas.blogspot.com.au/2017/04/accessing-pivotal-mysql-service.html

With Pivotal MySQL*Web installed the output can be viewed as shown below.



Categories: Fusion Middleware

Accessing a Pivotal MySQL service instance within Pivotal Cloud Foundry

Pas Apicella - Thu, 2017-04-27 00:18
Recently at a hackathon we used the Pivotal MySQL service rather than a ClearDB MySQL service. As a result, we could not connect to our instance from a third-party tool, as the service instance is locked down. There are various ways to access the MySQL service; to me, the best two options are as follows.

1. Cloud Foundry CLI MySQL Plugin

cf-mysql-plugin makes it easy to connect the mysql command line client to any MySQL-compatible database used by Cloud Foundry apps. Use it to

  • inspect databases for debugging purposes
  • manually adjust schema or contents in development environments
  • dump and restore databases

Install it as explained in the link below:

  https://github.com/andreasf/cf-mysql-plugin

Using it:

1. First ensure you are logged into a Pivotal Cloud Foundry instance; you can verify that as follows

pasapicella@pas-macbook:~$ cf target -o ben.farrelly-org -s hackathon
API endpoint:   https://api.run.pivotal.io
API version:    2.78.0
User:           papicella@pivotal.io
Org:            ben.farrelly-org
Space:          hackathon

2. Verify you have a MySQL instance provisioned

pasapicella@pas-macbook:~$ cf services
Getting services in org ben.farrelly-org / space hackathon as papicella@pivotal.io...
OK

name        service   plan    bound apps                                                     last operation
nab-mysql   p-mysql   100mb   nabhackathon-beacon, nabhackathon-merchant, pivotal-mysqlweb   create succeeded

3. Log in as shown below

pasapicella@pas-macbook:~$ cf mysql nab-mysql

...

Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+-----------------------------------------+
| Database                                |
+-----------------------------------------+
| cf_53318c9c_caec_49be_9e33_075fade26183 |
| information_schema                      |
+-----------------------------------------+
2 rows in set (0.30 sec)

mysql> use cf_53318c9c_caec_49be_9e33_075fade26183;
Database changed

mysql> show tables;
+---------------------------------------------------+
| Tables_in_cf_53318c9c_caec_49be_9e33_075fade26183 |
+---------------------------------------------------+
| beacon                                            |
| beacon_product                                    |
| customer                                          |
| customer_registration                             |
| merchant                                          |
| payment                                           |
| payment_product                                   |
| product                                           |
+---------------------------------------------------+
8 rows in set (0.29 sec)

2. Pivotal MySQL*Web

Pivotal MySQL*Web is a browser-based SQL tool, rendered using a Bootstrap UI, for MySQL PCF service instances. It allows you to run SQL commands and view schema objects from a browser-based interface. Its features include:

  • Multiple Command SQL worksheet for DDL and DML
  • Run Explain Plan across SQL Statements
  • View/Run DDL command against Tables/Views/Indexes/Constraints
  • Command History
  • Auto Bind to Pivotal MySQL Services bound to the Application within Pivotal Cloud Foundry 
  • Manage JDBC Connections
  • Load SQL File into SQL Worksheet from Local File System
  • SQL Worksheet with syntax highlighting support
  • HTTP GET request to auto login without a login form
  • Export SQL query results in JSON or CSV formats
  • Generate DDL for schema objects


It runs within Pivotal Cloud Foundry as an application instance and auto-binds to the MySQL service for you, if you choose to bind it as part of the "cf push", using a manifest.yml that looks as follows

---
applications:
- name: pivotal-mysqlweb
  memory: 512M
  instances: 1
  host: pivotal-mysqlweb-${random-word}
  path: ./target/PivotalMySQLWeb-0.0.1-SNAPSHOT.jar
  services:
    - pas-mysql

Install it as explained in the link below:

  https://github.com/pivotal-cf/PivotalMySQLWeb


Categories: Fusion Middleware
