Fusion Middleware

Replacing the “V” in Oracle ADF’s MVC design pattern with Oracle JET or other front end framework

This post was written by Fishbowl’s own John Sim – our resident Oracle User Experience expert. From front-end design to user journeys and persona mapping; John has helped numerous customers over 14 years enhance their desktop and mobile experiences with Oracle WebCenter. John is also an Oracle ACE, which recognizes leaders for their technical expertise and community evangelism.

One of our goals at Fishbowl is to continuously enhance and evolve the capabilities of WebCenter for both developers and clients with new tooling capabilities and essential pre-built custom components that are not available today as part of the OOTB Oracle solution.

We have taken all of our collective knowledge and IP accumulated since WebCenter PS3 and created the “Portal Solution Accelerator”, previously known as “Intranet In A Box”, which takes WebCenter Portal and its capabilities to the next level for creating Digital Workplace Portals.

Today I’m going to cover one of the benefits of using our Portal Solution Accelerator: replacing the “V” in ADF’s MVC design pattern. This enables third-party developers, web design agencies, and marketers (with basic web design skills) to use other libraries and front-end frameworks of their choosing, such as Oracle JET, Angular, React, Vue, and Bootstrap – to name a few. By using a different front-end library such as JET, you will be able to create more modern and dynamic responsive portals, widgets, and portlets with little to no experience developing with ADF. You will also be able to leverage the benefits of the ADF Model and Controller layers and WebCenter’s Personalisation, Security, Caching and Mashup integration capabilities with other solutions like Oracle E-Business Suite (EBS) and Business Intelligence (BI) on the back end.

So, let’s take a closer look at the Portal Solution Accelerator in the following diagram. You can see it is made up of two core components – our back-end PSA (Portal Solution Accelerator) component and our front-end SPA (Single Page Application) component architecture. One of the things we decided early on was to separate the back-end and front-end architecture so that SPA front-end components are platform agnostic, can work as Progressive Web Apps, and can run on platforms outside of Portal. This enables us to deploy SPA front-end components directly onto BI (to provide additional charting capabilities through its narrative components), EBS, SharePoint, and Liferay, as well as onto the cloud. This provides the potential for a hybrid on-premise Portal to Oracle Cloud (Site Cloud Service) Content Experience platform, enabling reuse of our portal components and security on the Cloud.

To find out more about our Portal Solution Accelerator head over to our website – https://www.fishbowlsolutions.com/services/oracle-webcenter-portal-consulting/portal-solution-accelerator/

Let’s take a quick dive into WebCenter Portal taskflows and our Single Page Application (SPA) architecture.

WebCenter Portal – allows you to create widgets (ADF taskflows) that can easily be dragged and dropped onto a page by a contributor and can work independently or alongside other taskflows. The interface (View) is currently generated on the back end by Java processes and can be optimised to support adaptive applications. However, you should be aware that this model is very server-process intensive.

  • Pros
    • If you know ADF development it makes it extremely fast to create connected web applications using the ADF UI.
    • The ADF generated HTML/JS/CSS UI supports Mobile and desktop browsers.
    • The UI is generated by the application allowing developers to create applications without the need for designers to be involved.
  • Cons
    • If you don’t know ADF, or have a UI designed by a third party that does not align with ADF’s UI capabilities, it can be very challenging to create complex UIs using ADF tags, ADF Skins, and ADF’s JavaScript framework.
    • It is bad practice to mix and match open source libraries such as jQuery or Bootstrap with ADF tags, since these are not supported by Oracle alongside ADF. This limits the reuse of the large body of available open source code for creating dynamic, interactive components and interfaces such as carousels.
    • It can also be very hard to brand, and it is very server-process intensive.

Single Page Applications – are essentially browser-generated JavaScript applications that use AJAX to quickly and easily update and populate the user interface, creating fluid and responsive web apps. Instead of the server processing and managing the DOM that is generated and sent to the client, the client’s browser processes, generates, and caches the UI on the fly.

  • Pros
    • All modern front end frameworks allow you to create Single Page Applications and tie into lots of open source front end solutions and interfaces.
  • Cons
    • Can be hard to create modular isomorphic (universal) JS applications.
    • You also need to test across the browsers and devices your application needs to support.
    • The front-end application can get very large if not managed correctly.
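
To make the client-side rendering flow described above concrete, here is a minimal vanilla JavaScript sketch. The “/api/news” endpoint and the “news” element ID are placeholders for illustration only – they are not part of WebCenter or our PSA API.

// Minimal SPA-style rendering: fetch JSON over AJAX and build the DOM in the browser.
// The endpoint and element ID below are placeholders, not a real PSA or WebCenter service.
function loadNews() {
  fetch('/api/news')                                   // AJAX call returns raw data only
    .then(function (response) { return response.json(); })
    .then(function (items) {
      var list = document.getElementById('news');
      list.innerHTML = '';                             // re-render in place, no page reload
      items.forEach(function (item) {
        var li = document.createElement('li');
        li.textContent = item.title;
        list.appendChild(li);
      });
    })
    .catch(function (err) { console.error('Failed to load news', err); });
}

document.addEventListener('DOMContentLoaded', loadNews);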

The Portal Solution Accelerator.

What we have done with PSA is create a framework that provides the best of both worlds, allowing you to create modular Single Page Application taskflows that can be dragged and dropped onto a WebCenter Portal page. This allows your web design teams and agencies to manage and develop the front end quickly and effectively with any framework and standard HTML5, CSS, and JavaScript. You can also use Groovy scripts or JavaScript (with Oracle Nashorn) on the server side to create isomorphic JavaScript taskflow applications, as sketched below.
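
The snippet below is only a rough sketch of what “isomorphic” means in this context, under the assumption that the shared code avoids browser- and server-specific APIs; the file and function names are illustrative and the PSA-specific wiring is omitted.

// newsFormatter.js - illustrative only; the PSA-specific wiring is not shown.
// Because it uses no browser or server APIs, the same function can be evaluated by
// Nashorn on the server (for example to pre-render markup) or loaded in the browser.
function formatNewsItem(item) {
  var published = new Date(item.publishedDate);
  return {
    title: item.title,
    summary: item.summary,
    published: published.toDateString()
  };
}

// Nashorn exposes loaded scripts to Java code, and the browser sees a global function,
// so both tiers can shape the same JSON payload with identical logic.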

Please note – you cannot create a taskflow that leverages both ADF’s View layer and our framework together. You can, however, create one taskflow that is pure ADF and drop it on the same page as a taskflow built with a custom front end such as Angular, using our Portal Solution Accelerator View in place of the ADF View. This enables you to use existing OOTB WebCenter Portal taskflows and have them work in conjunction with custom-built components.

How does it work?

Within WebCenter Portal, in the composer panel where you can drag and drop taskflows onto a page, there is a custom taskflow – Fishbowl Single Page Application.

Drop this onto the page and manage its parameters. Here is a quick screenshot of a sample taskflow component for loading in Recent News items.

The Template parameter points to the custom SPA front-end JavaScript component you would like to load and inject into the taskflow. You can define custom parameters to pass to this component, and these parameters can be dynamic ADF variables set via the template parameter panel. The SPA component then handles the magic of loading the template, events, JS libraries, CSS, and images generated from within the taskflow.

Within the SPA API there are custom methods we have created that allow you to pass AJAX JSON calls to the ADF back-end Groovy or JavaScript code, enabling the app to work with and communicate with other services or databases.
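
As a rough idea of what such a component can look like, here is a hypothetical sketch; the method names, endpoint, and element ID below are placeholders and not the actual Fishbowl SPA API.

// Hypothetical sketch only - the real Fishbowl SPA API methods and parameters may differ.
// The front-end template posts JSON to the taskflow's server-side Groovy/JavaScript
// handler and renders whatever JSON comes back.
var recentNews = {
  load: function (endpoint, params) {   // endpoint/params would come from template parameters
    var xhr = new XMLHttpRequest();
    xhr.open('POST', endpoint, true);
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.onload = function () {
      if (xhr.status === 200) {
        recentNews.render(JSON.parse(xhr.responseText));
      }
    };
    xhr.send(JSON.stringify(params));
  },
  render: function (items) {
    var container = document.getElementById('recent-news');
    container.innerHTML = items.map(function (item) {
      return '<li>' + item.title + '</li>';
    }).join('');
  }
};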

ADF Lifecycle… Timeouts.

One of the things that often comes up when we present our solution to others who have attempted to integrate JET applications with WebCenter Portal is how to manage the lifecycle and prevent ADF timeouts. For example, if you stay on the same WebCenter Portal page for some time working on a single page application, you will get a popup saying you will be automatically logged out. Remember, our Portal Solution Accelerator is a taskflow. We use a similar ADF message queue to pass JSON updates to the ADF lifecycle when a user is working on a complex modular single page application, so we don’t run into timeout issues.
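
PSA itself does this through an ADF-style message queue, so the snippet below is only a generic illustration of the keep-alive idea with a placeholder endpoint – it is not the actual mechanism we use.

// Generic keep-alive illustration only; '/portal/keepalive' is a placeholder URL.
var KEEP_ALIVE_INTERVAL_MS = 5 * 60 * 1000;      // ping well inside the session timeout

setInterval(function () {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/portal/keepalive', true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  // A small JSON payload is enough for the server-side lifecycle to register activity
  xhr.send(JSON.stringify({ lastActivity: Date.now() }));
}, KEEP_ALIVE_INTERVAL_MS);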

Getting out of deployment hell (as well)!!!

One of the downsides of ADF development is having to build your ADF application, deploy it, and stop and start the server to test it, only to find there is a bug that needs to be fixed – and then go through the entire process again. Trust me – it is not quick!

Once you have our framework deployed, you can easily upload standard JavaScript templates, CSS, and Groovy scripts to Apache or OHS, where they are automatically consumed by our ADF taskflow. There is no stop-start-test cycle. Just upload your updates and refresh the browser!

I hear Oracle is working to integrate JET with ADF.

Yes, but it’s not there today.
Plus, you’re not limited to just JET with our framework. You can use React or any other front-end framework or library, and you get the benefits of all the additional components, apps, and tooling that the Portal Solution Accelerator provides.

Futures

Our next key release, which we are working on now, will fully support Progressive Web Application taskflow development. To find out more about what a progressive web app is, head over to Google – https://developers.google.com/web/progressive-web-apps/checklist

 

The post Replacing the “V” in Oracle ADF’s MVC design pattern with Oracle JET or other front end framework appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

PTC Windchill Success Story: The Benefits of Moving from PDM to PLM

A Prominent Furniture Manufacturer deploys Fishbowl’s System Generated Drawing Automation to Increase Efficiencies with their Enterprise Part deployment within PTC Windchill

Our client has numerous global manufacturing facilities and is using PTC Windchill to streamline eBOM and mBOM processes. However, not all modifications to parts information propagate automatically or accurately at the drawing level. Updating plant-specific drawings with enterprise part information was a time-consuming, manual, error-prone process that was full of delays and diverted valuable engineering resources away from their value-added work.

The client desired a go-forward approach with their Windchill PLM implementation that would automatically update this critical enterprise part information. They became aware of our System Generated Drawing solution from a presentation at PTC LiveWorx. From the time of first contact the Fishbowl Solutions team worked to deliver a solution that helped them realize their vision.

BUSINESS PROBLEMS
  • Manufacturing waste due to ordering obsolete or incorrect parts
  • Manufacturing delays due to drawing updates needed for non-geometric changes – title block, lifecycle, BOM, as well as environmental/regulatory compliance markings, variant designs, etc.
  • Manually updating product drawings with plant-specific parts information took away valuable engineering time
SOLUTION HIGHLIGHTS
  • Fishbowl’s System Generated Drawing Automation systematically combines data from BOM, CAD, drawing/model, part attributes, and enterprise resource planning (ERP) systems
  • Creates complete, static views of drawings based on multiple event triggers
  • Creates a template-based PDF that is overlaid along with the CAD geometry to produce a final document that can be dynamically stamped along with applicable lifecycle and approval information
  • Real-time watermarking on published PDFs
RESULTS

  • Increased accuracy of enterprise parts information included on drawings reduced product manufacturing waste
  • Allowed design changes to move downstream quickly, enabling an increase in design-to-manufacturing operational efficiencies

 

“Fishbowl’s System Generated Drawing Automation solution is the linchpin to our enterprise processes. It provides us with an automated method to include, update and proliferate accurate parts information throughout the business. This automation has in turn led to better data integrity, less waste, and more process efficiencies.” -PTC Windchill Admin/Developer

 

For more information about Fishbowl’s solution for System Generated Drawing Automation, Click Here

The post PTC Windchill Success Story: The Benefits of Moving from PDM to PLM appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Webinar: Improve WebCenter Portal Performance by 30% and get out of Oracle ADF Development Hell

DATE: Thursday, March 30th
TIME: 12:00 PM CST, 1:00 PM EST

Join Fishbowl’s Enterprise Architect, Jerry Aber, as he shares recommendations on performance improvements for WebCenter-based portals. Jerry has been delivering portal projects for over 15 years, and has been instrumental in developing a technology framework and methodology that provides repeatable and reusable development patterns for portal deployments and their ongoing administration and management. In this webinar, Jerry will share how leveraging modern web development technologies like Oracle JET, instead of ADF taskflows, can dramatically improve the performance of a portal – including the overall time to load the home page, as well as making content or stylistic changes.

Jerry will also share how to architect a portal implementation to include a caching layer that further enhances performance. These topics will all be backed by real-world customer metrics that Jerry and the Fishbowl team have seen through numerous successful customer deployments.

If you are a WebCenter Portal administrator and are frustrated with the challenges of improving your ADF-centric portal, this webinar is for you. Come learn how to overhaul the ADF UI, which will lead to fewer development complexities and happier users.

Register today. 

New to Zoom? Go to zoom.us/test to ensure you can access the webinar.

The post Webinar: Improve WebCenter Portal Performance by 30% and get out of Oracle ADF Development Hell appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

Visual Studio Code editor support for Cloud Foundry Manifest files

Pas Apicella - Tue, 2017-03-21 19:14
An early BETA version of Cloud Foundry (CF) manifest file support is available in Visual Studio Code. To see a video of this support, which shows how to install the extension, use code completion, and a bit more, follow the link below.

  https://www.youtube.com/watch?v=Ao6Mx6Q0XKE

With this extension for manifest files, it becomes a pleasure to write and modify CF manifest files. You get content assist, validations, and hover help - even for dynamic content like buildpacks and services (it integrates with the CF CLI for that).

Some screenshots of this are shown below.

Categories: Fusion Middleware

dotnet publish - ASP.NET Core app deployed to Pivotal Cloud Foundry

Pas Apicella - Tue, 2017-03-21 06:16
I previously showed how to push an ASP.NET Core application to Pivotal Cloud Foundry using just the source code files themselves. It turns out this creates a rather large droplet and hence slows down the deployment. So here we are going to take the same demo and use "dotnet publish" to make it a lot faster. The previous post, which is the base for this blog entry, is linked below.

ASP.NET Core app deployed to Pivotal Cloud Foundry
http://theblasfrompas.blogspot.com.au/2017/03/aspnet-core-app-deployed-to-pivotal.html

First we need to make some changes to our project

1. Open "dotnet-core-mvc.csproj" and add "RuntimeIdentifiers" inside the "PropertyGroup" tag
  
<PropertyGroup>
  <TargetFramework>netcoreapp1.0</TargetFramework>
  <RuntimeIdentifiers>osx.10.10-x64;osx.10.11-x64;ubuntu.14.04-x64;ubuntu.15.04-x64;debian.8-x64</RuntimeIdentifiers>
</PropertyGroup>



2. Perform a "dotnet restore" as shown below, either from a terminal window/prompt or from Visual Studio Code itself. This step is vital and is required.

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet restore
....

3. Now let's publish this as Release and ensure we target the correct runtime. For Cloud Foundry (CF) that will be "ubuntu.14.04-x64", and the framework version is 1.0 as we created the application using 1.0; we could have used 1.1 here if we wanted to.

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet publish --output ./publish --configuration Release --runtime ubuntu.14.04-x64  --framework netcoreapp1.0
Microsoft (R) Build Engine version 15.1.548.43366
Copyright (C) Microsoft Corporation. All rights reserved.

  dotnet-core-mvc -> /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/bin/Release/netcoreapp1.0/ubuntu.14.04-x64/dotnet-core-mvc.dll

4. Finally, cd into the "publish" folder and verify that the required DLLs are there, as well as the project files and JSON files: everything needed to run your application.

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc/publish$ ls -lartF
total 116848
-rwxr--r--    1 pasapicella  staff    25992 Jun 11  2016 Microsoft.Win32.Primitives.dll*

..

-rwxr--r--    1 pasapicella  staff      168 Mar 16 22:33 appsettings.Development.json*
drwxr-xr-x    7 pasapicella  staff      238 Mar 21 08:01 wwwroot/
-rwxr--r--    1 pasapicella  staff     1332 Mar 21 08:01 dotnet-core-mvc.pdb*
-rwxr--r--    1 pasapicella  staff     8704 Mar 21 08:01 dotnet-core-mvc.dll*
drwxr-xr-x    6 pasapicella  staff      204 Mar 21 08:01 Views/
drwxr-xr-x   16 pasapicella  staff      544 Mar 21 08:01 ../
-rwxr--r--    1 pasapicella  staff      362 Mar 21 08:01 web.config*
drwxr-xr-x   79 pasapicella  staff     2686 Mar 21 08:01 refs/
-rwxr--r--    1 pasapicella  staff       92 Mar 21 08:01 dotnet-core-mvc.runtimeconfig.json*
-rwxr--r--    1 pasapicella  staff   297972 Mar 21 08:01 dotnet-core-mvc.deps.json*
drwxr-xr-x  212 pasapicella  staff     7208 Mar 21 08:01 ./

5. Now this time let's "cf push" using the files in the "publish" folder, as shown below.

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc/publish$ cf push pas-dotnetcore-mvc-demo -b https://github.com/cloudfoundry/dotnet-core-buildpack -m 512m
Creating app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Using route pas-dotnetcore-mvc-demo.cfapps.io
Binding pas-dotnetcore-mvc-demo.cfapps.io to pas-dotnetcore-mvc-demo...
OK

Uploading pas-dotnetcore-mvc-demo...
Uploading app files from: /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/publish
Uploading 14.8M, 280 files
Done uploading
OK

Starting app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
Creating container
Successfully created container
Downloading app package...
Downloaded app package (23.7M)
-----> Buildpack version 1.0.13
ASP.NET Core buildpack version: 1.0.13
ASP.NET Core buildpack starting compile
-----> Restoring files from buildpack cache
       OK
-----> Restoring NuGet packages cache
-----> Extracting libunwind
       libunwind version: 1.2
       OK
       https://buildpacks.cloudfoundry.org/dependencies/manual-binaries/dotnet/libunwind-1.2-linux-x64-f56347d4.tgz
       OK
-----> Saving to buildpack cache
       Copied 38 files from /tmp/app/libunwind to /tmp/cache
       OK
-----> Cleaning staging area
       OK
ASP.NET Core buildpack is done creating the droplet
Exit status 0
Uploading droplet, build artifacts cache...
Uploading build artifacts cache...
Uploading droplet...
Uploaded build artifacts cache (995K)
Uploaded droplet (23.8M)
Uploading complete
Destroying container
Successfully destroyed container

1 of 1 instances running

App started


OK

App pas-dotnetcore-mvc-demo was started using this command `cd . && ./dotnet-core-mvc --server.urls http://0.0.0.0:${PORT}`

Showing health and status for app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls: pas-dotnetcore-mvc-demo.cfapps.io
last uploaded: Mon Mar 20 21:05:08 UTC 2017
stack: cflinuxfs2
buildpack: https://github.com/cloudfoundry/dotnet-core-buildpack

     state     since                    cpu    memory          disk          details
#0   running   2017-03-21 08:06:05 AM   0.0%   39.2M of 512M   66.9M of 1G

Categories: Fusion Middleware

Welcome to the new Fishbowl Solutions Blog

Out with the old and in with the new. Welcome to the new home of the Fishbowl Solutions blog! Please enjoy upgraded functionality and integration with our website. Check back often for new and exciting posts from our talented staff. If you want automatic updates, click the subscribe link to the right and be notified whenever a new post appears.

The post Welcome to the new Fishbowl Solutions Blog appeared first on Fishbowl Solutions.

Categories: Fusion Middleware, Other

ASP.NET Core app deployed to Pivotal Cloud Foundry

Pas Apicella - Thu, 2017-03-16 22:37
This post will show you how to write your first ASP.NET Core application on macOS or Linux and push it to Pivotal Cloud Foundry without having to PUBLISH it for deployment.

Before getting started you will need the following

1. Download and install .NET Core
2. Visual Studio Code with the C# extension.
3. CF CLI installed https://github.com/cloudfoundry/cli

Steps

Note: This assumes you're already logged into Pivotal Cloud Foundry and connected to Pivotal Web Services (run.pivotal.io). The command below shows I am connected and targeted.

pasapicella@pas-macbook:~$ cf target
API endpoint:   https://api.run.pivotal.io
API version:    2.75.0
User:           papicella@pivotal.io
Org:            apples-pivotal-org
Space:          development

1. Create new project

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet new mvc --auth None --framework netcoreapp1.0
Content generation time: 278.4748 ms
The template "ASP.NET Core Web App" created successfully.

2. Restore as follows

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet restore
  Restoring packages for /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/dotnet-core-mvc.csproj...
  Generating MSBuild file /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvco/obj/dotnet-core-mvc.csproj.nuget.g.props.
  Generating MSBuild file /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/obj/dotnet-core-mvc.csproj.nuget.g.targets.
  Writing lock file to disk. Path: /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/obj/project.assets.json
  Restore completed in 1.09 sec for /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc/dotnet-core-mvc.csproj.

  NuGet Config files used:
      /Users/pasapicella/.nuget/NuGet/NuGet.Config

  Feeds used:
      https://api.nuget.org/v3/index.json

3. At this point we can run the application and see what it looks like in a browser

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet run
Hosting environment: Production
Content root path: /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc
Now listening on: http://localhost:5000
Application started. Press Ctrl+C to shut down.


Now, to prepare this demo for Pivotal Cloud Foundry we need to make some changes to the generated code, as shown in the next few steps.

4. In Visual Studio Code, under the menu item “File/Open” select the “dotnet-core-mvc” folder and open it. Confirm all messages from Visual Studio Code.



The .NET Core buildpack configures the app web server automatically so you don't have to handle this yourself, but you have to prepare your app in a way that allows the buildpack to deliver this information to your app via the command line.

5. Open "Program.cs" and modify the Main() method as follows adding "var config = ..." and ".UseConfiguration(config)" as shown below
  
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;

namespace dotnet_core_mvc
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var config = new ConfigurationBuilder()
                .AddCommandLine(args)
                .Build();

            var host = new WebHostBuilder()
                .UseKestrel()
                .UseConfiguration(config)
                .UseContentRoot(Directory.GetCurrentDirectory())
                .UseIISIntegration()
                .UseStartup<Startup>()
                .Build();

            host.Run();
        }
    }
}

6. Open "dotnet-core-mvc.csproj" and add the following dependency "Microsoft.Extensions.Configuration.CommandLine" as shown below
  
<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore" Version="1.0.4" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.0.3" />
    <PackageReference Include="Microsoft.AspNetCore.StaticFiles" Version="1.0.2" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="1.0.2" />
    <PackageReference Include="Microsoft.Extensions.Configuration.CommandLine" Version="1.0.0" />
    <PackageReference Include="Microsoft.VisualStudio.Web.BrowserLink" Version="1.0.1" />
  </ItemGroup>

</Project>


7. File -> Save All

8. Jump back out to a terminal window. You can actually restore from the Visual Studio Code IDE, but I still like to do it from the command line.

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ dotnet restore
...

9. Deploy to Pivotal Cloud Foundry as follows. You will need to use a unique application name, so replace "pas" with your own name and that should do it.

$ cf push pas-dotnetcore-mvc-demo -b https://github.com/cloudfoundry/dotnet-core-buildpack -m 512m

** Output **

pasapicella@pas-macbook:~/pivotal/software/dotnet/dotnet-core-mvc$ cf push pas-dotnetcore-mvc-demo -b https://github.com/cloudfoundry/dotnet-core-buildpack -m 512m
Creating app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Using route pas-dotnetcore-mvc-demo.cfapps.io
Binding pas-dotnetcore-mvc-demo.cfapps.io to pas-dotnetcore-mvc-demo...
OK

Uploading pas-dotnetcore-mvc-demo...
Uploading app files from: /Users/pasapicella/pivotal/software/dotnet/dotnet-core-mvc
Uploading 208.7K, 84 files
Done uploading
OK

Starting app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
Creating container
Successfully created container
Downloading app package...
Downloaded app package (675.5K)
ASP.NET Core buildpack version: 1.0.13
ASP.NET Core buildpack starting compile
-----> Restoring files from buildpack cache
       OK
-----> Restoring NuGet packages cache
       OK
-----> Extracting libunwind
       libunwind version: 1.2
       https://buildpacks.cloudfoundry.org/dependencies/manual-binaries/dotnet/libunwind-1.2-linux-x64-f56347d4.tgz
       OK
-----> Installing .NET SDK
       .NET SDK version: 1.0.1
       OK
-----> Restoring dependencies with Dotnet CLI

       Welcome to .NET Core!
       ---------------------
       Telemetry
       The .NET Core tools collect usage data in order to improve your experience. The data is anonymous and does not include command-line arguments. The data is collected by Microsoft and shared with the community.
       You can opt out of telemetry by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1 using your favorite shell.
       You can read more about .NET Core tools telemetry @ https://aka.ms/dotnet-cli-telemetry.
       Configuring...
       -------------------
       A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
       Decompressing 100% 16050 ms
-----> Buildpack version 1.0.13
       https://buildpacks.cloudfoundry.org/dependencies/dotnet/dotnet.1.0.1.linux-amd64-99324ccc.tar.gz
       Learn more about .NET Core @ https://aka.ms/dotnet-docs. Use dotnet --help to see available commands or go to https://aka.ms/dotnet-cli-docs.

       --------------

       Expanding 100% 13640 ms
         Restoring packages for /tmp/app/dotnet-core-mvc.csproj...
         Installing Microsoft.Extensions.Configuration 1.0.0.
         Installing Microsoft.Extensions.Configuration.CommandLine 1.0.0.
         Generating MSBuild file /tmp/app/obj/dotnet-core-mvc.csproj.nuget.g.props.
         Writing lock file to disk. Path: /tmp/app/obj/project.assets.json
         Restore completed in 2.7 sec for /tmp/app/dotnet-core-mvc.csproj.

         NuGet Config files used:
             /tmp/app/.nuget/NuGet/NuGet.Config

         Feeds used:
             https://api.nuget.org/v3/index.json

         Installed:
             2 package(s) to /tmp/app/dotnet-core-mvc.csproj
       OK
       Detected .NET Core runtime version(s) 1.0.4, 1.1.1 required according to 'dotnet restore'
-----> Installing required .NET Core runtime(s)
       .NET Core runtime 1.0.4 already installed
       .NET Core runtime 1.1.1 already installed
       OK
-----> Publishing application using Dotnet CLI
       Microsoft (R) Build Engine version 15.1.548.43366
       Copyright (C) Microsoft Corporation. All rights reserved.

         dotnet-core-mvc -> /tmp/app/bin/Debug/netcoreapp1.0/dotnet-core-mvc.dll
       Copied 38 files from /tmp/app/libunwind to /tmp/cache
-----> Saving to buildpack cache
       OK
       Copied 850 files from /tmp/app/.dotnet to /tmp/cache
       Copied 19152 files from /tmp/app/.nuget to /tmp/cache
       OK
-----> Cleaning staging area
       Removing /tmp/app/.nuget
       OK
ASP.NET Core buildpack is done creating the droplet
Exit status 0
Uploading droplet, build artifacts cache...
Uploading droplet...
Uploaded build artifacts cache (359.9M)
Uploaded droplet (131.7M)
Uploading complete
Successfully destroyed container

0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

App pas-dotnetcore-mvc-demo was started using this command `cd .cloudfoundry/dotnet_publish && dotnet dotnet-core-mvc.dll --server.urls http://0.0.0.0:${PORT}`

Showing health and status for app pas-dotnetcore-mvc-demo in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls: pas-dotnetcore-mvc-demo.cfapps.io
last uploaded: Fri Mar 17 03:19:51 UTC 2017
stack: cflinuxfs2
buildpack: https://github.com/cloudfoundry/dotnet-core-buildpack

     state     since                    cpu    memory          disk           details
#0   running   2017-03-17 02:26:03 PM   0.0%   39.1M of 512M   302.7M of 1G

10. Finally, invoke the application using the URL, which can be determined from the output at the end of the push above or by using "cf apps".



More Information

https://docs.microsoft.com/en-us/aspnet/core/tutorials/your-first-mac-aspnet
Categories: Fusion Middleware

Run a Spring Cloud Task from Pivotal Cloud Foundry using Cloud Foundry Tasks

Pas Apicella - Fri, 2017-03-10 02:26
Recently we announced Spring Cloud Task under the umbrella of Spring Cloud through the following blog entry.  In the post below I am going to show you how you would create a Cloud Foundry Task that can invoke this Spring Cloud Task itself.

Spring Cloud Task allows a user to develop and run short-lived microservices using Spring Cloud, and to run them locally, in the cloud, or even on Spring Cloud Data Flow. In this example we will run it in the cloud using Pivotal Cloud Foundry (the PWS instance at run.pivotal.io). For more information on this follow the link below.

https://cloud.spring.io/spring-cloud-task/

For more information on Cloud Foundry Tasks follow the link below

https://docs.cloudfoundry.org/devguide/using-tasks.html

Steps

Note: This demo assumes you are already logged into PCF. You can confirm that using a command as follows:

pasapicella@pas-macbook:~/temp$ cf target
API endpoint:   https://api.run.pivotal.io
API version:    2.75.0
User:           papicella@pivotal.io
Org:            apples-pivotal-org
Space:          development

Also ensure you're using the correct version of the CF CLI, which at the time of this blog was as follows; you will need at least that version.

pasapicella@pas-macbook:~/temp$ cf --version
cf version 6.25.0+787326d95.2017-02-28

You will also need an instance of Pivotal Cloud Foundry that supports Tasks within the Applications Manager UI, which Pivotal Web Services (PWS) does.

1. Clone the simple Spring Cloud Task as follows

$ git clone https://github.com/papicella/SpringCloudTaskTodaysDate.git

pasapicella@pas-macbook:~/temp$ git clone https://github.com/papicella/SpringCloudTaskTodaysDate.git
Cloning into 'SpringCloudTaskTodaysDate'...
remote: Counting objects: 19, done.
remote: Compressing objects: 100% (12/12), done.
remote: Total 19 (delta 0), reused 19 (delta 0), pack-reused 0
Unpacking objects: 100% (19/19), done.

2. Change into SpringCloudTaskTodaysDate directory

3. If you look at the class "pas.au.pivotal.pa.sct.demo.SpringCloudTaskTodaysDateApplication" you will see it's just a Spring Boot application with an "@EnableTask" annotation. As long as Spring Cloud Task is on the classpath, any Spring Boot application with @EnableTask will record the start and finish of the Boot application.

4. Package the application using "mvn package"

pasapicella@pas-macbook:~/temp/SpringCloudTaskTodaysDate$ mvn package
[INFO] Scanning for projects...
Downloading: https://repo.spring.io/snapshot/org/springframework/cloud/spring-cloud-task-dependencies/1.2.0.BUILD-SNAPSHOT/maven-metadata.xml
Downloaded: https://repo.spring.io/snapshot/org/springframework/cloud/spring-cloud-task-dependencies/1.2.0.BUILD-SNAPSHOT/maven-metadata.xml (809 B at 0.6 KB/sec)
[INFO]

..

[INFO] Building jar: /Users/pasapicella/temp/SpringCloudTaskTodaysDate/target/springcloudtasktodaysdate-0.0.1-SNAPSHOT.jar
[INFO]
[INFO] --- spring-boot-maven-plugin:1.5.2.RELEASE:repackage (default) @ springcloudtasktodaysdate ---
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.621 s
[INFO] Finished at: 2017-03-10T18:51:15+11:00
[INFO] Final Memory: 29M/199M
[INFO] ------------------------------------------------------------------------

5.  Push the application as shown below

$ cf push springcloudtask-date --no-route --health-check-type none -p ./target/springcloudtasktodaysdate-0.0.1-SNAPSHOT.jar -m 512m

** Output **

pasapicella@pas-macbook:~/temp/SpringCloudTaskTodaysDate$ cf push springcloudtask-date --no-route --health-check-type none -p ./target/springcloudtasktodaysdate-0.0.1-SNAPSHOT.jar -m 512m

Creating app springcloud-task-date in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

App springcloud-task-date is a worker, skipping route creation
Uploading springcloud-task-date...
Uploading app files from: /var/folders/c3/27vscm613fjb6g8f5jmc2x_w0000gp/T/unzipped-app069139431
Uploading 239.1K, 89 files

...

1 of 1 instances running

App started


OK

App springcloudtask-date was started using this command `CALCULATED_MEMORY=$($PWD/.java-buildpack/open_jdk_jre/bin/java-buildpack-memory-calculator-2.0.2_RELEASE -memorySizes=metaspace:64m..,stack:228k.. -memoryWeights=heap:65,metaspace:10,native:15,stack:10 -memoryInitials=heap:100%,metaspace:100% -stackThreads=300 -totMemory=$MEMORY_LIMIT) && JAVA_OPTS="-Djava.io.tmpdir=$TMPDIR -XX:OnOutOfMemoryError=$PWD/.java-buildpack/open_jdk_jre/bin/killjava.sh $CALCULATED_MEMORY -Djavax.net.ssl.trustStore=$PWD/.java-buildpack/container_certificate_trust_store/truststore.jks -Djavax.net.ssl.trustStorePassword=java-buildpack-trust-store-password" && SERVER_PORT=$PORT eval exec $PWD/.java-buildpack/open_jdk_jre/bin/java $JAVA_OPTS -cp $PWD/. org.springframework.boot.loader.JarLauncher`

Showing health and status for app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls:
last uploaded: Fri Mar 10 07:57:17 UTC 2017
stack: cflinuxfs2
buildpack: container-certificate-trust-store=2.0.0_RELEASE java-buildpack=v3.14-offline-https://github.com/cloudfoundry/java-buildpack.git#d5d58c6 java-main open-jdk-like-jre=1.8.0_121 open-jdk-like-memory-calculator=2.0.2_RELEASE spring-auto-reconfiguration=1.10...

     state      since                    cpu    memory         disk         details
#0   starting   2017-03-10 06:58:43 PM   0.0%   936K of 512M   1.3M of 1G


6. Stop the application as we only want to run it as a CF Task when we are ready to run it.

$ cf stop springcloudtask-date

** Output **

pasapicella@pas-macbook:~/temp/SpringCloudTaskTodaysDate$ cf stop springcloudtask-date
Stopping app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

7. In a separate terminal, let's tail the logs from the application as follows. Don't worry that there is no output yet; the application invocation through a task has not yet occurred.

$ cf logs springcloudtask-date

** Output **

pasapicella@pas-macbook:~$ cf logs springcloudtask-date
Connected, tailing logs for app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...


8. Now log into the PWS Apps Manager console and navigate to your application's settings page. On this page you will see the run command for the Spring Boot application, as shown below.


9. To invoke the task we run a command as follows using the "invocation command" we get from step #8 above.

Format: cf run-task {app-name} {invocation command}

$ cf run-task springcloudtask-date 'INVOCATION COMMAND from step #8 above'

** Output **

pasapicella@pas-macbook:~/temp/SpringCloudTaskTodaysDate$ cf run-task springcloudtask-date 'CALCULATED_MEMORY=$($PWD/.java-buildpack/open_jdk_jre/bin/java-buildpack-memory-calculator-2.0.2_RELEASE -memorySizes=metaspace:64m..,stack:228k.. -memoryWeights=heap:65,metaspace:10,native:15,stack:10 -memoryInitials=heap:100%,metaspace:100% -stackThreads=300 -totMemory=$MEMORY_LIMIT) && JAVA_OPTS="-Djava.io.tmpdir=$TMPDIR -XX:OnOutOfMemoryError=$PWD/.java-buildpack/open_jdk_jre/bin/killjava.sh $CALCULATED_MEMORY -Djavax.net.ssl.trustStore=$PWD/.java-buildpack/container_certificate_trust_store/truststore.jks -Djavax.net.ssl.trustStorePassword=java-buildpack-trust-store-password" && SERVER_PORT=$PORT eval exec $PWD/.java-buildpack/open_jdk_jre/bin/java $JAVA_OPTS -cp $PWD/. org.springframework.boot.loader.JarLauncher'
Creating task for app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Task has been submitted successfully for execution.
Task name:   371bb9b1
Task id:     1

10. Return to PWS Applications Manager and click on the "Tasks" tab to verify it was successful


11. Return to the terminal window where we were tailing the logs to verify the task was run

pasapicella@pas-macbook:~$ cf logs springcloudtask-date
Connected, tailing logs for app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...

2017-03-10T19:15:29.55+1100 [APP/TASK/371bb9b1/0]OUT Creating container
2017-03-10T19:15:29.89+1100 [APP/TASK/371bb9b1/0]OUT Successfully created container
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT   .   ____          _            __ _ _
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT  /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT  \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT   '  |____| .__|_| |_|_| |_\__, | / / / /
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT  :: Spring Boot ::        (v1.5.2.RELEASE)
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
2017-03-10T19:15:34.45+1100 [APP/TASK/371bb9b1/0]OUT  =========|_|==============|___/=/_/_/_/
2017-03-10T19:15:34.71+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:34.706  INFO 7 --- [           main] pertySourceApplicationContextInitializer : Adding 'cloud' PropertySource to ApplicationContext
2017-03-10T19:15:34.85+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:34.853  INFO 7 --- [           main] nfigurationApplicationContextInitializer : Adding cloud service auto-reconfiguration to ApplicationContext
2017-03-10T19:15:34.89+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:34.891  INFO 7 --- [           main] s.d.SpringCloudTaskTodaysDateApplication : The following profiles are active: cloud
2017-03-10T19:15:34.89+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:34.890  INFO 7 --- [           main] s.d.SpringCloudTaskTodaysDateApplication : Starting SpringCloudTaskTodaysDateApplication on b00b045e-dea4-4e66-8298-19dd71edb9c8 with PID 7 (/home/vcap/app/BOOT-INF/classes started by vcap in /home/vcap/app)
2017-03-10T19:15:35.00+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:35.009  INFO 7 --- [           main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@7a07c5b4: startup date [Fri Mar 10 08:15:35 UTC 2017]; root of context hierarchy
2017-03-10T19:15:35.91+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:35.912  INFO 7 --- [           main] urceCloudServiceBeanFactoryPostProcessor : Auto-reconfiguring beans of type javax.sql.DataSource
2017-03-10T19:15:35.91+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:35.916  INFO 7 --- [           main] urceCloudServiceBeanFactoryPostProcessor : No beans of type javax.sql.DataSource found. Skipping auto-reconfiguration.
2017-03-10T19:15:36.26+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.259 DEBUG 7 --- [           main] o.s.c.t.c.SimpleTaskConfiguration        : Using org.springframework.cloud.task.configuration.DefaultTaskConfigurer TaskConfigurer
2017-03-10T19:15:36.74+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.748  INFO 7 --- [           main] o.s.j.e.a.AnnotationMBeanExporter        : Registering beans for JMX exposure on startup
2017-03-10T19:15:36.75+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.758 DEBUG 7 --- [           main] o.s.c.t.r.support.SimpleTaskRepository   : Creating: TaskExecution{executionId=0, parentExecutionId=null, exitCode=null, taskName='DateSpringCloudTask:cloud:', startTime=Fri Mar 10 08:15:36 UTC 2017, endTime=null, exitMessage='null', externalExecutionId='null', errorMessage='null', arguments=[]}
2017-03-10T19:15:36.77+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.776 DEBUG 7 --- [           main] o.s.c.t.r.support.SimpleTaskRepository   : Updating: TaskExecution with executionId=0 with the following {exitCode=0, endTime=Fri Mar 10 08:15:36 UTC 2017, exitMessage='null', errorMessage='null'}
2017-03-10T19:15:36.75+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.757  INFO 7 --- [           main] o.s.c.support.DefaultLifecycleProcessor  : Starting beans in phase 0
2017-03-10T19:15:36.77+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.775  INFO 7 --- [           main] s.d.SpringCloudTaskTodaysDateApplication : Executed at : 3/10/17 8:15 AM
2017-03-10T19:15:36.77+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.777  INFO 7 --- [           main] s.c.a.AnnotationConfigApplicationContext : Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@7a07c5b4: startup date [Fri Mar 10 08:15:35 UTC 2017]; root of context hierarchy
2017-03-10T19:15:36.77+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.779  INFO 7 --- [           main] o.s.c.support.DefaultLifecycleProcessor  : Stopping beans in phase 0
2017-03-10T19:15:36.78+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.782  INFO 7 --- [           main] o.s.j.e.a.AnnotationMBeanExporter        : Unregistering JMX-exposed beans on shutdown
2017-03-10T19:15:36.78+1100 [APP/TASK/371bb9b1/0]OUT 2017-03-10 08:15:36.788  INFO 7 --- [           main] s.d.SpringCloudTaskTodaysDateApplication : Started SpringCloudTaskTodaysDateApplication in 3.205 seconds (JVM running for 3.985)
2017-03-10T19:15:36.83+1100 [APP/TASK/371bb9b1/0]OUT Exit status 0
2017-03-10T19:15:36.86+1100 [APP/TASK/371bb9b1/0]OUT Destroying container
2017-03-10T19:15:37.79+1100 [APP/TASK/371bb9b1/0]OUT Successfully destroyed container

12. Finally you can verify tasks using a command as follows

$ cf tasks springcloudtask-date

** Output **

pasapicella@pas-macbook:~/temp/SpringCloudTaskTodaysDate$ cf tasks springcloudtask-date
Getting tasks for app springcloudtask-date in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

id   name       state       start time                      command
1    371bb9b1   SUCCEEDED   Fri, 10 Mar 2017 08:15:28 UTC   CALCULATED_MEMORY=$($PWD/.java-buildpack/open_jdk_jre/bin/java-buildpack-memory-calculator-2.0.2_RELEASE -memorySizes=metaspace:64m..,stack:228k.. -memoryWeights=heap:65,metaspace:10,native:15,stack:10 -memoryInitials=heap:100%,metaspace:100% -stackThreads=300 -totMemory=$MEMORY_LIMIT) && JAVA_OPTS="-Djava.io.tmpdir=$TMPDIR -XX:OnOutOfMemoryError=$PWD/.java-buildpack/open_jdk_jre/bin/killjava.sh $CALCULATED_MEMORY -Djavax.net.ssl.trustStore=$PWD/.java-buildpack/container_certificate_trust_store/truststore.jks -Djavax.net.ssl.trustStorePassword=java-buildpack-trust-store-password" && SERVER_PORT=$PORT eval exec $PWD/.java-buildpack/open_jdk_jre/bin/java $JAVA_OPTS -cp $PWD/. org.springframework.boot.loader.JarLauncher


Categories: Fusion Middleware

ASP .NET Core (CLR) on Pivotal Cloud Foundry

Pas Apicella - Wed, 2017-03-08 04:57
There are two ways to run .NET applications on Pivotal Cloud Foundry. In short, they are as follows:

  1. Windows 2012 R2 Stack (Windows 2016 coming soon)
  2. Linux Stack - ASP.NET Core CLR only

In the example below I am going to show how you would push a sample ASP.NET Core application using the default Linux stack. I am using a run.pivotal.io instance, better known as PWS (Pivotal Web Services), which only supports the Linux stack. In your own PCF installation an operator may have provided Windows support, in which case "cf stacks" is one way to find out, as shown below.

$ cf stacks
Getting stacks in org pivot-papicella / space development as papicella@pivotal.io...
OK

name            description
cflinuxfs2      Cloud Foundry Linux-based filesystem

windows2012R2   Microsoft Windows / .Net 64 bit

Steps

1. Clone a demo as shown below

$ git clone https://github.com/bingosummer/aspnet-core-helloworld.git
Cloning into 'aspnet-core-helloworld'...
remote: Counting objects: 206, done.
remote: Total 206 (delta 0), reused 0 (delta 0), pack-reused 206
Receiving objects: 100% (206/206), 43.40 KiB | 0 bytes/s, done.
Resolving deltas: 100% (78/78), done.

2. Change to the right directory as shown below

$ cd aspnet-core-helloworld

3. Edit manifest.yml to use the BETA buildpack as follows. You can list the available buildpacks using "cf buildpacks".

---
applications:
- name: sample-aspnetcore-helloworld
  random-route: true
  memory: 512M
  buildpack: dotnet_core_buildpack_beta

4. Push as shown below

pasapicella@pas-macbook:~/apps/dotnet/aspnet-core-helloworld$ cf push
Using manifest file /Users/pasapicella/apps/dotnet/aspnet-core-helloworld/manifest.yml

Updating app sample-aspnetcore-helloworld in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Uploading sample-aspnetcore-helloworld...
Uploading app files from: /Users/pasapicella/pivotal/apps/dotnet/aspnet-core-helloworld
Uploading 21.9K, 15 files
Done uploading
OK

Stopping app sample-aspnetcore-helloworld in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

Starting app sample-aspnetcore-helloworld in org apples-pivotal-org / space development as papicella@pivotal.io...
Downloading dotnet_core_buildpack_beta...
Downloaded dotnet_core_buildpack_beta
Creating container
Successfully created container
Downloading build artifacts cache...
Downloading app package...
Downloaded app package (21.5K)
Downloaded build artifacts cache (157.7M)

...

-----> Saving to buildpack cache
    Copied 0 files from /tmp/app/libunwind to /tmp/cache
    Copied 0 files from /tmp/app/.dotnet to /tmp/cache
    Copied 0 files from /tmp/app/.nuget to /tmp/cache
    OK
ASP.NET Core buildpack is done creating the droplet
Uploading droplet, build artifacts cache...
Uploading build artifacts cache...
Uploading droplet...
Uploaded build artifacts cache (157.7M)
Uploaded droplet (157.7M)
Uploading complete
Destroying container
Successfully destroyed container

0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

App sample-aspnetcore-helloworld was started using this command `dotnet run --project src/dotnetstarter --server.urls http://0.0.0.0:${PORT}`

Showing health and status for app sample-aspnetcore-helloworld in org apples-pivotal-org / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls: sample-aspnetcore-helloworld-gruffier-jackpot.cfapps.io
last uploaded: Wed Mar 8 10:46:44 UTC 2017
stack: cflinuxfs2
buildpack: dotnet_core_buildpack_beta

  state     since                    cpu     memory          disk           details
#0   running   2017-03-08 09:48:29 PM   22.4%   36.7M of 512M   556.8M of 1G


Verify the application using the URL given at the end of the push







Categories: Fusion Middleware

Spring Boot Actuator support added to Pivotal Web Services (PWS) Application Manager

Pas Apicella - Mon, 2017-03-06 17:07
Recently we added "Spring Boot Actuator support" to Pivotal Web Services (PWS), http://run.pivotal.io. If you want to try this out, simply use the demo below, which is all set up to show how this works.

https://github.com/papicella/SpringBootPCFPas

Once pushed you will see a Spring Boot icon in the Application Manager UI showing the Actuator support as per below.








Categories: Fusion Middleware

Retrieving ATM locations using the NAB API

Pas Apicella - Mon, 2017-01-30 21:21
NAB have released an API to determine ATM locations, amongst other things, based on a geo location. It's documented here:

https://developer.nab.com.au/docs#locations-api-get-locations-by-geo-coordinates

Here we use this API, but I wanted to highlight a few things you would need to know to consume it.

1. You will need to provide the following, and this has to be calculated based on a lat/long. The screenshot below shows what a geo location with a radius of 0.5 km would look like. You will see its starting point is your current location, in this case in the Melbourne CBD.



2. The NAB API requires the following to be set, which can be obtained using a calculation as per the screenshot above:

  • swLat – South-West latitude coordinate expressed as a decimal
  • neLat – North-East latitude coordinate expressed as a decimal
  • neLng – North-East longitude coordinate expressed as a decimal
  • swLng – South-West longitude coordinate expressed as a decimal

3. The attribute locationType allows you to filter what type of location you're after. I set this to "atm" to only find ATM locations.

4. I also set the attribute fields to extended, as this gives me detailed information.
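
Before making the call in step 5, the four bounding-box values from step 2 have to be derived from a centre coordinate and a radius. The helper below is a rough flat-earth approximation I am using for illustration only; it is not part of the NAB API.

// Rough bounding-box helper (not part of the NAB API): derives swLat/swLng/neLat/neLng
// from a centre point and a radius in kilometres using a flat-earth approximation.
function boundingBox(lat, lng, radiusKm) {
  var latDelta = radiusKm / 111.32;                                   // ~111.32 km per degree of latitude
  var lngDelta = radiusKm / (111.32 * Math.cos(lat * Math.PI / 180)); // degrees of longitude shrink with latitude
  return {
    swLat: lat - latDelta,
    swLng: lng - lngDelta,
    neLat: lat + latDelta,
    neLng: lng + lngDelta
  };
}

// Example: a 0.5 km box around a point in the Melbourne CBD
console.log(boundingBox(-37.8153, 144.9564, 0.5));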

5. Once you have the coordinates, here is an example of getting detailed information about ATM locations. In this example curl is good enough to illustrate that:

pasapicella@pas-macbook:~/pivotal$ curl -H "x-nab-key: NABAPI-KEY" "https://api.developer.nab.com.au/v2/locations?locationType=atm&searchCriteria=geo&swLat=-37.81851471355399&swLng=144.95235719310358&neLat=-37.812155549503025&neLng=144.96040686020137&fields=extended&v=1" | jq -r
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 10371  100 10371    0     0   2016      0  0:00:05  0:00:05 --:--:--  2871
{
  "locationSearchResponse": {
    "totalRecords": 16,
    "viewport": {
      "swLat": -37.81586424048582,
      "swLng": 144.9589117502319,
      "neLat": -37.81109231077813,
      "neLng": 144.96758064976802
    },
    "locations": [
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Melbourne Central",
          "address4": "Lower Ground floor",
          "id": 5058548,
          "key": "atm_3B46",
          "description": "Melbourne Central",
          "address1": "300 Elizabeth Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.812031,
          "longitude": 144.9621768,
          "hours": "Mon-Thu 10.00am-06.00pm, Fri 10.00am-09.00pm, Sat 10.00am-06.00pm, Sun 10.00am-05.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Melbourne Central",
          "address4": "Ground Floor",
          "address5": "Near La Trobe St Entrance under escalator",
          "id": 5058552,
          "key": "atm_3B56",
          "description": "Melbourne Central",
          "address1": "300 Elizabeth Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.812031,
          "longitude": 144.9621768,
          "hours": "Mon-Thu 10.00am-06.00pm, Fri 10.00am-09.00pm, Sat 10.00am-06.00pm, Sun 10.00am-05.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Queen Victoria Market",
          "address5": "Outside the market facing the street",
          "id": 5058555,
          "key": "atm_3B61",
          "description": "Queen Victoria Market",
          "address1": "Queen Street",
          "address2": "Corner Therry Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8130009,
          "longitude": 144.9597905,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Target Centre",
          "id": 5058577,
          "key": "atm_3CC7",
          "description": "Target Centre",
          "address1": "236 Bourke Street Mall",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8132227,
          "longitude": 144.9665518,
          "hours": "Mon-Fri 09.00am-05.00pm, Sat-Sun 10.00am-05.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Queen Victoria Centre",
          "id": 5058614,
          "key": "atm_3F07",
          "description": "Queen Victoria",
          "address1": "228-234 Lonsdale Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8122729,
          "longitude": 144.9622383,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "University of Melbourne",
          "address5": "Kenneth Myer Building",
          "id": 5058653,
          "key": "atm_3G28",
          "description": "KMB Foyer",
          "address1": "30 Royal Parade",
          "suburb": "Parkville",
          "state": "VIC",
          "postcode": "3052",
          "latitude": -37.8149256,
          "longitude": 144.9643156,
          "hours": "Mon-Fri 07.00am-07.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Midtown Plaza",
          "address4": "Shop 8",
          "id": 5058783,
          "key": "atm_3S02",
          "description": "Midtown Plaza",
          "address1": "186 Swanston Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8131315,
          "longitude": 144.9654723,
          "hours": "Mon-Fri 09.30am-05.00pm, Sat 10.00am-02.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Midtown Plaza",
          "address4": "Shop 8",
          "id": 5058784,
          "key": "atm_3S03",
          "description": "Midtown Plaza",
          "address1": "186 Swanston Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8131315,
          "longitude": 144.9654723,
          "hours": "Mon-Fri 09.30am-05.00pm, Sat 10.00am-02.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Melbourne NAB House",
          "id": 5058814,
          "key": "atm_3S38",
          "description": "National Bank House",
          "address1": "500 Bourke Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8154128,
          "longitude": 144.9590017
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Melbourne NAB House",
          "id": 5058815,
          "key": "atm_3S39",
          "description": "National Bank House",
          "address1": "500 Bourke Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8154128,
          "longitude": 144.9590017,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Melbourne NAB House",
          "id": 5058837,
          "key": "atm_3S67",
          "description": "National Bank House",
          "address1": "500 Bourke Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8154128,
          "longitude": 144.9590017,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "SmartATM",
          "address3": "Midtown Plaza",
          "address4": "Shop 8",
          "id": 5058842,
          "key": "atm_3S72",
          "description": "Midtown Plaza",
          "address1": "186 Swanston Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8131315,
          "longitude": 144.9654723,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "NAB ATM",
          "address3": "Midtown Plaza",
          "id": 5059024,
          "key": "atm_4G04",
          "description": "Midtown Plaza",
          "address1": "194 Swanston Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.8130332,
          "longitude": 144.9654279,
          "hours": "Mon-Tue 09.30am-05.30pm, Wed-Fri 09.30am-09.00pm, Sat 09.30am-05.30pm, Sun 11.00am-05.00pm"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": true,
          "isDisabilityApproved": true,
          "isAudio": false,
          "source": "BOQ ATM",
          "id": 5059452,
          "key": "atm_9036021I",
          "description": "455 Bourke Street",
          "address1": "455 Bourke Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.81518,
          "longitude": 144.960583
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Branch",
          "isDeposit": false,
          "isDisabilityApproved": false,
          "isAudio": false,
          "source": "rediATM",
          "id": 5060659,
          "key": "atm_C11243",
          "description": "Bourke Street",
          "address1": "460 Bourke Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.81494,
          "longitude": 144.96002,
          "hours": "24/7"
        }
      },
      {
        "apiStructType": "atm",
        "atm": {
          "location": "Offsite",
          "isDeposit": false,
          "isDisabilityApproved": false,
          "isAudio": false,
          "source": "rediATM",
          "address3": "Emporium Melbourne",
          "address5": "ATM 02",
          "id": 5060908,
          "key": "atm_C11662",
          "description": "Emporium Melbourne",
          "address1": "269-321 Lonsdale Street",
          "suburb": "Melbourne",
          "state": "VIC",
          "postcode": "3000",
          "latitude": -37.811932,
          "longitude": 144.963648,
          "hours": "Mon-Wed 10.00am-07.00pm, Thu-Fri 10.00am-09.00pm, Sat-Sun 10.00am-07.00pm"
        }
      }
    ]
  },
  "status": {
    "code": "API-200",
    "message": "Success"
  }
}
Categories: Fusion Middleware

Configuring Spring Boot Actuator information in Pivotal Cloud Foundry 1.9 Applications Manager

Pas Apicella - Tue, 2017-01-10 22:12
With the release of Pivotal Cloud Foundry 1.9 we added "Spring Boot Actuator In PCF" support through the Apps Manager Web UI. In the example below I will show what configuration you need on the application side to get the Apps Manager Web UI to show you this information.

Spring Boot Actuator exposes information about a running Spring Boot application via an http endpoint. Get useful diagnostics about an app programmatically using these RESTful APIs:  health, git commit, build information, and so on. Now, developers can view some of this data in PCF Apps Manager. It’s easier to debug and monitor your apps in production, since Apps Manager shows this diagnostics info in context.
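For example, you can hit these endpoints directly with curl once the application is running. A minimal sketch, assuming the app is running locally on port 8080 (Spring Boot 1.x exposes the actuator endpoints at the root context by default):

# Health endpoint - shows full detail when the endpoint is non-sensitive or you are authenticated
$ curl http://localhost:8080/health

# Info endpoint - git and build details appear here once step 3 below is in place
$ curl http://localhost:8080/info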

The steps are described in the documentation below, but the steps in this post are more detailed and include a sample application to verify the setup with.

http://docs.pivotal.io/pivotalcf/1-9/console/spring-boot-actuators.html

1. Add the maven dependency "spring-boot-starter-actuator" to your pom.xml
  
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
</dependencies>

2. Ensure you're using Spring Boot 1.5 (snapshots, release candidates, or the final release), as earlier versions won't work.
  
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.0.BUILD-SNAPSHOT</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>

https://spring.io/blog/2017/01/06/spring-boot-1-5-0-rc1-available-now

3. Configure the Info Actuator as per link below

http://docs.pivotal.io/pivotalcf/1-9/console/spring-boot-actuators.html#info-endpoint

Note: the configuration differs depending on whether you're using Maven or Gradle

Ensure you add this property for Git integration in your YML or properties file. YML example:

management:
  info:
    git:
      mode: full
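Once the application is rebuilt with a git.properties file on the classpath (the documentation linked above covers generating it with the Maven or Gradle Git plugins), the /info endpoint should include the full commit details. A quick way to check, assuming jq is installed:

$ curl -s http://localhost:8080/info | jq .git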

4. The following demo has the required setup for a verification application. Clone it and push to CF as shown below.

Note: review pom.xml for everything that is needed for this to work

https://github.com/papicella/SpringBootPCFPas

$ git clone https://github.com/papicella/SpringBootPCFPas.git
$ mvn package
$ cf push -f manifest-inmemory-db.yml

....

0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

App pas-springboot-pcf was started using this command `CALCULATED_MEMORY=$($PWD/.java-buildpack/open_jdk_jre/bin/java-buildpack-memory-calculator-2.0.2_RELEASE -memorySizes=metaspace:64m..,stack:228k.. -memoryWeights=heap:65,metaspace:10,native:15,stack:10 -memoryInitials=heap:100%,metaspace:100% -stackThreads=300 -totMemory=$MEMORY_LIMIT) && JAVA_OPTS="-Djava.io.tmpdir=$TMPDIR -XX:OnOutOfMemoryError=$PWD/.java-buildpack/open_jdk_jre/bin/killjava.sh $CALCULATED_MEMORY -Djava.security.egd=file:///dev/urando" && SERVER_PORT=$PORT eval exec $PWD/.java-buildpack/open_jdk_jre/bin/java $JAVA_OPTS -cp $PWD/. org.springframework.boot.loader.JarLauncher`

Showing health and status for app pas-springboot-pcf in org pivot-papicella / space development as papicella@pivotal.io...
OK

requested state: started
instances: 1/1
usage: 1G x 1 instances
urls: pas-springboot-pcf-squireless-clicha.cfapps.pez.pivotal.io
last uploaded: Wed Jan 11 03:59:24 UTC 2017
stack: cflinuxfs2
buildpack: java-buildpack=v3.10-offline-https://github.com/cloudfoundry/java-buildpack.git#193d6b7 java-main java-opts open-jdk-like-jre=1.8.0_111 open-jdk-like-memory-calculator=2.0.2_RELEASE spring-auto-reconfiguration=1.10.0_RELEASE

     state     since                    cpu      memory         disk           details
#0   running   2017-01-11 03:00:18 PM   207.1%   489.9M of 1G   161.7M of 1G

5. View the application in the Apps Manager UI as shown below and you will see that a Spring icon appears next to the application name



More Screen Shots





More Information

http://docs.pivotal.io/pivotalcf/1-9/console/spring-boot-actuators.html
Categories: Fusion Middleware

Spring Boot Application Consuming NAB FxRates API

Pas Apicella - Sun, 2017-01-08 15:40
National Australia Bank (NAB) recently released a set of BETA APIs as per the link below.

https://developer.nab.com.au/

The following example is a Spring Boot application that consumes the FxRates API. It's all documented on GitHub and you will need a NAB API key to run this demo. It can run standalone as a fat JAR through Spring Boot or be deployed to Cloud Foundry; instructions are provided for both.

https://github.com/papicella/NABApi-fx-demo

 
Categories: Fusion Middleware

Script to tally Application Instance Counts for Pivotal Cloud Foundry

Pas Apicella - Mon, 2016-12-19 20:32
I was recently asked how to determine how many application instances exist in a given ORG at a point in time. The script below does this using the CF CURL command, which means you must be logged into your Pivotal Cloud Foundry instance for it to work. You could use the CF REST API directly, but I find CF CURL much easier.

CF REST API https://apidocs.cloudfoundry.org/249/

The script below assumes an ORG name of "apples-pivotal-org", so it would make sense to pass this in as a script argument, which is easy enough to do (a parameterized sketch appears after the sample output).

Prior to running this script it's worth checking your current TARGET endpoints as shown below.

pasapicella@pas-macbook:~/pivotal/PCF/scripts$ cf target

API endpoint:   https://api.run.pivotal.io (API version: 2.65.0)
User:           papicella@pivotal.io
Org:            apples-pivotal-org
Space:          development

Script:

echo "AI Count for applications in a organization.."
echo ""

guids=$(cf curl /v2/apps?q=organization_guid:`cf org apples-pivotal-org --guid` | jq -r ".resources[].metadata.guid")
total=0
for guid in $guids; do
  name=$(cf curl /v2/apps/$guid | jq -r ".entity.name")
  count=$(cf curl /v2/apps/$guid | jq -r ".entity.instances")
  echo -e "App Name: $name , Instances: $count"
  total=$(( $total + $count ))
done

echo "-----"
echo "Total AI's = $total"
echo ""

Output:

pasapicella@pas-macbook:~/pivotal/PCF/scripts$ ./ai-count-org-details.sh
AI Count for applications in a organization..

App Name: pas-telstrawifiapi-client , Instances: 1
App Name: springboot-bootstrap-employee , Instances: 2
App Name: springboot-employee-feign-client , Instances: 1
App Name: greeting-config , Instances: 1
App Name: employee-feign-client-hystrix , Instances: 1
App Name: pas-albums , Instances: 2
App Name: pas-springboot-pcf , Instances: 2
App Name: springboot-typeahead , Instances: 1
-----
Total AI's = 11
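For reference, here is a minimal sketch of the same script with the ORG name passed in as an argument (the file name ai-count-org.sh is just illustrative; it assumes cf and jq are on the PATH and you are already logged in):

#!/bin/bash
# Usage: ./ai-count-org.sh <org-name>
ORG=${1:?"Usage: $0 <org-name>"}

echo "AI Count for applications in org $ORG.."
echo ""

org_guid=$(cf org "$ORG" --guid)
guids=$(cf curl "/v2/apps?q=organization_guid:${org_guid}" | jq -r ".resources[].metadata.guid")

total=0
for guid in $guids; do
  app=$(cf curl /v2/apps/$guid)                  # one cf curl per app instead of two
  name=$(echo "$app" | jq -r ".entity.name")
  count=$(echo "$app" | jq -r ".entity.instances")
  echo -e "App Name: $name , Instances: $count"
  total=$(( total + count ))
done

echo "-----"
echo "Total AI's = $total"

Running ./ai-count-org.sh apples-pivotal-org should produce the same output as shown above.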
Categories: Fusion Middleware

Spring Boot with Feign and Twitter Typeahead JS library

Pas Apicella - Tue, 2016-12-13 18:35
I previously blogged about a demo using Spring Boot and Feign to make a REST call to an external service: http://theblasfrompas.blogspot.com.au/2016/12/spring-boot-feign-client-accessing.html. The real purpose of that demo was to use the Twitter Typeahead JS library for auto-completion, which is what the GitHub demo below shows. The returned data is now used in an INPUT text field for auto-completion as the user types in a country name.

https://github.com/papicella/FeignClientExternalSpringBoot


Categories: Fusion Middleware

Command Line and Vim Tips from a Java Programmer

I’m always interested in learning more about useful development tools. In college, most programmers get an intro to the Linux command line environment, but I wanted to share some commands I use daily that I’ve learned since graduation.

Being comfortable on the command line is a great skill to have when a customer is looking over your shoulder on a Webex. They could be watching a software demo or deployment to their environment. It can also be useful when learning a new code base or working with a product with a large, unfamiliar directory structure with lots of logs.

If you’re on Windows, you can use Cygwin to get a Unix-like CLI to make these commands available.

Useful Linux Commands

Find

The command find helps you find files by recursively searching subdirectories. Here are some examples:

find .
    Prints all files and directories under the current directory.

find . -name '*.log'
  Prints all files and directories that end in “.log”.

find /tmp -type f -name '*.log'
   Prints only files in the directory “/tmp” that end in “.log”.

find . -type d
   Prints only directories.

find . -maxdepth 2
     Prints all files and directories under the current directory, and subdirectories (but not sub-subdirectories).

find . -type f -exec ls -la {} \;
  The -exec flag runs a command against each file instead of printing the name. In this example, it will run ls -la filename on each file under the current directory. The curly braces take the place of the filename.

Grep

The command grep lets you search text for lines that match a specific string. It can be helpful to add your initials to debug statements in your code and then grep for them to find them in the logs.

grep foo filename
  Prints each line in the file “filename” that matches the string “foo”.

grep 'foo\|bar' filename
  Grep supports regular expressions, so this prints each line in the file that matches “foo” or “bar”.

grep -i foo filename
  Add -i for case insensitive matching.

grep foo *
  Use the shell wildcard, an asterisk, to search all files in the current directory for the string “foo”.

grep -r foo *
  Recursively search all files and directories in the current directory for a string.

grep -rnH foo filename
  Add -n to print line numbers and -H to print the filename on each line.

find . -type f -name '*.log' -exec grep -nH foo {} \;
  Combining find and grep can let you easily search each file that matches a certain name for a string. This will print each line that matches “foo” along with the file name and line number in each file that ends in “.log” under the current directory.

ps -ef | grep processName
  The output of any command can be piped to grep, and the lines of STDOUT that match the expression will be printed. For example, you could use this to find the pid of a process with a known name.
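One small refinement, a standard shell trick rather than part of the original list: wrap the first letter of the process name in a character class so the grep process itself is excluded from the results.

ps -ef | grep [p]rocessName
  The pattern still matches "processName", but the grep command line now contains "[p]rocessName" rather than the literal string, so grep no longer matches its own entry.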

cat file.txt | grep -v foo
  You can also use -v to print all lines that don’t match an expression.

Ln

The command ln lets you create links. I generally use this to create links in my home directory to quickly cd into long directory paths.

ln -s /some/really/long/path foo
  The -s is for symbolic, and the long path is the target. The output of ls -la in this case would be foo -> /some/really/long/path.

Bashrc

The Bashrc is a shell script that gets executed whenever Bash is started in an interactive terminal. It is located in your home directory, ~/.bashrc. It provides a place to edit your $PATH, $PS1, or add aliases and functions to simplify commonly used tasks.

Aliases are a way you can define your own command line commands. Here are a couple useful aliases I’ve added to my .bashrc that have saved a lot of keystrokes on a server where I’ve installed Oracle WebCenter:

WC_DOMAIN=/u01/oracle/fmw/user_projects/domains/wc_domain
alias assets="cd /var/www/html"
alias portalLogs="cd $WC_DOMAIN/servers/WC_Spaces/logs"
alias domain="cd $WC_DOMAIN"
alias components="cd $WC_DOMAIN/ucm/cs/custom"
alias rpl="portalLogs; vim -R WC_Spaces.out"
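Bash functions work alongside aliases and are handy when you need to pass an argument. Here's a small illustrative sketch; the function name is hypothetical and the log path simply reuses the WC_DOMAIN variable from the aliases above:

searchPortalLogs() {
  # grep the portal log for a term, case-insensitive, with line numbers and file name
  grep -inH "$1" "$WC_DOMAIN/servers/WC_Spaces/logs/WC_Spaces.out"
}

After sourcing your .bashrc, running searchPortalLogs "ORA-" would print every matching line from the portal log.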

After making changes to your .bashrc, you can load them with source ~/.bashrc. Now I can type rpl, short for Read Portal Logs, from anywhere to quickly jump into the WebCenter portal log file.

alias grep="grep --color"

This grep alias adds the --color option to all of my grep commands. All of the above grep commands still work, but now all of the matches will be highlighted.

Vim

Knowing Vim key bindings can be convenient and efficient if you’re already working on the command line. Vim has many built-in shortcuts to make editing files quick and easy.

Run vim filename.txt to open a file in Vim. Vim starts in Normal Mode, where most characters have a special meaning, and typing a colon, :, lets you run Vim commands. For example, typing Shift-G will jump to the end of the file, and typing :q while in Normal Mode will quit Vim. Here is a list of useful commands:

:q
  Quits Vim

:w
  Write the file (save)

:wq
  Write and quit

:q!
  Quit and ignore warnings that you didn’t write the file

:wq!
  Write and quit, ignoring permission warnings

i
  Enter Insert Mode where you can edit the file like a normal text editor

a
  Enter Insert Mode and place the cursor after the current character

o
  Insert a blank line after the current line and enter Insert Mode

[escape]
  The escape button exits insert mode

:150
  Jump to line 150

shift-G
  Jump to the last line

gg
  Jump to the first line

/foo
  Search for the next occurrence of “foo”. Regex patterns work in the search.

?foo
  Search for the previous occurrence of “foo”

n
  Go to the next match

N
  Go to the previous match

*
  Search for the next occurrence of the searched word under the cursor

#
  Search for the previous occurrence of the searched word under the cursor

w
  Jump to the next word

b
  Jump to the previous word

``
  Jump to the last action

dw
  Delete the word starting at the cursor

cw
  Delete the word starting at the cursor and enter insert mode

c$
  Delete everything from the cursor to the end of the line and enter insert mode

dd
  Delete the current line

D
  Delete everything from the cursor to the end of the line

u
  Undo the last action

ctrl-r
  Redo the last action

d[up]
  Delete the current line and the line above it. “[up]” is for the up arrow.

d[down]
  Delete the current line and the line below it

d3[down]
  Delete the current line and the three lines below it

r[any character]
  Replace the character under the cursor with another character

~
  Toggle the case (upper or lower) of the character under the cursor

v
  Enter Visual Mode. Use the arrow keys to highlight text.

shift-V
  Enter Visual Mode and highlight whole lines at a time.

ctrl-v
  Enter Visual Mode but highlight blocks of characters.

=
  While in Visual Mode, = will auto format highlighted text.

c
  While in Visual Mode, c will cut the highlighted text.

y
  While in Visual Mode, y will yank (copy) the highlighted text.

p
  In Normal Mode, p will paste the text in the buffer (that’s been yanked or cut).

yw
  Yank the text from the cursor to the end of the current word.

:sort
  Highlight lines in Visual Mode, then use this command to sort them alphabetically.

:s/foo/bar/g
  Highlight lines in Visual Mode, then use search and replace to replace all instances of “foo” with “bar”.

:s/^/#/
  Highlight lines in Visual Mode, then add # at the start of each line. This is useful to comment out blocks of code.

:s/$/;/
Highlight lines in Visual Mode, then add a semicolon at the end of each line.

:set paste
  This will turn off auto indenting. Use it before pasting into Vim from outside the terminal (you’ll want to be in insert mode before you paste).

:set nopaste
  Make auto indenting return to normal.

:set nu
  Turn on line numbers.

:set nonu
  Turn off line numbers.

:r!pwd
  Read the output of a command into Vim. In this example, we’ll read in the current directory.

:r!sed -n 5,10p /path/to/file
  Read lines 5 through 10 from another file in Vim. This can be a good way to copy and paste between files in the terminal.

:[up|down]
  Type a colon and then use the arrow keys to browse through your command history. If you type letters after the colon, it will only cycle through commands that start with those letters (e.g., typing :se and then the up arrow would quickly find ":set paste").

Vimrc

The Vimrc is a configuration file that Vim loads whenever it starts up, similar to the Bashrc. It is in your home directory.

Here is a basic Vimrc I’d recommend for getting started if you don’t have one already. Run vim ~/.vimrc and paste in the following:

set backspace=2         " backspace in insert mode works like normal editor
syntax on               " syntax highlighting
filetype indent on      " activates indenting for files
set autoindent          " auto indenting
set number              " line numbers
colorscheme desert      " colorscheme desert
set listchars=tab:>-,trail:.,extends:>,precedes:<
set list                " Set up whitespace characters
set ic                  " Ignore case by default in searches
set statusline+=%F      " Show the full path to the file
set laststatus=2        " Make the status line always visible

 

Perl

Perl comes installed by default on Linux, so it is worth mentioning that it has some extensive command line capabilities. If you have ever tried to grep for a string that matches a line in a minified Javascript file, you can probably see the benefit of being able to filter out lines longer than 500 characters.

grep -r foo * | perl -nle'print if 500 > length'
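If you'd rather avoid Perl, awk can apply the same length filter; a rough equivalent is:

grep -r foo * | awk 'length($0) < 500'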

Conclusion

I love learning the tools that are available in my development environment, and it is exciting to see how they can help customers as well.

Recently, I was working with a customer and we were running into SSL issues. Java processes can be run with the option -Djavax.net.ssl.trustStore=/path/to/trustStore.jks to specify which keystore to use for SSL certificates. It was really easy to run ps -ef | grep trustStore to quickly identify which keystore we needed to import certificates into.

I’ve also been able to use various find and grep commands to search through unfamiliar directories after exporting metadata from Oracle’s MDS Repository.

Even if you aren’t on the command line, I’d encourage everyone to learn something new about their development environment. Feel free to share your favorite Vim and command line tips in the comments!

Further reading

http://www.vim.org/docs.php

https://www.gnu.org/software/bash/manual/bash.html

http://perldoc.perl.org/perlrun.html

The post Command Line and Vim Tips from a Java Programmer appeared first on Fishbowl Solutions' C4 Blog.

Categories: Fusion Middleware, Other

Webinar Recording: Ryan Companies Leverages Fishbowl’s ControlCenter for Oracle WebCenter to Enhance Document Control Leading to Improved Knowledge Management

On Thursday, December 8th, Fishbowl had the privilege of presenting a webinar with Mike Ernst – VP of Construction Operations at Ryan Companies – regarding their use case for Fishbowl’s ControlCenter product for controlled document management. Mike was joined by Fishbowl’s ControlCenter product manager, Kim Negaard, who provided an overview of how the solution was implemented and how it is being used at Ryan.

Ryan Companies had been using Oracle WebCenter for many years, but they were looking for some additional document management functionality and a more intuitive interface to help improve knowledge management at the company. Their main initiative was to make it easier for users to access and manage their corporate knowledge documents (policies and procedures), manuals (safety), and real estate documents (leases) throughout each document’s life cycle.

Mike provided some interesting stats that factored into their decision to implement ControlCenter for WebCenter:

  • $16k – the average cost of “reinventing” procedures per project (ex. checklists and templates)
  • $25k – the average cost of estimating incorrect labor rates
  • 3x – salary to onboard someone new when an employee leaves the company

To hear more about how Ryan found knowledge management success with ControlCenter for WebCenter, watch the webinar recording: https://youtu.be/_NNFRV1LPaY

The post Webinar Recording: Ryan Companies Leverages Fishbowl’s ControlCenter for Oracle WebCenter to Enhance Document Control Leading to Improved Knowledge Management appeared first on Fishbowl Solutions' C4 Blog.

Categories: Fusion Middleware, Other

Spring Boot / Feign Client accessing external service

Pas Apicella - Thu, 2016-12-08 17:49
Previously we used Feign to create clients for our own services, which are registered on our Eureka server using a service name, as shown in the previous blog post http://theblasfrompas.blogspot.com.au/2016/11/declarative-rest-client-feign-with_8.html. It's not unusual that you'd want to call an external REST endpoint, basically an endpoint that's not discoverable by Eureka. In that case, you can use the url property on the @FeignClient annotation, which gracefully supports property injection. Here is an example of this.

Full example on GitHub as follows

https://github.com/papicella/FeignClientExternalSpringBoot

1. Start by adding the correct Maven dependencies. The one you need is shown below; there would be others if you want a web-based Spring Boot project, etc.
  
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-feign</artifactId>
</dependency>

2. We are going to consume this external service as follows

http://country.io/names.json

To do that we create a simple interface as follows
  
package pas.au.pivotal.feign.external;

import org.springframework.cloud.netflix.feign.FeignClient;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@FeignClient(name = "country-service-client", url = "http://country.io")
public interface CountryServiceClient {

    @RequestMapping(method = RequestMethod.GET, value = "/names.json")
    String getCountries();
}

3. In this example I have created a RestController to consume this REST service and test it, because that's the easiest way to do so. We simply autowire the CountryServiceClient interface into the RestController to make those external calls through Feign.
  
package pas.au.pivotal.feign.external.controller;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.json.JsonParser;
import org.springframework.boot.json.JsonParserFactory;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
import pas.au.pivotal.feign.external.CountryServiceClient;

import java.util.Map;

@RestController
public class CountryRest
{
    Logger logger = LoggerFactory.getLogger(CountryRest.class);
    private static final JsonParser parser = JsonParserFactory.getJsonParser();

    @Autowired
    private CountryServiceClient countryServiceClient;

    @RequestMapping(value = "/countries", method = RequestMethod.GET,
                    produces = "application/json")
    public String allCountries()
    {
        String countries = countryServiceClient.getCountries();

        return countries;
    }

    @RequestMapping(value = "/country_names", method = RequestMethod.GET)
    public String[] countryNames()
    {
        String countries = countryServiceClient.getCountries();

        Map<String, Object> countryMap = parser.parseMap(countries);

        String countryArray[] = new String[countryMap.size()];
        logger.info("Size of countries " + countryArray.length);

        int i = 0;
        for (Map.Entry<String, Object> entry : countryMap.entrySet()) {
            countryArray[i] = (String) entry.getValue();
            i++;
        }

        return countryArray;
    }
}

4. Of course we will have our main class to bootstrap the application, and the project includes the "spring-boot-starter-web" Maven dependency to start an embedded Tomcat server for us.
  
package pas.au.pivotal.feign.external;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.feign.EnableFeignClients;

@SpringBootApplication
@EnableFeignClients
public class FeignClientExternalSpringBootApplication {

    public static void main(String[] args) {
        SpringApplication.run(FeignClientExternalSpringBootApplication.class, args);
    }
}

5. Ensure your application.properties or application.yml has the following properties to disable timeouts.

feign:
  hystrix:
    enabled: false

hystrix:
  command:
    default:
      execution:
        timeout:
          enabled: false

6. Run the main class "FeignClientExternalSpringBootApplication"

Access as follows

http://localhost:8080/countries
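You can also exercise both endpoints from the command line; a quick sketch, assuming the application is running on Spring Boot's default port 8080:

# Raw JSON proxied from country.io
$ curl http://localhost:8080/countries

# Country names extracted into a JSON array by the controller
$ curl http://localhost:8080/country_names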





Categories: Fusion Middleware

Webinar: Quality, Safety, Knowledge Management with Oracle WebCenter Content and ControlCenter

DATE: THURSDAY, DECEMBER 8, 2016
TIME: 10:00 A.M. PST / 1:00 P.M. EST

Join Ryan Companies Vice President of Construction Operations, Mike Ernst, and Fishbowl Solutions Product Manager, Kim Negaard, to learn how Ryan Companies, a leading national construction firm, found knowledge management success with ControlCenter for Oracle WebCenter Content.

In this webinar, you’ll hear first-hand how ControlCenter has been implemented as part of Ryan’s Integrated Project Delivery process, helping them create a robust knowledge management system to promote consistent and effective operations across multiple regional offices. You’ll also learn how ControlCenter’s intuitive, modern user experience enabled Ryan to easily find documents across devices, implement recurring review cycles, and control both company-wide and project-specific documents throughout their lifecycle.

Register today.


The post Webinar: Quality, Safety, Knowledge Management with Oracle WebCenter Content and ControlCenter appeared first on Fishbowl Solutions' C4 Blog.

Categories: Fusion Middleware, Other

Deploying Spring Boot Applications on Google Application Engine (GAE)

Pas Apicella - Tue, 2016-11-22 02:07
I previously blogged about how to deploy a Spring Boot application to flexible VMs on Google Cloud Platform, as shown below.

http://theblasfrompas.blogspot.com.au/2016/09/spring-boot-on-google-cloud-platform-gcp.html

In the example below I use Google Application Engine (GAE) to deploy a Spring Boot application without using a flexible VM, which is a lot faster and is what I originally wanted to do when I did this previously. In short, this uses the [Standard environment] option for GAE.

Spring Boot uses Servlet 3.0 APIs to initialize the ServletContext (register Servlets etc.) so you can’t use the same application out of the box in a Servlet 2.5 container. It is however possible to run a Spring Boot application on an older container with some special tools. If you include org.springframework.boot:spring-boot-legacy as a dependency (maintained separately to the core of Spring Boot and currently available at 1.0.2.RELEASE), all you should need to do is create a web.xml and declare a context listener to create the application context and your filters and servlets. The context listener is a special purpose one for Spring Boot, but the rest of it is normal for a Spring application in Servlet 2.5

Visit for more Information:

   http://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto-servlet-2-5 

Steps

1. In order to use Servlet 2.5 and a web.xml, we will need to install the spring-boot-legacy dependency into a local Maven repository as shown below.

$ git clone https://github.com/scratches/spring-boot-legacy
$ cd spring-boot-legacy
$ mvn install

2. Clone and package the GIT REPO as shown below

$ git clone https://github.com/papicella/GoogleAppEngineSpringBoot.git

3. Edit the file ./src/main/webapp/WEB-INF/appengine-web.xml to specify the correct APPLICATION ID, which we will also target in step 5.
  
<?xml version="1.0" encoding="utf-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
<application>fe-papicella</application>
<version>5</version>
<threadsafe>true</threadsafe>
<manual-scaling>
<instances>1</instances>
</manual-scaling>
</appengine-web-app>

4. Package as shown below

$ mvn package

5. Target your project for deployment as follows

pasapicella@pas-macbook:~/piv-projects/GoogleAppEngineSpringBoot$ gcloud projects list
PROJECT_ID              NAME                    PROJECT_NUMBER
bionic-vertex-150302    AppEngineSpringBoot     97889500330
fe-papicella            FE-papicella            1049163203721
pas-spring-boot-on-gcp  Pas Spring Boot on GCP  1043917887789

pasapicella@pas-macbook:~/piv-projects/GoogleAppEngineSpringBoot$ gcloud config set project fe-papicella
Updated property [core/project].

6. Deploy as follows

mvn appengine:deploy

Finally, once deployed, you can access your application using its endpoint, which is displayed in the GCP console dashboard.
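If you prefer the command line, a couple of gcloud commands will also get you there (a small sketch, assuming your project is still set from step 5):

# Open the deployed application in your default browser
$ gcloud app browse

# List the deployed versions for the project
$ gcloud app versions list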





Project in IntelliJ IDEA




NOTE: Google AppEngine does not allow JMX, so you have to switch it off in a Spring Boot app (set spring.jmx.enabled=false in application.properties).

application.properties

spring.jmx.enabled=false

More Information

Full working example with code as follows on GitHub

https://github.com/papicella/GoogleAppEngineSpringBoot
Categories: Fusion Middleware
