
Feed aggregator

List of acquisitions by Microsoft: a data journey

Nilesh Jethwa - Thu, 2015-08-27 21:21

If we look into the SEC data for Microsoft and other tech companies, Microsoft spends the most on Research and Development (by dollar amount).


Read more at: http://www.infocaptor.com/dashboard/list-of-acquisitions-by-microsoft-a-data-journey

Integrating Telstra Public SMS API into Bluemix

Pas Apicella - Thu, 2015-08-27 20:16
Pas Apicella - Thu, 2015-08-27 20:16
In the post below I will show how I integrated the Telstra public SMS API into my Bluemix catalog to be consumed as a service. This was all done from public Bluemix using the Cloud Integration Service.

Step 1 - Create a T.DEV account

To get started you need to create an account on http://dev.telstra.com to be granted access to the SMS API. Once access is granted you need to create an application, which enables you to add/manage Telstra API keys as shown below.

1.1. Create an account at http://dev.telstra.com

1.2. Once done you should have something as follows. Approval can take up to 24 hours, as shown by the "approved" icon.


Step 2 - Test the Telstra SMS API

At this point we want to test the Telstra SMS API using a script; this ensures it's working before we proceed to integrating it into Bluemix.

2.1. Create a script called setup.sh as follows

#!/bin/bash
# Obtain these keys from the Telstra Developer Portal
APP_KEY="yyyy-key"
APP_SECRET="yyyy-secret"

curl "https://api.telstra.com/v1/oauth/token?client_id=$APP_KEY&client_secret=$APP_SECRET&grant_type=client_credentials&scope=SMS"

2.2. Edit the script above to use your APP_KEY and APP_SECRET values from the Telstra Developer Portal

2.3. Run as shown below


pas@Pass-MBP:~/ibm/customers/telstra/telstra-apis/test$ ./setup.sh
{ "access_token": "xadMkPqSAE0VG6pSGEi6rHA5vqYi", "expires_in": "3599" }

2.4. Make a note of the access token returned; you will need it to send an SMS message.
2.5. Create a script called "sendsms.sh" as shown below.


#!/bin/bash
# * Recipient number should be in the format of "04xxxxxxxx" where x is a digit
# * Authorization header value should be in the format of "Bearer xxx" where xxx is access token returned
# from a previous GET https://api.telstra.com/v1/oauth/token request.
RECIPIENT_NUMBER=0411151350
TOKEN=token-key

curl -H "Content-Type: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d "{\"to\":\"$RECIPIENT_NUMBER\", \"body\":\"Hello, pas sent this message from telstra SMS api!\"}" \
"https://api.telstra.com/v1/sms/messages"

2.6. Replace the TOKEN value with the access token returned at step 2.4 above.
2.7. Replace the RECIPIENT_NUMBER with your own mobile number to test the SMS API.
2.8. Run as shown below.


pas@Pass-MBP:~/ibm/customers/telstra/telstra-apis/test$ ./sendsms.sh
{"messageId":"1370CAB677B59C226705337B95945CD6"}

Step 3 - Creating a REST-based service to call the Telstra SMS API

At this point we can now integrate the Telstra SMS API into Bluemix. To do that, I created a simple Spring Boot application which exposes a RESTful method to call the Telstra SMS API using Spring's RestTemplate class. I did this because two calls are needed to use the Telstra SMS API: a REST call to get an ACCESS_TOKEN, followed by a call to actually send an SMS message. Creating a Spring Boot application to achieve this allows me to wrap that into one single call, making it easy to consume and add to the Bluemix catalog as a service.
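The post doesn't include the Java source, but a minimal sketch of such a wrapper might look like the following. The controller class, endpoint path and request parameter names here are my own invention; the Telstra URLs and JSON fields come from the scripts in step 2.

import java.util.Map;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
public class TelstraSmsController {

    private final RestTemplate rest = new RestTemplate();

    @RequestMapping("/sms/send")
    public String send(@RequestParam String to, @RequestParam String body) {
        // Call 1: obtain an OAuth access token (same request as setup.sh above).
        Map token = rest.getForObject(
                "https://api.telstra.com/v1/oauth/token?client_id={key}&client_secret={secret}"
                        + "&grant_type=client_credentials&scope=SMS",
                Map.class, System.getenv("APP_KEY"), System.getenv("APP_SECRET"));

        // Call 2: send the SMS with the bearer token (same request as sendsms.sh above).
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);
        headers.set("Authorization", "Bearer " + token.get("access_token"));
        String payload = String.format("{\"to\":\"%s\", \"body\":\"%s\"}", to, body);
        return rest.postForObject("https://api.telstra.com/v1/sms/messages",
                new HttpEntity<>(payload, headers), String.class);
    }
}

A single request to /sms/send then performs both Telstra calls on the caller's behalf.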

More information on the Cloud Integration service can be found here. Cloud Integration allows us to expose RESTful methods from Bluemix applications onto the catalog via one simple screen. We could also use the Bluemix API Management service.

https://www.ng.bluemix.net/docs/services/CloudIntegration/index.html

Below shows the application being pushed into Bluemix which will then be used to add Telstra SMS API service into the Bluemix catalog.



pas@192-168-1-4:~/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSAPIDemo$ cf push
Using manifest file /Users/pas/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSAPIDemo/manifest.yml

Creating app pas-telstrasmsapi in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

Using route pas-telstrasmsapi.mybluemix.net
Binding pas-telstrasmsapi.mybluemix.net to pas-telstrasmsapi...
OK

Uploading pas-telstrasmsapi...
Uploading app files from: /Users/pas/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSAPIDemo/target/TelstraSMSAPI-1.0-SNAPSHOT.jar
Uploading 752.3K, 98 files
Done uploading
OK

Starting app pas-telstrasmsapi in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
-----> Downloaded app package (15M)
-----> Liberty Buildpack Version: v1.19.1-20150622-1509
-----> Retrieving IBM 1.8.0_20150617 JRE (ibm-java-jre-8.0-1.0-pxa6480sr1ifx-20150617_03-cloud.tgz) ... (0.0s)
         Expanding JRE to .java ... (1.4s)
-----> Retrieving App Management 1.5.0_20150608-1243 (app-mgmt_v1.5-20150608-1243.zip) ... (0.0s)
         Expanding App Management to .app-management (0.9s)
-----> Downloading Auto Reconfiguration 1.7.0_RELEASE from https://download.run.pivotal.io/auto-reconfiguration/auto-reconfiguration-1.7.0_RELEASE.jar (0.1s)
-----> Liberty buildpack is done creating the droplet

-----> Uploading droplet (90M)

0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

App pas-telstrasmsapi was started using this command `$PWD/.java/jre/bin/java -Xtune:virtualized -Xmx384M -Xdump:none -Xdump:heap:defaults:file=./../dumps/heapdump.%Y%m%d.%H%M%S.%pid.%seq.phd -Xdump:java:defaults:file=./../dumps/javacore.%Y%m%d.%H%M%S.%pid.%seq.txt -Xdump:snap:defaults:file=./../dumps/Snap.%Y%m%d.%H%M%S.%pid.%seq.trc -Xdump:heap+java+snap:events=user -Xdump:tool:events=systhrow,filter=java/lang/OutOfMemoryError,request=serial+exclusive,exec=./.buildpack-diagnostics/killjava.sh $JVM_ARGS org.springframework.boot.loader.JarLauncher --server.port=$PORT`

Showing health and status for app pas-telstrasmsapi in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls: pas-telstrasmsapi.mybluemix.net
last uploaded: Fri Jul 17 11:26:58 UTC 2015

     state     since                    cpu    memory           disk           details
#0   running   2015-07-17 09:28:28 PM   1.0%   150.6M of 512M   148.9M of 1G

Step 4 - Add the RESTful method to the IBM Bluemix catalog to invoke the Telstra SMS API

4.1. To expose our RESTful method, we simply define the endpoint using the Cloud Integration service as shown below.


The image shows the Cloud Integration service with the Telstra API exposed and available to be consumed as a service on Bluemix.





This was created using the Bluemix Dashboard but can also be done using the Cloud Foundry command line "cf create-service ..."
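
For illustration, the generic syntax is sketched below; the service offering and plan names are placeholders (they depend on how the Cloud Integration endpoint was published), while the instance name matches the manifest used in step 5:

cf create-service <service-offering> <plan> TelstraSMS-service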

Step 5 - Create an application client which will invoke the Telstra SMS service

At this point we are going to push a client application onto Bluemix which consumes the Telstra SMS API service and then uses it within the application. We do this to verify the service works, by creating a simple HTML-based application which invokes the service, with a manifest.yml file indicating it wants to consume the service now exposed on the catalog within Bluemix as per above.

5.1. The manifest.yml consumes the service created from the API in the catalog


applications:
- name: pas-telstrasmsapi-client
  memory: 512M
  instances: 1
  host: pas-telstrasmsapi-client
  domain: mybluemix.net
  path: ./target/TelstraSMSApiClient-0.0.1-SNAPSHOT.jar
  env:
   JBP_CONFIG_IBMJDK: "version: 1.8.+"
  services:
    - TelstraSMS-service

5.2. Push the application as shown below.


pas@Pass-MacBook-Pro:~/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSApiClient$ cf push
Using manifest file /Users/pas/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSApiClient/manifest.yml

Updating app pas-telstrasmsapi-client in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

Using route pas-telstrasmsapi-client.mybluemix.net
Uploading pas-telstrasmsapi-client...
Uploading app files from: /Users/pas/ibm/DemoProjects/spring-starter/jazzhub/TelstraSMSApiClient/target/TelstraSMSApiClient-0.0.1-SNAPSHOT.jar
Uploading 806.8K, 121 files
Done uploading
OK
Binding service TelstraSMS-service to app pas-telstrasmsapi-client in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

Stopping app pas-telstrasmsapi-client in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

Starting app pas-telstrasmsapi-client in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
-----> Downloaded app package (16M)
-----> Downloaded app buildpack cache (1.2M)
-----> Liberty Buildpack Version: v1.19.1-20150622-1509
-----> Retrieving IBM 1.8.0_20150617 JRE (ibm-java-jre-8.0-1.0-pxa6480sr1ifx-20150617_03-cloud.tgz) ... (0.0s)
         Expanding JRE to .java ... (1.5s)
-----> Retrieving App Management 1.5.0_20150608-1243 (app-mgmt_v1.5-20150608-1243.zip) ... (0.0s)
         Expanding App Management to .app-management (0.9s)
-----> Downloading Auto Reconfiguration 1.7.0_RELEASE from https://download.run.pivotal.io/auto-reconfiguration/auto-reconfiguration-1.7.0_RELEASE.jar (0.0s)
-----> Liberty buildpack is done creating the droplet

-----> Uploading droplet (90M)

0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
0 of 1 instances running, 1 starting
1 of 1 instances running

App started


OK

App pas-telstrasmsapi-client was started using this command `$PWD/.java/jre/bin/java -Xtune:virtualized -Xmx384M -Xdump:none -Xdump:heap:defaults:file=./../dumps/heapdump.%Y%m%d.%H%M%S.%pid.%seq.phd -Xdump:java:defaults:file=./../dumps/javacore.%Y%m%d.%H%M%S.%pid.%seq.txt -Xdump:snap:defaults:file=./../dumps/Snap.%Y%m%d.%H%M%S.%pid.%seq.trc -Xdump:heap+java+snap:events=user -Xdump:tool:events=systhrow,filter=java/lang/OutOfMemoryError,request=serial+exclusive,exec=./.buildpack-diagnostics/killjava.sh $JVM_ARGS org.springframework.boot.loader.JarLauncher --server.port=$PORT`

Showing health and status for app pas-telstrasmsapi-client in org pasapi@au1.ibm.com / space apple as pasapi@au1.ibm.com...
OK

requested state: started
instances: 1/1
usage: 512M x 1 instances
urls: pas-telstrasmsapi-client.mybluemix.net
last uploaded: Sun Jul 19 15:22:26 UTC 2015

     state     since                    cpu    memory           disk           details
#0   running   2015-07-19 11:23:57 PM   0.8%   144.9M of 512M   149.8M of 1G

Step 6 - Send an SMS from the Bluemix application using the Telstra SMS service

6.1. Navigate to the URL below and send an SMS using the form.

http://pas-telstrasmsapi-client.mybluemix.net/


6.2. Verify it has sent a message to the phone number entered in the text field as shown below.



More Information 

Getting started with Bluemix is easy; navigate to http://bluemix.net to sign up and get going.
Categories: Fusion Middleware

Oracle Priority Support Infogram for 27-AUG-2015

Oracle Infogram - Thu, 2015-08-27 14:57

RDBMS
Understanding new In-Memory notes in an execution plan, from Oracle Database In-Memory
Migration IBM AIX ==> SPARC Solaris with Data Guard, from Upgrade your Database - NOW!
Tips on SQL Plan Management and Oracle Database In-Memory - Part 2, from the Oracle Optimizer blog.
Coding Oracle
Yet another CSV -> Table but with pipeline function, from Kris’ Blog.
WebLogic
Extending the WebLogic Console by adding Books, Pages and Portlets, from WebLogic Partner Community EMEA.
Java
9 tools to help you with Java Performance Tuning, from WebLogic Partner Community EMEA.

Asynchronous Support in JAX-RS 2/Java EE 7, from The Aquarium.
And from the same source:
Java API for JSON Binding (JSON-B) 1.0 Early Draft Now Available!
BI Publisher
Page Borders and Title Underlines, from the Oracle BI Publisher blog.
BPM
Getting Started with BPM: Free Oracle University Video Tutorial, from the SOA & BPM Partner Community Blog.
Mobile Computing
New Features : Oracle Mobile Security Suite Integration in Oracle MAF 2.1.3, from The Oracle Mobile Platform Blog.
Ops Center
How Many Systems Can Ops Center Manage?, from the Ops Center blog.
Demantra
How to avoid ORA-06512 and ORA-20000 when Concurrent Statistics Gathering is enabled. New in 12.1 Database, Concurrent Statistics Gathering, Simultaneous for Multiple Tables or Partitions, from the Oracle Demantra blog.
EBS
From the Oracle E-Business Suite Support blog:
New White Paper on Using Web ADI Spread Sheet in the Buyer Work Center Orders (BWC)
From the Oracle E-Business Suite Technology blog:
JRE 1.8.0_60 Certified with Oracle E-Business Suite

Using Application Management Suite With E-Business Suite 12.2

Adaptive Query Optimization in Oracle 12c : Ongoing Updates

Tim Hall - Thu, 2015-08-27 12:09

I’ve said a number of times that the process of writing articles is part of an ongoing learning experience for me. A few days ago my personal tech editor (Jonathan Lewis) asked about a statement I made in the SQL Plan Directive article. On further investigation, it turned out the sentence was a complete work of fiction on my part, based on my misunderstanding of something I read in the manual, as well as the assumption that everything that happens must be the result of a new feature. :)

Anyway, the offending statement has been altered, but the conversation this generated resulted in a new article about Automatic Column Group Detection.

The process also highlighted how difficult it is, at least for me, to know what is going on in the optimizer now. It wasn’t always straightforward before, but now, with the assorted new optimizations, some beating others to the punch, it is even more difficult. There are a number of timing issues involved too. If a statement runs twice in quick succession, you might get a different sequence of events compared to having a longer gap between the first and second run of the statement. It’s maddening at times. I’m hoping Jonathan will put pen to paper about this, because I think he will do a better job than I can of explaining the issues around the inter-dependencies.

Anyway, I will be doing another pass through this stuff over the coming days/weeks/months/years to make sure it is consistent with “my current understanding”. :)

Fun, fun, fun…

Cheers

Tim…


Summer Projects and a Celebration

Oracle AppsLab - Thu, 2015-08-27 09:02

If you follow us on Twitter (@theappslab) or on Facebook, you’ve seen some of the Summer projects coming together.

If not, here’s a recap of some of the tinkering going on in OAUX Emerging Technologies land.

Mark (@mvilrokx) caught the IoT bug from Noel (@noelportugal), and he’s been busy destroying and rebuilding a Nerf gun, which is a thing. Search for “nerf gun mods” if you don’t believe me.


Mark installed our favorite chip, the ESP8266, to connect the gun to the internets, and he’s been tinkering from there.

Meanwhile, Raymond (@yuhuaxie) has been busy building a smart thermostat.


And finally, completely unrelated to IoT tinkering, earlier this month the Oracle Mexico Development Center (MDC) in Guadalajara celebrated its fifth anniversary. As you know, we have two dudes in that office, Os (@vaini11a) and Luis (@lsgaleana), as well as an extended OAUX family. Congratulations.


Red Samurai ADF Performance Audit Tool v 4.0 - Web Client Request Monitoring and Complete Query Analysis

Andrejus Baranovski - Thu, 2015-08-27 03:12
I'm excited to announce that we have released a new version of our RSA audit tool. This is a major update after the previous version released in February 2015 - Red Samurai ADF Performance Audit Tool v 3.4 - ADF Task Flow Statistics with Oracle DMS Servlet Integration.

It has already been 3 years since the initial version - Red Samurai Performance Audit Tool - Runtime Diagnosis for ADF Applications. We are using it for many of our customers to monitor ADF performance in both test and production environments. Many new features were added during these years, and more features are to come.

RSA Audit v4.0 New Features

1. RSA Audit v4.0 dashboard is supporting ADF 12c and Alta UI look


2. Web Client Request Time monitoring. Supported with ADF 11g and 12c. A generic method tracks request time for all ADF UI components. Logged data can be analysed through the ADF UI dashboard or directly in the DB. Request time represents the complete time from a user action in the browser until the request is completed. This includes the real user experience - browser processing time, network time, server side time and ADF BC/DB processing times. Runs in VERBOSE logging mode.


3. Detailed information about the ADF fragment, button or other ADF UI component involved in a request is logged together with the request processing time and is accessible from the audit dashboard. This helps to identify slow actions spanning from the Web Client to the DB.


4. Information about each request is grouped; this allows us to compare differences between multiple requests and identify bottlenecks in application performance.


5. Duplicate Queries. Allows tracking of all executed VO’s - very helpful for identifying redundant VO executions. Groups VO executions per ECID, which helps to identify VO’s re-executed multiple times during the same request. Runs in MEDIUM logging mode.


6. VO’s executed from the same ECID are automatically highlighted - this simplifies redundant query analysis.


7. The number of duplicate executions of VO’s per ECID is calculated and presented in the table and sunburst chart.


8. We calculate top VO’s per AM. This helps to set priorities for SQL tuning and understand heavily used VO’s.


9. The sunburst chart displays a visual representation of duplicate and top VO’s per AM.

Page Borders and Title Underlines

Tim Dexter - Wed, 2015-08-26 15:32

I have taken to recording screen grabs to help some folks out on 'how do I' scenarios. Sometimes a 3 minute video saves a couple of thousand words and several screen shots.

So, perchance you need to know:

1. How to add a page border to your output and/or

2. How to add an underline that runs across the page

Watch this!   https://www.youtube.com/watch?v=3UcXHeSF0BM

If you need the template, sample data and output, get them here.

I'm taking requests if you have them.

Categories: BI & Warehousing

The Fraught Interaction Design of Personalized Learning Products

Michael Feldstein - Wed, 2015-08-26 12:49

By Michael Feldstein

David Wiley has a really interesting post up about Lumen Learning’s new personalized learning platform. Here’s an excerpt:

A typical high-level approach to personalization might include:

  • building up an internal model of what a student knows and can do,
  • algorithmically interrogating that model, and
  • providing the learner with a unique set of learning experiences based on the system’s analysis of the student model

Our thinking about personalization started here. But as we spoke to faculty and students, and pondered what we heard from them and what we have read in the literature, we began to see several problems with this approach. One in particular stood out:

There is no active role for the learner in this “personalized” experience. These systems reduce all the richness and complexity of deciding what a learner should be doing to – sometimes literally – a “Next” button. As these systems painstakingly work to learn how each student learns, the individual students lose out on the opportunity to learn this for themselves. Continued use of a system like this seems likely to create dependency in learners, as they stop stretching their metacognitive muscles and defer all decisions about what, when, and how long to study to The Machine.

Instructure’s Jared Stein really likes Lumen’s approach, writing,

So much work in predictive analytics and adaptive learning seeks to relieve people from the time-consuming work of individual diagnosis and remediation — that’s a two-edged sword: Using technology to increase efficiency can too easily sacrifice humanness — if you’re not deliberate in the design and usage of the technology. This topic came up quickly amongst the #DigPedNetwork group when Jim Groom and I chatted about closed/open learning environments earlier this month, suggesting that we haven’t fully explored this dilemma as educators or educational technologist.

I would add that I have seen very little evidence that either instructors or students place a high value on the adaptivity of these products. Phil and I have talked to a wide range of folks using these products, both in our work on the e-Literate TV case studies and in our general work as analysts. There is a lot of interest in the kind of meta-cognitive dashboarding that David is describing. There is little interest in, and in some cases active hostility toward, adaptivity. For example, Essex County College is using McGraw Hill’s ALEKS, which has one of the more sophisticated adaptive learning approaches on the market. But when we talked to faculty and staff there, the aspects of the program that they highlighted as most useful were a lot more mundane, e.g.,

It’s important for students to spend the time, right? I mean learning takes time, and it’s hard work. Asking students to keep time diaries is a very difficult ask, but when they’re working in an online platform, the platform keeps track of their time. So, on the first class day of the week, that’s goal-setting day. How many hours are you going to spend working on your math? How many topics are you planning to master? How many classes are you not going to be absent from?

I mean these are pretty simple goals, and then we give them a couple goals that they can just write whatever they feel like. And I’ve had students write, “I want to come to class with more energy,” and other such goals. And then, because we’ve got technology as our content delivery system, at the end of the week I can tell them, in a very efficient fashion that doesn’t take up a lot of my time, “You met your time goal, you met your topic goal,” or, “You approached it,” or, “You didn’t.”

So one of the most valuable functions of this system in this context is to reflect back to the students what they have done in terms that make sense to them and are relevant to the students’ self-selected learning goals. The measures are fairly crude—time on task, number of topics covered, and so on—and there is no adaptivity necessary at all.

But I also think that David’s post hints at some of the complexity of the design challenges with these products.

You can think of the family of personalized learning products as having potentially two components: diagnostic and prescriptive. Everybody who likes personalized learning products in any form likes the diagnostic component. The foundational value proposition for personalization (which should not in any way be confused with “personal”) is having the system provide feedback to students and teachers about what the student does well and where the student is struggling. Furthermore, the perceived value of the product is directly related to the confidence that students and teachers have that the product is rendering an accurate diagnosis. That’s why I think products that provide black box diagnoses are doomed to market failure in the long term. As the market matures, students and teachers are going to want to know not only what the diagnosis is but what the basis of the diagnosis is, so that they can judge for themselves whether they think the machine is correct.

Once the system has diagnosed the student’s knowledge or skill gaps—and it is worth calling out that many of these personalized learning systems work on a deficit model, where the goal is to get students to fill in gaps—the next step is to prescribe actions that will help students to address those gaps. Here again we get into the issue of transparency. As David points out, some vendors hide the rationale for their prescriptions, even going so far as to remove user choice and just hide the adaptivity behind the “next” button. Note that the problem isn’t so much with providing a prescription as it is with the way in which it is provided. The other end of the spectrum, as David argues, is to make recommendations. The full set of statements from a well-behaved personalized learning product to a student or teacher might be something like the following:

  1. This is where I think you have skill or knowledge gaps.
  2. This is the evidence and reasoning for my diagnosis.
  3. This is my suggestion for what you might want to do next.
  4. This is my reasoning for why I think it might help you.

It sounds verbose, but it can be done in fairly compact ways. Netflix’s “based on your liking Movie X and Movie Y, we think you would give Movie Z 3.5 stars” is one example of a compact explanation that provides at least some of this information. There are lots of ways that a thoughtful user interface designer can think about progressively revealing some of this information and providing “nudges” that encourage students on certain paths while still giving them the knowledge and freedom they need to make choices for themselves. The degree to which the system should be heavy-handed in its prescription probably depends in part on the pedagogical model. I can see something closer to “here, do this next” feeling appropriate in a self-paced CBE course than in a typical instructor-facilitated course. But even there, I think the Lumen folks are 100% right that the first responsibility of the adaptive learning system should be to help the learner understand what the system is suggesting and why so that the learner can gain better meta-cognitive understanding.

None of which is to say that the fancy adaptive learning algorithms themselves are useless. To the contrary. In an ideal world, the system will be looking at a wide range of evidence to provide more sophisticated evidence-based suggestions to the students. But the key word here is “suggestions.” Both because a critical part of any education is teaching students to be more self-aware of their learning processes and because faulty prescriptions in an educational setting can have serious consequences, personalized learning products need to evolve out of the black box phase as quickly as possible.

 

 


Inside View Of Blackboard’s Moodle Strategy In Latin America

Michael Feldstein - Wed, 2015-08-26 11:45

By Phil Hill

One year ago Blackboard’s strategy for Moodle was floundering. After the 2012 acquisition of Moodlerooms and Netspot, Blackboard had kept its promises of supporting the open source community – and in fact, Blackboard pays much more than 50% of the total revenue going to Moodle HQ[1] – but that does not mean they had a strategy. Key Moodlerooms employees were leaving, and the management was frustrated. Last fall the remaining Moodlerooms management put together an emerging strategy to invest in (through corporate M&A) and grow the Moodle business, mostly outside of the US.

In just the past twelve months, Blackboard has acquired three Moodle-based companies – Remote-Learner UK (Moodle Partner in the UK), X-Ray Analytics (learning analytics for Moodle), and Nivel Siete (Moodle Partner in Colombia). When you add organic growth to these acquisitions, Blackboard has added ~450 new clients using Moodle in this same time period, reaching a current total of ~1400.

This is a change worth exploring. To paraphrase Michael’s statements to me and in his recent BbWorld coverage:

If you want to understand Blackboard and their future, you have to understand what they’re doing internationally. If you want to understand what they’re doing internationally, you have to understand what they’re doing with Moodle.

Based on this perspective, I accepted an invitation from Blackboard to come visit Nivel Siete last week to get a first-hand view of what this acquisition means. I also attended the MoodleMoot Colombia #mootco15 conference and talked directly to Moodle customers in Latin America. Let’s first unpack that last phrase.

  • Note that due to the nature of this trip, I “talked directly” with Blackboard employees, Nivel Siete employees, Blackboard resellers, and Nivel Siete customers. They did give me free access to talk privately with whoever I wanted to, but treat this post as somewhat of an inside view rather than one that also includes perspectives from competitors.
  • “Moodle” is very significant in Latin America. It is the default LMS that dominates learning environments. The competition, or alternative solution, there is Blackboard Learn or . . . another route to get Moodle. In this market D2L and Canvas have virtually no presence – each company has just a couple of clients in Latin America and neither is currently a factor in LMS decision-making. Schoology has one very large customer in Uruguay serving hundreds of thousands of students. Blackboard Learn serves the top of the market – e.g. the top 10% in terms of revenue of Colombian institutions, where it already serves the majority of that sub-market according to the people I talked to. For the remaining 90%, it is pretty much Moodle, Moodle, alternate applications that are not LMSs, or nothing.[2]
  • I chose “customers” instead of “schools” or “institutions” for a reason. What is not understood in much of the education community is that Moodle has a large footprint outside of higher ed and K-12 markets. Approximately 2/3 of Nivel Siete’s clients are in corporate learning, and several others are government. And this situation is quite common for Moodle. In the US, more than 1/3 of Moodlerooms’ and approximately 1/2 of Remote-Learner’s customers are corporate learning. Phill Miller, the VP of International for Moodlerooms, said that for most of the Moodle hosting and service providers he has met, they also are serving corporate clients at similar numbers as education.
  • I chose “Latin America” instead of “Colombia” for a reason. While all but ~12 of Nivel Siete’s existing clients are in Colombia, Blackboard bought the company to act as a center of excellence or support service company for most of Latin America – Colombia, Mexico, Brazil, and Peru in particular. Cognos Online, their current local reseller for Latin America for core Blackboard products (Learn, Collaborate, etc) will become the reseller also for their Moodle customers. Nivel Siete will support a broader set of clients. In other words, this is not a simple acquisition of customers – it is an expansion of international presence.

And while we’re at it, the conference reception included a great opera mini flash mob (make sure to watch past 0:37):

Nivel Siete

Nivel Siete (meaning Level 7, a reference from two of the founders’ college days, when a professor talked about the need to understand deeper levels of the technology stack than just the top-level applications that customers see) is a company of just over 20 employees in Bogota. They have 237+ clients, and that number is growing. During the three days I was there, they signed several new contracts. They offer Moodle hosting and service in a cloud environment based on Amazon Web Services (AWS) – not true SaaS, as they allow multiple software versions in production and have not automated all provisioning or upgrade processes. What they primarily offer, according to the founders, is a culture of how to service and support using cloud services, along with specific marketing and sales techniques.

In Latin America, most customers care more about the local sales and support company than they do about the core software. As one person put it, they believe in skin-to-skin sales, where clients have relationships they trust as long as solutions are provided. Most LMS customers in Latin America do not care as much about the components of that solution as they do about relationships, service, and price. And yet, due to open source software and lightweight infrastructure needs, Moodle is dominant as noted above. The Moodle brand, code base, and code licensing does not matter as much as the Moodle culture and ecosystem. From a commercial standpoint, Nivel Siete’s competitors include a myriad of non Moodle Partner hosting providers – telcos bundling in hosting, mom-and-pop providers, self-hosting – or non-consumption. For a subset of the market, Nivel Siete has competed with Blackboard Learn.

Beyond Cognos Online, Blackboard has another ~9 resellers in Latin America, and Nivel Siete (or whatever they decide to name the new unit) will support all of these resellers. This is actually the biggest motivation other than cash for the company to sell – they were seeking methods to extend their influence, and this opportunity made the most sense.

Blackboard Learn and Ultra

What about that Learn sub-market? Most clients and sales people (resellers as well as Blackboard channel managers) are aware of Learn Ultra, but the market seems to understand already that Ultra is not for them . . . yet. They appear to be taking a ‘talk to me when it’s done and done in Spanish’ approach and not basing current decisions on Ultra. In this sense, the timing for Ultra does not matter all that much, as the market is not waiting on it. Once Ultra is ready for Latin America, Blackboard sales (channel managers and resellers) expect the switchover to be quicker than in the US, as LMS major upgrades (involving major UI and UX changes) or adoptions tend to take weeks or months instead of a year or more as we often see in the States. At least in the near term, Learn Ultra is not a big factor in this market.

What Blackboard is best known for in this market is the large SENA contract running on Learn. SENA (National Service for Learning) is a government organization that runs the majority of all vocational colleges – providing certificates and 2-year vocational degrees mostly for lower-income students, a real rising middle class move that is important in developing countries. Blackboard describes SENA as having 6+ million total enrollment, with ~80% in classrooms and ~20% in distance learning.

Integration

The challenge Blackboard faces is integrating its Learn and Moodle operations through the same groups – Nivel Siete internal group, Cognos Online and other resellers serving both lines – without muddling the message and go-to-market approach. Currently Learn is marketed and sold through traditional enterprise sales methods – multiple meetings, sales calls, large bids – while Nivel Siete’s offering of Moodle is marketed and sold with more of a subscription-based mentality. As described by ForceManagement:

A customer who has moved to a subscription-based model of consumption has completely different expectations about how companies are going interact with them.

How you market to them, how you sell to them, how you bill them, how you nurture the relationship – it’s all affected by the Subscription Economy. The customer’s idea of value has changed. And, if the customer’s idea of value has changed, your value proposition should be aligned accordingly. [snip]

The subscription-based sales process relies less on the closing of a sale and more on the nurturing of a long-term relationship to create lifetime customer value.

One of Nivel Siete’s most effective techniques is The e-Learner Magazine, which highlights customers telling their own stories and lessons in a quasi-independent fashion. The company has relied on inbound calls and quick signups and service startups. There is quite a cultural difference between enterprise software and subscription-based approaches. While Blackboard themselves are facing such changes due to Ultra and newly-offered SaaS models, the group in Latin America is facing the challenge of two different cultures served by the same organizations today.

To help address this challenge, Cognos Online is planning to have two separate teams selling / servicing mainline Blackboard products and Moodle products. But even then, CEO Fernery Morales described that their biggest risk is muddling the message and integrating appropriately.

Moodle Strategy and Risk

At the same time, this strategy and growth comes at a time where the Moodle community at large appears to be at an inflection point. This inflection point I see comes from a variety of triggers:

  • Blackboard acquisitions causing Moodle HQ, other Moodle Partners, and some subset of users’ concerns about commercialization;
  • Creation of the Moodle Association as well as Moodle Cloud services as alternate paths to Moodle Partners for revenue and setup; and
  • Remote-Learner leaving the Moodle Partner program and planning to join the Moodle Association, with the associated lost revenue and public questioning of the program’s value.

I don’t have time to fully describe these changes here, but Moodle itself is both an opportunity and a risk mostly based on its own success globally. More of that in a future post.

What Does This Mean Beyond Latin America?

It’s too early to fully know, but here are a few notes.

  • Despite the positioning in the US media, there is no “international” market. There are multiple local or regional markets outside of the US that have tremendous growth opportunities for US and other companies outside of those immediate markets. Addressing these markets puts a high premium on localization – having feet on the ground for people who know the culture, can be trusted in the region, and including product customizations meant for those markets. Much of the ed tech investment boom is built on expectations of international growth, but how many ed tech companies actually know how to address local or regional non-US markets? This focus on localizing international markets is one of Blackboard’s greatest strengths.
  • Based on the above, at least in Latin America Blackboard is building itself up as the status quo before other learning platforms really get a chance to strategically enter the market. For example, Instructure has clearly not chosen to go after non English-speaking international markets yet, but by the time they do push Canvas into Latin America, if Blackboard is successful integrating Nivel Siete, it is likely Instructure will face an entrenched competitor and potential clients who by default assume Moodle or Learn as solutions.
  • Blackboard as a company has one big growth opportunity right now – the collection of non-US “international” markets that represent just under 1/4 of the company’s revenue. Domestic higher ed is not growing, K-12 is actually decreasing, but international is growing. These growing markets need Moodle and traditional Learn 9.1 much more than Ultra. I suspect that this growing importance is creating more and more tension internal to Blackboard, as the company needs to balance Ultra with traditional Learn and Moodle development.
  • While I strongly believe in the mission of US community colleges and low-cost 4-year institutions, in Latin America the importance of education in building up an emerging middle class is much greater than in the US. We hear this “importance of education” and “building of the middle class” used in generic terms regarding ed tech potential, but seeing this connection more closely by being in-country is inspiring. This is a real global need that can and should drive future investment in people and technology to address.
  1. This information is based on a tweet last spring showing that Moodlerooms + Netspot combined were more than 50% of revenue, and that the next largest Moodle Partner, Remote-Learner, has left the program. Since last year I have confirmed this information through multiple sources.
  2. Again, much of this information is from people related to Blackboard, but it also matches my investigation of press releases and public statements about specific customers of D2L and Instructure.


DELETE is faster than TRUNCATE

Laurent Schneider - Wed, 2015-08-26 07:18

Truncate is useful in some serial batch processing, but it breaks read-write consistency, generates strange errors and results for running selects, and it needs DROP ANY TABLE when run over a table that you do not own.

But also, DELETE is faster in the following test case.

In 12c, you could have over one million partitions in a table, but for the sake of the universe, I’ll try with 10000.


SQL> create table scott.t(x) 
  partition by range(x) 
  interval(1) 
  (partition values less than (0)) 
  as 
  select rownum 
  from dual 
  connect by level<10001;
SQL> select count(*) from scott.t;

  COUNT(*)
----------
     10000

The 10K-row table is created; each row is its own partition.


SQL> delete scott.t;

10000 rows deleted.

Elapsed: 00:00:04.02
SQL> rollback;

Rollback complete.

Not tuned or parallelized or whatever. It took 4 seconds for 10’000 rows. If you have one billion rows, it is doable in a few hours. But you had better do it in chunks then.
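
For example, a minimal chunked-delete sketch (my own, not part of the original test case; the 100’000-row chunk size is arbitrary) could look like this:

SQL> begin
  loop
    delete scott.t where rownum <= 100000;
    exit when sql%rowcount = 0;
    commit;
  end loop;
  commit;
end;
/

Each chunk is committed separately, so the undo requirement stays bounded no matter how large the table is.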

Anyway, let’s truncate


SQL> truncate table scott.t;

Table truncated.

Elapsed: 00:05:19.24

Five minutes!!! To truncate that tiny table.

If you have one million partitions and underlying indexes and lobs, it will probably fail with out-of-memory errors after hours, with a large impact on the dictionary, sysaux and undo.

The dictionary changes are very slow here.

Protect Your APEX Application PL/SQL Source Code

Pete Finnigan - Wed, 2015-08-26 04:35

Oracle Application Express is a great rapid application development tool where you can write your applications functionality in PL/SQL and create the interface easily in the APEX UI using all of the tools available to create forms and reports and....[Read More]

Posted by Pete On 21/07/15 At 04:27 PM

Categories: Security Blogs

Oracle Security and Electronics

Pete Finnigan - Wed, 2015-08-26 04:35

How do Oracle Security and Electronics mix together? - Well, I started my working life in 1979 as an apprentice electrician in a factory here in York, England where I live. The factory designed and built trains for the national....[Read More]

Posted by Pete On 09/07/15 At 11:24 AM

Categories: Security Blogs

New Conference Speaking Dates Added

Pete Finnigan - Wed, 2015-08-26 04:35

In the last few years I have not done as many conference speaking dates as I used to. This is simply because when offered they usually clashed with pre-booked work. I spoke for the UKOUG in Dublin last year and....[Read More]

Posted by Pete On 06/07/15 At 09:40 AM

Categories: Security Blogs

Happy 10th Belated Birthday to My Oracle Security Blog

Pete Finnigan - Wed, 2015-08-26 04:35

Make a Sad Face..:-( I seem to have missed my blog's tenth birthday, which happened on the 20th September 2014. My last post last year, and until very recently, was on July 23rd 2014; so actually it's been a big gap....[Read More]

Posted by Pete On 03/07/15 At 11:28 AM

Categories: Security Blogs

Oracle Database Vault 12c Paper by Pete Finnigan

Pete Finnigan - Wed, 2015-08-26 04:35

I wrote a paper about Oracle Database Vault in 12c for SANS last year and this was published in January 2015 by SANS on their website. I also prepared and did a webinar about this paper with SANS. The Paper....[Read More]

Posted by Pete On 30/06/15 At 05:38 PM

Categories: Security Blogs

Unique Oracle Security Trainings In York, England, September 2015

Pete Finnigan - Wed, 2015-08-26 04:35

I have just updated all of our Oracle Security training offerings on our company website. I have revamped all class pages and added two page pdf flyers for each of our four training classes. In have also updated the list....[Read More]

Posted by Pete On 25/06/15 At 04:36 PM

Categories: Security Blogs

Coding in PL/SQL in C style, UKOUG, OUG Ireland and more

Pete Finnigan - Wed, 2015-08-26 04:35

My favourite language is hard to pinpoint; is it C or is it PL/SQL? My first language was C and I love the elegance and expression of C. Our product PFCLScan has its main functionality written in C. The....[Read More]

Posted by Pete On 23/07/14 At 08:44 PM

Categories: Security Blogs

Integrating PFCLScan and Creating SQL Reports

Pete Finnigan - Wed, 2015-08-26 04:35

We were asked by a customer whether PFCLScan can generate SQL reports instead of the normal HTML, PDF, MS Word reports so that they could potentially scan all of the databases in their estate and then insert either high level....[Read More]

Posted by Pete On 25/06/14 At 09:41 AM

Categories: Security Blogs

On docker, Ubuntu and Oracle RDBMS

Marcelo Ochoa - Tue, 2015-08-25 19:17
I have had the Oracle RDBMS working on Ubuntu for a long time (12.04 and 14.04), RDBMS versions 10g, 11g and 12c, with some tweaks to get it working.
Apart from the effort to get it working, it sometimes requires Makefile modifications; these configurations are not supported by Oracle, and for sure you couldn't report any bug.
To solve this, it is easy to get VirtualBox working and download a pre-built VM, but that requires a lot of hardware resources :(
Happily, Docker comes into action; I followed this great post by Frits Hoogland for installing Oracle Database in Docker.
First we need Docker running on Ubuntu; there are many guides about that, including simply installing from the apt-get repository, which is my case.
Following Frits's guide, I changed the default repository destination to use an external USB disk formatted with btrfs:
# df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1        15G  7,7G  6,5G  55% /
none            4,0K     0  4,0K   0% /sys/fs/cgroup
udev            3,9G   12K  3,9G   1% /dev
tmpfs           789M  1,4M  788M   1% /run
none            5,0M     0  5,0M   0% /run/lock
none            3,9G  344M  3,6G   9% /run/shm
none            100M   56K  100M   1% /run/user
/dev/sda6       442G  384G   36G  92% /home
/dev/sdb        597G   18G  577G   3% /var/lib/docker

Also I created the directory for the Docker template, but with some changes; here are the files:

root@local:/# cd /var/lib/docker/dockerfiles/build-oracle-12102/
root@local:/var/lib/docker/dockerfiles/build-oracle-12102# ls -l
total 2625148
-rw-r--r-- 1 root root      10976 ago 23 10:07 db_install.dbt
-rw-r--r-- 1 root root      10931 ago 25 16:30 db_install-full.dbt
-rw-r--r-- 1 root root      10972 ago 25 16:30 db_install-simple.dbt
-rw-r--r-- 1 root root       1168 ago 25 11:09 Dockerfile
-rw-r--r-- 1 root root 1673544724 ago 22 20:36 linuxamd64_12102_database_1of2.zip
-rw-r--r-- 1 root root 1014530602 ago 22 20:36 linuxamd64_12102_database_2of2.zip
-rwxr-xr-x 1 root root       1729 ago 25 10:11 manage-oracle.sh
-rw-r--r-- 1 root root      24542 ago 24 20:42 responsefile_oracle12102.rsp

The content of Dockerfile is:

FROM    oraclelinux:6
MAINTAINER marcelo.ochoa@gmail.com
RUN groupadd -g 54321 oinstall
RUN groupadd -g 54322 dba
RUN useradd -m -g oinstall -G oinstall,dba -u 54321 oracle
RUN yum -y install oracle-rdbms-server-12cR1-preinstall perl wget unzip
RUN mkdir /u01
RUN chown oracle:oinstall /u01
USER    oracle
WORKDIR /home/oracle
COPY linuxamd64_12102_database_1of2.zip /home/oracle/
COPY linuxamd64_12102_database_2of2.zip /home/oracle/
COPY responsefile_oracle12102.rsp /home/oracle/
RUN unzip linuxamd64_12102_database_1of2.zip
RUN unzip linuxamd64_12102_database_2of2.zip
RUN rm linuxamd64_12102_database_1of2.zip linuxamd64_12102_database_2of2.zip
RUN /home/oracle/database/runInstaller -silent -force -waitforcompletion -responsefile /home/oracle/responsefile_oracle12102.rsp -ignoresysprereqs -ignoreprereq
USER    root
RUN /u01/app/oraInventory/orainstRoot.sh
RUN /u01/app/oracle/product/12.1.0.2/dbhome_1/root.sh -silent
RUN rm -rf /home/oracle/responsefile_oracle12102.rsp /home/oracle/database
USER    oracle
WORKDIR /home/oracle
RUN     mkdir -p /u01/app/oracle/data
COPY    manage-oracle.sh /home/oracle/
EXPOSE  1521
CMD sh -c /home/oracle/manage-oracle.sh

The remarked lines differ from the original post by Frits: instead of downloading everything from the web using wget, I downloaded the 12c binary distribution from the OTN Web Site and copied the two zip files into the directory where the Dockerfile resides. I also downloaded Frits's files responsefile_oracle12102.rsp, manage-oracle.sh and db_install.dbt. The file responsible for creating/starting/stopping the DB was also modified to use db_install.dbt from the host machine; here is the modified version of manage-oracle.sh:

#!/bin/bash
PERSISTENT_DATA=/u01/app/oracle/data
export ORACLE_HOME=/u01/app/oracle/product/12.1.0.2/dbhome_1
export ORACLE_SID=$(hostname)
stop_database() {
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
shutdown abort
exit
EOF
exit
}
start_database() {
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
startup
exit
EOF
}
create_pfile() {
$ORACLE_HOME/bin/sqlplus -S / as sysdba << EOF
set echo off pages 0 lines 200 feed off head off sqlblanklines off trimspool on trimout on
spool $PERSISTENT_DATA/init_$(hostname).ora
select 'spfile="'||value||'"' from v\$parameter where name = 'spfile';
spool off
exit
EOF
}
trap stop_database SIGTERM
printf "LISTENER=(DESCRIPTION_LIST=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=$(hostname))(PORT=1521))(ADDRESS=(PROTOCOL=IPC)(KEY=EXTPROC1521))))\n" > $ORACLE_HOME/network/admin/listener.ora
$ORACLE_HOME/bin/lsnrctl start
if [ ! -f ${PERSISTENT_DATA}/DATABASE_IS_SETUP ]; then
sed -i "s/{{ db_create_file_dest }}/\/u01\/app\/oracle\/data\/$(hostname)/" $PERSISTENT_DATA/db_install.dbt
sed -i "s/{{ oracle_base }}/\/u01\/app\/oracle/" $PERSISTENT_DATA/db_install.dbt
sed -i "s/{{ database_name }}/$(hostname)/" $PERSISTENT_DATA/db_install.dbt
$ORACLE_HOME/bin/dbca -silent -createdatabase -templatename $PERSISTENT_DATA/db_install.dbt -gdbname $(hostname) -sid $(hostname) -syspassword oracle -systempassword oracle -dbsnmppassword oracle
create_pfile
if [ $? -eq 0 ]; then
touch ${PERSISTENT_DATA}/DATABASE_IS_SETUP
fi
else
mkdir -p /u01/app/oracle/admin/$(hostname)/adump
$ORACLE_HOME/bin/sqlplus / as sysdba << EOF
startup pfile=$PERSISTENT_DATA/init_$(hostname).ora
exit
EOF
fi
tail -f /u01/app/oracle/diag/rdbms/$(hostname)/*/trace/alert_$(hostname).log &
wait

I am doing this so I can quickly create different databases with some Oracle options enabled or not; for example, the db_install.dbt file for OLS searching will have these enabled:

option name="JSERVER" value="true"
option name="SPATIAL" value="true"
option name="IMEDIA" value="true"
option name="XDB_PROTOCOLS" value="true"
option name="ORACLE_TEXT" value="true"
.....

Creating the docker template is similar to the original post:

# cd /var/lib/docker/dockerfiles/build-oracle-12102
# docker build -t "oracle-12102" .
# docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             VIRTUAL SIZE
oracle-12102        latest              24687eeab73c        8 hours ago         12.26 GB
oraclelinux         6                   cfc75fa9f295        3 weeks ago         156.2 MB

Finally, to create a full featured, Java enabled 12c database, I created a directory with the following content:

# mkdir -p /var/lib/docker/db/ols
# cp /var/lib/docker/dockerfiles/build-oracle-12102/db_install-full.dbt  /var/lib/docker/db/ols/db_install.dbt
# chown -R 54321:54321 /var/lib/docker/db/ols

and executed:

# docker run --ipc=host --volume=/var/lib/docker/db/ols:/u01/app/oracle/data --name ols --hostname ols --detach=true oracle-12102
25efb5d26aad31e7b06a8e2707af7c25943e2e42ec5c432dc9fa55f0da0bdaef

The container is started and the database creation works perfectly; here is the output:

# docker logs -f ols
LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 25-AUG-2015 16:35:18
Copyright (c) 1991, 2014, Oracle.  All rights reserved.
Starting /u01/app/oracle/product/12.1.0.2/dbhome_1/bin/tnslsnr: please wait...
TNSLSNR for Linux: Version 12.1.0.2.0 - Production
System parameter file is /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Log messages written to /u01/app/oracle/diag/tnslsnr/ols/listener/alert/log.xml
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ols)(PORT=1521)))
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=ols)(PORT=1521)))
STATUS of the LISTENER
------------------------
Alias                     LISTENER
Version                   TNSLSNR for Linux: Version 12.1.0.2.0 - Production
Start Date                25-AUG-2015 16:35:19
Uptime                    0 days 0 hr. 0 min. 1 sec
Trace Level               off
Security                  ON: Local OS Authentication
SNMP                      OFF
Listener Parameter File   /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Listener Log File         /u01/app/oracle/diag/tnslsnr/ols/listener/alert/log.xml
Listening Endpoints Summary...
  (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ols)(PORT=1521)))
  (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
The listener supports no services
The command completed successfully
Creating and starting Oracle instance
1% complete
.....
25% complete
Adding Oracle JVM
32% complete
....
Adding Oracle Text
50% complete
....
Adding Oracle Multimedia
55% complete
....
Adding Oracle Spatial
69% complete
Adding Oracle Spatial
69% complete
....
Adding Oracle Application Express
82% complete
87% complete
Completing Database Creation

Next steps: installing Scotas OLS for testing, and happy Docker/Oracle combination :)
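
Before installing anything else, a quick way to verify the new database (a sketch of my own, not from the original post; the paths come from the Dockerfile and manage-oracle.sh above) is to open a shell in the running container and connect with sqlplus:

# docker exec -it ols bash
[oracle@ols ~]$ export ORACLE_HOME=/u01/app/oracle/product/12.1.0.2/dbhome_1
[oracle@ols ~]$ export ORACLE_SID=ols
[oracle@ols ~]$ $ORACLE_HOME/bin/sqlplus / as sysdba

If the instance is up, you should land at an SQL> prompt connected as SYSDBA.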

Oracle Sales Cloud is a Dynamic Force at Oracle OpenWorld 2015

Linda Fishman Hoyle - Tue, 2015-08-25 16:29

A Guest Post by Michael Richter, Director of Product Management, Oracle

OpenWorld is returning to San Francisco–October 25-29, 2015!

Customer Experience (CX) will take over Moscone West 2nd floor this year, and Oracle Sales Cloud is excited to present this year’s lineup of customer experience sessions and demonstrations for sales professionals.

Learn about the newest enhancements or get the latest product demonstrations from the product management and sales consulting teams.

Visit Sales―CX Central@OpenWorld for details. All Oracle Sales Cloud sessions will take place in rooms 2003 or 2004 in Moscone West on the 2nd floor. The Oracle Sales Cloud demo zone is also situated on the 2nd floor. Dates, times, and room numbers will be published at the link above in early September.

What’s New and Different?

  • Release essentials and roadmap sessions for Oracle Sales Cloud and Configure, Price, and Quote
  • Pre-configured industry solution session and product demonstrations
  • Partner-led interactive workshops and hands-on labs
  • Sessions showing new ways of how to migrate or integrate with existing applications

Guest Customer and Partner Appearances

Deloitte Digital, Panasonic Manufacturing (UK), Wilsonart, Accenture, Schneider Electric, Accuride, Config Consultants, DB Schenker, Swiss Post, KEC International, Batesville, Serene, ustudio, Hitachi Consultants, KPN, Oceaneering, TMEIC, TH March, GPUK (Global Pay), Prisio Technologies, Perficient, Infosys, Apex IT, e-Verge, General Electric, Tuff Shed, and more!

Kick-Off with the Sales General Session!

Deloitte Digital teams up with GVP Bill Hunt, Oracle Sales Cloud Development, to kick off the CX-Sales track (GEN4525) at 1:00 – 2:15 p.m. in Room 2003. Hear about Oracle’s product strategy and what Deloitte Digital is doing to meet today’s customer experience challenges.

The Release Essentials and Roadmap Conference Sessions

Oracle Sales Cloud will host more than 30 conference sessions this year. The first in a series of seven Release Essentials and Roadmap sessions begins at 2:45 p.m. on Monday, October 26. These sessions are led by Oracle Sales Cloud product management team members and include customer and partner case studies.

  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Analytics [CON4508]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Sales Performance Management [CON7951]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for PRM: High Tech and Manufacturing [CON4529]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Configuring and Customizing [CON7952]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Outlook and IBM Notes [CON7950]
  • Oracle Sales Cloud Release 11 Essentials and Roadmap for Oracle Customer Data Hub and DaaS [CON7958]
  • Oracle Configure, Price, and Quote Cloud Essentials and Roadmap [CON4548]

Sessions of Special Interest

  • Oracle Sales Cloud: A Temperature Check for Customers by Nucleus Research [CON8191]
  • Oracle Sales Cloud: the Path to the Cloud for Siebel Customers [CON4537]
  • Oracle Configure, Price, and Quote Cloud: Driving Sales Productivity at Schneider Electric [CON8187]
  • Oracle Sales Cloud Hands-on Lab: Oracle Sales Cloud Mobile [HOL8735]
  • Oracle Sales Cloud Hands-on Lab: Easy to Use—Anytime, Anywhere [HOL8665]

Customer Panels

We have scheduled two popular customer panels:

  • Oracle Sales Cloud Analytics [CON9140] with Batesville and Fike
  • Oracle Sales Cloud Customer Panel: the Challenges of Digital Transformation [CON7954] with KEC, Tuff Shed, and TH March

Partner Panels

Hear from the experts at the partner panels.

  • Oracle Sales Cloud: Strategies for Successful CRM Implementations [CON8196] with panelists Config Consultants, Infosys, Apex IT, and e-Verge
  • Oracle Sales Cloud: Delivering Key Capabilities of Modern Selling [CON9660] with Accenture, Perficient, Hitachi Consulting, and Serene

Sales Demo Zone

Take part in Oracle Sales Cloud product demonstrations led by members of the Oracle Sales Cloud product management and sales consulting teams. The Buzz House is located next to the CX-Sales demo zone, so you can relax with a cup of coffee or a snack.

  • New enhancements for core sales force automation and mobile solutions
  • How Oracle ensures data integrity with Customer Data Management
  • The latest developments for analytics
  • Streamlining with Configure, Price, and Quote
  • Oracle Sales Cloud integration with MS Outlook and IBM Notes
  • Incentive Compensation, Territory Management, and Quota Management
  • How to configure and customize and the tools available to you
  • Learn about integrations and migration processes and tools
  • Learn what’s in store for midsize companies

At a Glance

Visit Sales―CX Central@OpenWorld for full details on speakers, conference sessions, exhibits, and entertainment, and experience all that Oracle OpenWorld has to offer. Dates, times, and room numbers will be published at the link above in early September.