
Feed aggregator

Data Integration Tips: ODI 12.1.3 – Convert to Flow

Rittman Mead Consulting - Thu, 2015-04-16 13:23

Many of those who have already upgraded Oracle Data Integrator from the 11g version to 12c probably know about this great feature called “convert to flow”. If not, well…here you go!

First, a little background on why I think this is an excellent bit of functionality. The ODI Upgrade Assistant will convert objects from 11g to 12c, and it does a pretty decent job of it. When converting Interfaces, the upgrade process creates a Mapping in ODI 12c by taking the logical “mapping” layout and loading it into a Dataset object. I assumed the reason was that it wasn’t easy to convert an Interface directly into a full-on flow-based mapping, which is what you would typically develop in ODI 12.1.3 rather than using the limited Dataset (only joins, filters, and lookups are allowed). After the upgrade, you would then be stuck with loads of mappings that don’t use the latest flow-based features and components.

[Screenshot: interface-and-mapping-ds]

Now, in ODI 12.1.3, we have the ability to convert our Dataset into the standard ODI 12c flow based components within the Mapping. With a right-click on the Dataset component, we can see the “Convert to Flow” option.

[Screenshots: convert-to-flow, confirm]

Select Convert to Flow and accept the warning that our Mapping will be changed forever…and boom! No more Dataset!

This is great for my individual Mappings, but now I want to convert my migrated Reusable Mapping Datasets to flow based components.

[Screenshot: reusable-convert-to-flow-missing-ds]

Wait, what? No option to Convert to Flow! It looks like the Reusable Mappings (which were upgraded from my ODI 11g Temporary Interfaces) cannot be converted to flow for some reason. Hmm… Well, let’s finish converting my Datasets to flow based components for the rest of my 342 upgraded Mappings…one-by-one. Yikes! Actually, we can find a better way to do this. Time to get Groovy with the ODI SDK!

Using Groovy, I can create a simple script to loop through all of my mappings, find the dataset, and call the convertToFlow function on that dataset component. Here’s a look at the guts of the script.

for (mapping in mappingsList) {
  def componentsList = mapping.getAllComponentsOfType("DATASET")

  for (component in componentsList) {
    // the issues list must be instantiated before the call
    java.util.List convertIssues = new ArrayList()

    try {
      def converted = component.convertToFlow(convertIssues)

      if (converted) {
        // report any issues raised during the conversion
        for (item in convertIssues) {
          out.println item.toString()
        }
      }

      out.println mapping.getName() + " had a dataset converted to flow."
    } catch (Exception e) {
      out.println e
    }
  }
}

Just remember to pass the results list object as a parameter to the convertToFlow call (and make sure the List object is properly instantiated as an ArrayList – as I was humbly reminded by David Allan via Twitter!). Once completed, you should be able to open each mapping and see that the dataset has been removed and only flow-based components exist.

Excellent, now we’ve completed our conversion in no time at all. But wait, what about those Reusable Mappings? Remember, we don’t have the right-click menu option to convert to flow as we did with the standard Mapping. Well, let’s see what our friend the ODI SDK has to say about that!

With a slight tweak to the code, replacing the Mapping classes with ReusableMapping classes, we can batch convert our Reusable Mapping dataset components to flow-based components in an instant. The reason it works via the API is the inheritance of the ReusableMapping class: it inherits the same component methods from the interface oracle.odi.domain.mapping.IMapComponentOwner, which in turn exposes the same methods and functions, such as convertToFlow, as we had available in the Mapping class. I’m not quite sure why ODI Studio doesn’t expose “Convert to Flow” for Reusable Mappings, but I’m sure it’s a simple fix we’ll see in an ODI 12c release down the road.

So there you have it, another Data Integration Tip from Rittman Mead – this time, a little help post-migration from ODI 11g to ODI 12c. If you would like more details on how Rittman Mead can help your migration of Oracle Data Integrator to the latest version, send us a note at info@rittmanmead.com. We’d love to help!

 

Categories: BI & Warehousing

OAUX Emerging Technologies RD&D Strategy

Oracle AppsLab - Thu, 2015-04-16 07:46

Speaking of strategies, Misha (@mishavaughan) asked me to write up an article–not a post, there’s a difference–describing how this team goes about its business, i.e. researching, designing and developing solutions for the emerging technologies that will affect our users in the near and not-so-near future.


You can, and should, read the resulting article over at the mothership, Usableapps (@usableapps). Check it out:

New emphasis on emerging technology shapes Oracle’s user experience strategy

Floyd (@fteter) read it, and so should you because why not?


Surprise, there’s method to the madness. It may look like we just play with toys, and while that’s partially true, we’ve always played with purpose.

Thinking back on the eight years I’ve been doing this, I don’t recall ever outlining and presenting a strategy at this level, and the whole exercise of putting the strategy I have in my head into words and slides was enlightening.

Point of fact, we’ve always had a strategy, and it hasn’t changed much, although the technologies we investigate have.

Serious h/t to Paul (@ppedrazzi) in the early years, and Jeremy (@jrwashley) more recently, for shaping, advancing, and fostering the AppsLab vision.

Anyway, now you know where we invest our time and why, or if you knew that already, you now have a handy article to refer to, should you need a refresher or should you be enlightening someone new to the party.

Enjoy.

Interesting Comment on Pearson’s LMS Plans From Customer

Michael Feldstein - Thu, 2015-04-16 06:20

By Phil Hill

On April 1, long-time eCollege (aka Pearson’s LearningStudio) customer Texas Christian University (TCU) gave an update on their LMS selection process to the student newspaper TCU360. In this article there was an interesting statement[1] worth exploring [emphasis added].

“eCollege” will soon be a thing of the past.

TCU has narrowed its search for a Learning Management System to two platforms, Blackboard and Desire2Learn (D2L).

“We’ve had feedback, from faculty specifically, that it’s time for change,” Assistant Provost of Educational Technology and Faculty Development Romy Hughes said.

TCU has used Pearson’s Learning Studio system since 1999.

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

That last comment got my attention. The eCollege / LearningStudio platform has been around for a long time, and there have been questions about where Pearson was going in the LMS market based on 2011’s introduction of OpenClass. Would OpenClass replace LearningStudio over time, and would it strongly change the LMS market? Would both OpenClass and LearningStudio continue as standalone LMS products? It is quite clear by now that OpenClass itself has not changed the market, but LearningStudio has a long-time customer base of fully online programs – many in the for-profit sector.

Furthermore, with Pearson’s reorganization around efficacy, their core document states (p. 14):

The overarching idea was that our investments should be driven towards those products which deliver the highest impact for learners while sustaining us financially so we can continue to invest in new models and improvements.

There is a question of whether Pearson’s internal reviews around LearningStudio and OpenClass are leading to strategic changes around their position in the LMS market.

I asked for Pearson to provide official comment, and David Daniels, president of Pearson Education, responded with the following clarification.

Pearson has not left the LMS space and will continue to invest in our current generation MyLabs and support our many customers on LearningStudio into the future. Pearson’s Learning Studio still powers over 3 million enrollments annually in the fully remote, online learning space. Our commitment to servicing these students and their institutions is unwavering. Our focus has been and will be on how we support these students within the learning environment. Our range of support services includes learning design and assessment support, integration, data and analytics, student retention, tutoring, and technical support.

This statement is quite clear that there is no imminent end-of-life for LearningStudio, and it is also quite clear about their focus on the “fully remote, online learning space”. This system is primarily used by fully online programs, but there have been a handful of campus-wide clients such as TCU still using the system from the early days. That Pearson LearningStudio would not be appropriate for TCU’s future is partially explained by this focus on fully online delivery.

The statement does make an interesting distinction, however, between investing in MyLabs and supporting LearningStudio. My read is that Pearson is not investing in LearningStudio in terms of major product advances and next generation plans but is continuing to fully support current customers. My read is also that Pearson would add new customers to LearningStudio if part of a broader deal tied to content or online “enabling” services (such as Embanet), but that there is no plan for the company to compete in pure LMS competitions.

To help back up this reading, I discovered that the TCU360 article was updated as follows:

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

Hughes said Pearson withdrew from the LMS search process for TCU but remains an LMS provider.

At TCU, at least, the competition is down to Blackboard and D2L, with D2L in the driver’s seat. This competition is also notable for Canvas not being one of the finalists (a situation I haven’t seen lately).

One final note on TCU’s selection process described in the article.

These percentages were based on a 214-item questionnaire called the Review Request for Information (RFI) document. These questions were used to assess whether or not a system had the features that TCU was looking for.

“Most LMS vendors told us it took them exactly three months to complete [the questionnaire] because there were so many specific details we were looking for,” Hughes said.

I’ve said it before and I’ll say it again – making a strategic platform selection by a laundry list of hundreds of detailed feature requirements is not a healthy process. I would not brag that it took vendors three full months to complete a questionnaire. But we have one more example to clarify Michael’s classic “Dammit, the LMS” post.

Do you want to know why the LMS has barely evolved at all over the last twenty years and will probably barely evolve at all over the next twenty years? It’s not because the terrible, horrible, no-good LMS vendors are trying to suck the blood out of the poor universities. It’s not because the terrible, horrible, no-good university administrators are trying to build a panopticon in which they can oppress the faculty. The reason that we get more of the same year after year is that, year after year, when faculty are given an opportunity to ask for what they want, they ask for more of the same.

I’d be willing to bet that the vast majority of those 214 items in the RFI are detailed features or direct derivatives of what TCU already has. Even if I’m wrong, it makes little sense for a school to specify the future with detailed requirements; they’re selecting a vendor, not specifying a new design. I wish TCU the best in their LMS selection process, but I would recommend that they put more emphasis on strategic analysis and less on counting check-boxes.

  1. Statement from the original article before it was updated.

The post Interesting Comment on Pearson’s LMS Plans From Customer appeared first on e-Literate.

Case Study: Oracle Forms Migration to Formspider

Gerger Consulting - Thu, 2015-04-16 01:55
TEAM GmbH, a Formspider customer from Germany, has successfully migrated their product ProStore from Oracle Forms to Formspider. We recently did a joint webinar with TEAM GmbH about their project. In particular we talked about the following topics:

- What other products did TEAM look at?

- Why did TEAM choose Formspider?

- What are the benefits of using Formspider?

- What is it like to work with the company behind Formspider?

- What was TEAM’s approach to Forms Migration?

- A demo of the application

The webinar was quite popular with over 50 attendees. TEAM Division Manager Frank Zscherlich and Product Manager Michael Wibberg answered many questions from the audience.

Below is the recording of the webinar. If you are looking for a way to migrate your Oracle Forms applications to a newer technology, I highly recommend watching it.


Oracle Forms Migration to Formspider from Yalim K. Gerger on Vimeo.

You can also watch a short demo of the migrated application below.


Demo of an Oracle Forms Application Migrated to Formspider from Yalim K. Gerger on Vimeo.
Categories: Development

C15LV: Nothing to BLOG About - Think Again

Yann Neuhaus - Wed, 2015-04-15 19:58

Before going to the airport to fly back to Europe, the last session I attended at IOUG Collaborate was a panel: Nothing to BLOG About - Think Again.

Some famous bloggers were there to help, motivate, and inspire people who are afraid to blog. The idea is good. The session was good. The panelists shared ideas, tips, and experience. It was very good.

I'm already a blogger, but it inspired me anyway. In the same spirit, I'll explain how I started blogging, just in case it gives someone ideas.

Note that tomorrow I speak about participating in the community (read, share, write, talk, blog, tweet, link,...) at our internal event dbi xChange. Thus, waiting at the airport is a good occasion to explain how I started blogging, in case it gives some ideas to people who want to start.

You Are Trying To Access a Page That Is No Longer Active.The Referring Page May Have Come From a Previous Session. Please Select Home To Proceed

Vikram Das - Wed, 2015-04-15 16:06
Shahed pinged me about this error. It appeared right after logging in. This R12.1.3 instance had just been migrated from an old server to a new one. Once you logged in, this error would be displayed:

You Are Trying To Access a Page That Is No Longer Active.The Referring Page May Have Come From a Previous Session. Please Select Home To Proceed

The hits on support.oracle.com were not helpful, but one gave a clue that it might have something to do with the session cookie. So I used Firefox to check the HTTP headers. If you press Ctrl+Shift+K, you get a panel at the bottom of the browser. Click on the Network tab, click on AppsLocalLogin.jsp, and on the right side of the pane you'll see a cookie tab.

The domain appearing in the cookie tab was from the old server.  So I checked:

select session_cookie_domain from icx_parameters;
olddomain.justanexample.com

So I nullified it:

update icx_parameters set session_cookie_domain=null;

commit;

Restarted Apache

cd $ADMIN_SCRIPTS_HOME
adapcctl.sh stop
adapcctl.sh start

No more error.  I was able to log in and so was Shahed.
Categories: APPS Blogs

Mobile - Get Proactive!

Joshua Solomin - Wed, 2015-04-15 14:39
Get Proactive!

SQL Monitor report little improvement in 12c

Yann Neuhaus - Wed, 2015-04-15 14:35

This is a very short post about something I thought I had already blogged about.

I like SQL Monitoring active reports. They are graphical, and still very light. There is only one thing I don't like about them in 11g, and it has been improved in 12c.

Oracle APEX 5.0 released today

Dimitri Gielis - Wed, 2015-04-15 14:34
After 2.5 years of development, today is the day APEX 5.0 is publicly released and ready to be downloaded to install on your own environment.

In my view it's the best release ever. Not so much because of the new Page Designer - although that is definitely a piece of art and increases productivity even further - but because it's the first time the whole of APEX got refreshed and every piece was put under the microscope to see how it could be improved. All the small changes and the new UI, together with the Page Designer, make it a whole new development tool, without losing its strengths from before.

Also note that APEX 5.0 enables many new features built on top of the Oracle Database 12c features, so if you're on that database, you'll see even more nice features.

If you're wondering whether to wait to upgrade to APEX 5.0 because you're afraid your current APEX applications will break, I can only share that I upgraded many of my applications as part of the EA/beta and most of my apps kept running without issues. As always, you have to test your applications yourself, but the APEX development team spent a lot of time trying to keep things backwards compatible. Make sure to have a look at the APEX 5.0 release notes and known issues, as they contain important information about changes, expected behaviour, and workarounds.

You can develop online on apex.oracle.com or you can download APEX 5.0 and install into your own environment.
Categories: Development

Oracle Application Express 5 - The Unofficial Announcement

Joel Kallman - Wed, 2015-04-15 13:35
What started on a whiteboard in New York City more than 2 years ago is now finally realized.  I and the other members of the Oracle Application Express team proudly announce the release of Oracle Application Express 5.

The official blog posting and announcement is HERE.  But this is my personal blog, and the thoughts and words are my own, so I can be a bit more free.

Firstly, I don't ever want to see a release of Oracle Application Express take 2.5 years again, ever.  It's not good for Oracle, not good for Oracle Application Express, and certainly not good for the vast Oracle Application Express community.  We're going to strive, going forward, for a cadence of annual release cycles.  But with this said, I'm not about to apologize for the duration of the APEX 5 release cycle either.  It's broader and more ambitious than anything we've ever approached, and it happened the way it was supposed to happen.  Rather than say "redesigned", I'd prefer to use Shakeeb's words of "reimagined", because that's really what has transpired.  Not only has every one of the 1,945 pages that make up "internal APEX" (like the Application Builder) been visited, redesigned, and modernized, but the Page Designer is a radically different yet productive way to build and maintain your applications.  It takes time to iterate to this high level of quality.

At the end of the day, what matters most for developers is what they can produce with Oracle Application Express.  They'd gladly suffer through the non-Page Designer world and click the mouse all day, as long as what they produced and delivered made them a hero.  And I believe we have delivered on this goal of focusing on high-quality results in the applications you create.  I've seen my share of bad-looking APEX applications over the years, and with prior releases of APEX, we've essentially enabled the creation of these rather poor examples of APEX.  Not everyone is a Shakeeb or Marc.  I'm not.  But we've harnessed the talents of some of the brightest minds in the UI world, who also happen to be on the APEX development team, and delivered a framework that makes it easy for ordinary people like me to deliver beautiful, responsive and accessible applications, out-of-the-box.

What I'm most happy about is what this does for the Oracle Database.  I believe APEX 5 will make superheroes out of our Oracle Database and Oracle Database Cloud customers.  There is a massive wealth of functionality for application developers and data architects and citizen developers and everyone in-between, in the Oracle Database.  And all of it is a simple SQL or PL/SQL call away!  The Oracle Database is extraordinarily good at managing large amounts of data and helping people turn data into information.  And now, for customers to be able to easily create elegant UI and be able to beautifully visualize this information using Oracle Application Express 5, well...it's just an awesome combination.

I am blessed to work with some of the brightest, most focused, professional, talented, and yet humble people on the planet.  As my wife likes to say, they're all "quality people".  It truly takes an array of people who are deep in very different technologies to pull this off - Oracle Database design, data modeling, PL/SQL programming, database security, performance tuning, JavaScript programming, accessibility, Web security, HTML 5 design, CSS layout, graphic artistry, globalization, integration, documentation, testing, and on and on.  Both the breadth and depth of the talent to pull this off is staggering.

You might think that we get to take a breath now.  In fact, the fun only begins now and plenty of hard work is ahead for all of us.  But we look forward to the great successes of our many Oracle customers.  The #orclapex community is unrivaled.  And we are committed to making heroes out of every one of them.  That's the least we could do for the #orclapex community, such an amazingly passionate and vibrant collection of professionals and enthusiasts.

When anyone asks about the "watershed event" for Oracle Application Express, you can tell them that the day was April 15, 2015 - when Oracle Application Express 5 was released.

Joel

P.S.  #letswreckthistogether

Chrome and E-Business Suite

Vikram Das - Wed, 2015-04-15 12:23
Dhananjay came to me today.  He said that his users were complaining about forms not launching after upgrading to the latest version of Chrome. On launching forms they got this error:

/dev60cgi/oracle forms engine Main was not found on this server

I recalled that the Google Chrome team had announced that they would not support Java going forward. Googling the keywords "chrome java" brought up this page:

https://java.com/en/download/faq/chrome.xml#npapichrome

It states that:

NPAPI support by Chrome

The Java plug-in for web browsers relies on the cross-platform plugin architecture NPAPI, which has long been, and currently is, supported by all major web browsers. Google announced in September 2013 plans to remove NPAPI support from Chrome by "the end of 2014", thus effectively dropping support for Silverlight, Java, Facebook Video and other similar NPAPI based plugins. Recently, Google has revised their plans and now state that they plan to completely remove NPAPI by late 2015. As it is unclear if these dates will be further extended or not, we strongly recommend Java users consider alternatives to Chrome as soon as possible. Instead, we recommend Firefox, Internet Explorer and Safari as longer-term options. As of April 2015, starting with Chrome Version 42, Google has added an additional step to configuring NPAPI based plugins like Java to run; see the section "Enabling NPAPI in Chrome Version 42 and later" below.

Enabling NPAPI in Chrome Version 42 and later

As of Chrome Version 42, an additional configuration step is required to continue using NPAPI plugins.
  1. In your URL bar, enter:
    chrome://flags/#enable-npapi 
  2. Click the Enable link for the Enable NPAPI configuration option.
  3. Click the Relaunch button that now appears at the bottom of the configuration page.
Developers and System administrators looking for alternative ways to support users of Chrome should see this blog, in particular the "Running Web Start applications outside of a browser" and "Additional Deployment Options" sections.

Once Dhananjay did the above steps, Chrome started launching forms again. He quickly gave these steps to all his users who had upgraded to the latest version of Chrome (version 42) and it started working for them too.

Oracle doesn't certify E-Business Suite forms on Chrome. Only the self-service pages of E-Business Suite are certified on Google Chrome.
Categories: APPS Blogs

Oracle Application Express 5.0 now available

Marc Sewtz - Wed, 2015-04-15 11:44
The result of a two and half year engineering effort, Oracle Application Express 5.0 represents the greatest advancement of Oracle Application Express in its 10-year history.  Oracle Application Express 5.0 enables customers to develop, design and deploy beautiful, responsive, database-driven desktop and mobile applications using only a browser.  

Download APEX 5.0 Today!
Now that APEX 5.0 is finally available, it’s time to spread the word and to get everyone up to speed on what’s new in 5.0, how to make best use of the beautiful, new Universal Theme and how to get the most out of the incredibly powerful new Page Designer. 

So here are some of the events coming up in the near future, where you can learn all about APEX 5.0 and meet many of the APEX development team members in person:

APEXposed 2015
Montreal, Canada - May 6, 2015
  • Keynote: Oracle Application Express 5.0 (Marc Sewtz - 8:30 - 9:30am)
  • Oracle Application Express 5.0 Plug-In Enhancements (Patrick Maniraho - 1:00 - 2:00pm)
  • Transitioning to Oracle Application Express 5.0 (Marc Sewtz - 2:15 - 3:15pm)
UKOUG Application Express SIG
Solihull, UK - May 14, 2015
  • Oracle Application Express 5.0 (Anthony Rayner - 2:25-3:35pm)
APEX Connect 2015
Düsseldorf, Germany - June 9 - 10, 2015
  • Keynote: Oracle Application Express 5.0 (Marc Sewtz - Jun-09 - 9:30 - 10:30am)
  • Beautiful UI with APEX 5 (Shakeeb Rahman - Jun-09 11:00 - 11:45am)
  • Der Oracle Application Express Entwicklungsprozess (Marc Sewtz - Jun-10 - 11:00 - 11:45am)
BGOUG
Plovdiv, Bulgaria - June 12 - 14, 2015
  • Oracle Application Express 5.0 New Features (Marc Sewtz)
  • Transitioning to Oracle Application Express 5.0 (Marc Sewtz)
ODTUG KScope 2015
Hollywood, FL, USA - June 21 - 25, 2015
June 21
  • APEX Episode 5:  A New Frontier (Joel Kallman, 8:30 - 9:30am)
  • Need for Speed: Page Designer (Patrick Wolf, 9:30 - 10:30am)
  • Interstellar: The Universal Theme (Shakeeb Rahman, 11:00am - 12:00pm)
  • The Fifth Element: HTML5 + APEX5 Mobile (Marc Sewtz, 1:00 - 2:00pm)
  • The Matrix Reloaded: Interactive Report (Anthony Rayner, 2:00 - 3:00pm)
  • The Prestige: Converting to APEX 5.0 Universal Theme (David Peake, 3:15 – 4:15pm)
June 22
  • Transitioning to Oracle Application Express 5.0 (David Peake, 08:30 - 09:30am)
  • Introduction to Oracle Application Express (David Peake, 09:45 -10:45am)
  • You Don't Lack APEX Skills; You Lack Oracle Skills (Joel Kallman, 2:00 - 3:00pm)
June 23
  • Self-Service Application Deployment in a Runtime Instance: This Is How We (Oracle) Do It... (Jason Straub, 4:45 - 5:45pm)
June 24
  • Oracle Application Express 5.0 New Features (Hilary Farrell, 9:45 - 10:45am)
  • Application Express 5.0: Features Nobody Else Will Tell You About (David Peake, 1:45 - 2:45pm)

... and be sure to join one of the numerous APEX Meetup groups worldwide - and if there isn't one close to where you live, then consider starting your own local APEX Meetup group. Don't know how to do that? Check out the APEX Meetup site for helpful information:
http://apexmeetups.com/

Oracle APEX 5.0 Available for Download!

Patrick Wolf - Wed, 2015-04-15 11:43
After a longer development cycle than usual, Oracle Application Express 5.0 is finally available for download! I think it was worth the wait. It comes with a ton of features; here are just a few of the marquee ones: Page Designer, Universal … Continue reading →
Categories: Development

Cartesian join

Jonathan Lewis - Wed, 2015-04-15 11:40

Some time ago I pulled off the apocryphal “from 2 hours to 10 seconds” trick for a client using a technique that is conceptually very simple but, like my example from last week, falls outside the pattern of generic SQL. The problem (with some camouflage) is as follows: we have a data set with 8 “type” attributes which are all mandatory columns. We have a “types” table with the same 8 columns together with two more columns that are used to translate a combination of attributes into a specific category and “level of relevance”. The “type” columns in the types table are, however, allowed to be null although each row must have at least one column that is not null – i.e. there is no row where every “type” column is null.

The task is to match each row in the big data set with all “sufficiently similar” rows in the types table and then pick the most appropriate of the matches – i.e. the match with the largest “level of relevance”. The data table had 500,000 rows in it, the types table has 900 rows. Here’s a very small data set representing the problem client data (cut down from 8 type columns to just 4 type columns):


create table big_table(
	id		number(10,0)	primary key,
	v1		varchar2(30),
	att1		number(6,0),
	att2		number(6,0),
	att3		number(6,0),
	att4		number(6,0),
	padding		varchar2(4000)
);

create table types(
	att1		number(6,0),
	att2		number(6,0),
	att3		number(6,0),
	att4		number(6,0),
	category	varchar2(12)	not null,
	relevance	number(4,0)	not null
);

insert into big_table values(1, 'asdfllkj', 1, 1, 2, 1, rpad('x',4000));
insert into big_table values(2, 'rirweute', 1, 3, 1, 4, rpad('x',4000));

insert into types values(   1, null, null, null, 'XX',  10);
insert into types values(   1, null, null,    1, 'YY',  20);
insert into types values(   1, null,    1, null, 'ZZ',  20);

commit;

A row from the types table is similar to a source row if it matches on all the non-null columns. So if we look at the first row in big_table, it matches the first row in types because att1 = 1 and all the other attN columns are null; it matches the second row because att1 = 1 and att4 = 1 and the other attN columns are null, but it doesn’t match the third row because types.att3 = 1 and big_table.att3 = 2.

Similarly, if we look at the second row in big_table, it matches the first row in types, doesn’t match the second row because types.att4 = 1 and big_table.att4 = 4, but does match the third row. Here’s how we can express the matching requirement in SQL:


select
	bt.id, bt.v1,
	ty.category,
	ty.relevance
from
	big_table	bt,
	types		ty
where
	nvl(ty.att1(+), bt.att1) = bt.att1
and	nvl(ty.att2(+), bt.att2) = bt.att2
and	nvl(ty.att3(+), bt.att3) = bt.att3
and	nvl(ty.att4(+), bt.att4) = bt.att4
;
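The NVL trick above can be sanity-checked outside the database. Here is a small Python sketch (my own illustration, not from the original post) that applies the same "match on all non-null type columns" rule to the sample rows created above:

```python
# Toy reproduction of the sample data from the CREATE/INSERT statements.
big_table = [
    (1, 'asdfllkj', (1, 1, 2, 1)),
    (2, 'rirweute', (1, 3, 1, 4)),
]
types = [
    ((1, None, None, None), 'XX', 10),
    ((1, None, None, 1),    'YY', 20),
    ((1, None, 1, None),    'ZZ', 20),
]

def matches(type_atts, row_atts):
    # A NULL (None) type attribute acts as a wildcard, mirroring
    # the SQL predicate nvl(ty.attN(+), bt.attN) = bt.attN.
    return all(t is None or t == r for t, r in zip(type_atts, row_atts))

# Cartesian comparison of every data row against every types row.
for rid, v1, atts in big_table:
    for type_atts, category, relevance in types:
        if matches(type_atts, atts):
            print(rid, v1, category, relevance)
```

Running it lists the same pairs the outer-joined SQL produces: row 1 matches XX and YY, row 2 matches XX and ZZ.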

You’ll realise, of course, that essentially we have to do a Cartesian merge join between the two tables. Since there’s no guaranteed matching column that we could use to join the two tables we have to look at every row in types for every row in big_table … and we have 500,000 rows in big_table and 900 in types, leading to an intermediate workload of 450,000,000 rows (with, in the client case, 8 checks for each of those rows). Runtime for the client was about 2 hours, at 100% CPU.

When you have to do a Cartesian merge join there doesn’t seem to be much scope for reducing the workload, however I didn’t actually know what the data really looked like so I ran a couple of queries to analyse it. The first was a simple “select count (distinct)” query to see how many different combinations of the 8 attributes existed in the client’s data set. It turned out to be slightly less than 400.

Problem solved – get a list of the distinct combinations, join that to the types table to translate to categories, then join the intermediate result set back to the original table. This, of course, is just applying two principles that I’ve discussed before: (a) be selective about using a table twice to reduce the workload, (b) aggregate early if you can reduce the scale of the problem.
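The two principles can be sketched in a few lines of Python (an illustrative toy of mine, not part of the original solution): resolve each distinct attribute combination once, then fan the results back out to the full data set.

```python
# "Aggregate early": resolve each DISTINCT attribute combination once,
# then map the result back to every row (the cheap hash join).
big_table = [
    (1, (1, 1, 2, 1)),
    (2, (1, 3, 1, 4)),
    (3, (1, 1, 2, 1)),   # same combination as row 1
]
types = [
    ((1, None, None, None), 'XX', 10),
    ((1, None, None, 1),    'YY', 20),
    ((1, None, 1, None),    'ZZ', 20),
]

def resolve(combo):
    # Match on all non-null type columns; keep the highest relevance.
    hits = [(rel, cat) for atts, cat, rel in types
            if all(t is None or t == c for t, c in zip(atts, combo))]
    return max(hits)[1] if hits else None

# Phase 1: one expensive lookup per distinct combination
# (~400 in the client case, instead of 500,000).
lookup = {combo: resolve(combo) for combo in {atts for _, atts in big_table}}

# Phase 2: cheap join back to every row of the big table.
result = {rid: lookup[atts] for rid, atts in big_table}
print(result)
```

The expensive Cartesian comparison now runs once per distinct combination rather than once per row, which is exactly why the client's runtime collapsed.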

Here’s my solution:


with main_data as (
	select
		/*+ materialize */
		id, v1, att1, att2, att3, att4
	from
		big_table
),
distinct_data as (
	select
		/*+ materialize */
		distinct att1, att2, att3, att4
	from	main_data
)
select
	md.id, md.v1, ty.category, ty.relevance
from
	distinct_data	dd,
	types		ty,
	main_data	md
where
	nvl(ty.att1(+), dd.att1) = dd.att1
and	nvl(ty.att2(+), dd.att2) = dd.att2
and	nvl(ty.att3(+), dd.att3) = dd.att3
and	nvl(ty.att4(+), dd.att4) = dd.att4
and	md.att1 = dd.att1
and	md.att2 = dd.att2
and	md.att3 = dd.att3
and	md.att4 = dd.att4
;

And here’s the execution plan.


---------------------------------------------------------------------------------------------------------
| Id  | Operation                  | Name                       | Rows  | Bytes | Cost (%CPU)| Time     |
---------------------------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT           |                            |    12 |  2484 |    11  (10)| 00:00:01 |
|   1 |  TEMP TABLE TRANSFORMATION |                            |       |       |            |          |
|   2 |   LOAD AS SELECT           | SYS_TEMP_0FD9D6619_8FE93F1 |       |       |            |          |
|   3 |    TABLE ACCESS FULL       | BIG_TABLE                  |     2 |   164 |     2   (0)| 00:00:01 |
|   4 |   LOAD AS SELECT           | SYS_TEMP_0FD9D661A_8FE93F1 |       |       |            |          |
|   5 |    HASH UNIQUE             |                            |     2 |   104 |     3  (34)| 00:00:01 |
|   6 |     VIEW                   |                            |     2 |   104 |     2   (0)| 00:00:01 |
|   7 |      TABLE ACCESS FULL     | SYS_TEMP_0FD9D6619_8FE93F1 |     2 |   164 |     2   (0)| 00:00:01 |
|*  8 |   HASH JOIN                |                            |    12 |  2484 |     6   (0)| 00:00:01 |
|   9 |    NESTED LOOPS OUTER      |                            |     6 |   750 |     4   (0)| 00:00:01 |
|  10 |     VIEW                   |                            |     2 |   104 |     2   (0)| 00:00:01 |
|  11 |      TABLE ACCESS FULL     | SYS_TEMP_0FD9D661A_8FE93F1 |     2 |   104 |     2   (0)| 00:00:01 |
|* 12 |     TABLE ACCESS FULL      | TYPES                      |     3 |   219 |     1   (0)| 00:00:01 |
|  13 |    VIEW                    |                            |     2 |   164 |     2   (0)| 00:00:01 |
|  14 |     TABLE ACCESS FULL      | SYS_TEMP_0FD9D6619_8FE93F1 |     2 |   164 |     2   (0)| 00:00:01 |
---------------------------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------
   8 - access("MD"."ATT1"="DD"."ATT1" AND "MD"."ATT2"="DD"."ATT2" AND
              "MD"."ATT3"="DD"."ATT3" AND "MD"."ATT4"="DD"."ATT4")
  12 - filter("DD"."ATT1"=NVL("TY"."ATT1"(+),"DD"."ATT1") AND
              "DD"."ATT2"=NVL("TY"."ATT2"(+),"DD"."ATT2") AND
              "DD"."ATT3"=NVL("TY"."ATT3"(+),"DD"."ATT3") AND
              "DD"."ATT4"=NVL("TY"."ATT4"(+),"DD"."ATT4"))

Critically I’ve taken a Cartesian join that had a source of 500,000 and a target of 900 possible matches, and reduced it to a join between the 400 distinct combinations and the 900 possible matches. Clearly we can expect this to take something like one twelve-hundred-and-fiftieth (400/500,000) of the work of the original join – bringing 7,200 seconds down to roughly 6 seconds. Once this step is complete we have an intermediate result set which is the 4 non-null type columns combined with the matching category and relevance columns – and can use this in a simple and efficient hash join with the original data set.

Logic dictated that the old and new results would be the same – but we did run the two hour query to check that the results matched.

Footnote: I was a little surprised that the optimizer produced a nested loops outer join rather than a Cartesian merge in the plan above – but that’s probably an artefact of the very small data sizes in my test. There’s presumably little point in transferring the data into the PGA when the volume is so small.

Footnote 2: I haven’t included the extra steps in the SQL to reduce the intermediate result to just “the most relevant” – but that’s just an inline view with an analytic function. (The original code actually selected the data with an order by clause and used a client-side filter to eliminate the excess!).
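That inline view might look something like the following sketch (names assumed; matched_data stands for the result set of the join shown earlier):

```sql
-- Keep only the most relevant match per source row
select id, v1, category, relevance
from (
        select
                id, v1, category, relevance,
                row_number() over (
                        partition by id
                        order by     relevance desc
                ) rn
        from    matched_data
)
where   rn = 1;
```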

Footnote 3: The application was a multi-company application – and one of the other companies had not yet gone live on the system because they had a data set of 5 million rows to process and this query had never managed to run to completion in the available time window.  I’ll have to get back to the client some day and see if the larger data set also collapsed to a very small number of distinct combinations and how long the rewrite took with that data set.

 


APEX 5.0 will be released today

Denes Kubicek - Wed, 2015-04-15 11:14
Great news. Just finished watching the Google hangout with the APEX team. They confirmed that APEX 5.0 will be released today. The download should be made available soon. Stay tuned.

Categories: Development

Faster Download of PeopleSoft Images

Duncan Davies - Wed, 2015-04-15 10:32

With the advent of Selective Adoption, many more people will be downloading the huge PeopleSoft Images every 10 weeks or so. They’re large (circa 35GB) and that’s going to take a while even with a decent connection.

What makes matters worse is that the default method (clicking the Download All button) runs the downloads in serial. Even on a 1MB/sec connection that’s going to take ~10 hours to download all 35GB.

Download All

In addition, the download seems to be throttled somewhat, though I’m not sure why. The speed reported in the above window varied wildly between 100KB/s and 500KB/s. Even at the top end of that range, downloading only one file at a time, the complete set is going to take almost 24 hours.

An alternative is to run the downloads in parallel. Instead of clicking Download All, click WGET Options and download the script instead. After a little modification so that it ran on Windows, I was able to run 4 copies of the script side-by-side, giving gains not only from downloading in parallel, but also because each individual download ran faster:

Download All parallel wget

You can click for a bigger version, but basically the screenshot is of four downloads with a combined download speed of over 4MB/s! All downloads completed in a touch over 2 hours (and this is on a home broadband connection).
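On Linux or a Mac the same idea can be scripted directly. The sketch below is a hypothetical illustration, not Oracle’s generated script: it assumes you have saved the download URLs from the wget script into a plain text file, one URL per line, and the function name is my own invention.

```shell
# Hypothetical sketch: download a list of URLs in N parallel streams.
# Assumes the URLs from the generated wget script have been saved to a
# plain text file, one URL per line.
parallel_download() {
  urls=$1; jobs=$2; shift 2          # remaining args: the download command
  tmpdir=$(mktemp -d)
  # GNU split: divide the URL list into N chunks without breaking lines
  split -n l/"$jobs" "$urls" "$tmpdir/chunk_"
  for f in "$tmpdir"/chunk_*; do
    ( while IFS= read -r url; do
        "$@" "$url"                  # run the download command per URL
      done < "$f" ) &                # each chunk runs in the background
  done
  wait                               # block until every stream finishes
  rm -r "$tmpdir"
}

# Typical use, resuming partial files if the script is re-run:
#   parallel_download url_list.txt 4 wget --continue --no-verbose
```

The `--continue` flag matters for files this size: an interrupted 8GB download resumes rather than restarting.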

 


Webcast: Adaptive Case Management as a Service

WebCenter Team - Wed, 2015-04-15 09:50
Adaptive Case Management as a Service

Oftentimes, organizations find that they have to change the way they manage cases to conform to a new system, rather than the system being open and flexible to accommodate their specific needs. Case Managers do not want to be obligated to follow a one-size-fits-all process for case management. They require open, flexible capabilities to handle their cases that can adapt to fit their needs. Join Sofbang and Oracle to learn more about how Adaptive Case Management as a Service (CMaaS) can provide caseworkers and their clients with an adaptable, flexible, configurable platform based way of managing cases with robust, yet easy-to-use, mobile capabilities.

In this webcast you will:
  • Learn what Adaptive Case Management as a Service (CMaaS) is
  • Understand how to reduce the rigidity of a typical system and streamline the approach to case management
  • Discover mobile, UI friendly approaches to managing cases
  • Find out the Sofbang and Oracle Approach to Adaptive CMaaS
  • See a live demo of CMaaS in action
Duration: 45 minutes for Presentation + 15 minutes for Q&A

About Sofbang

Founded in 2000, Sofbang is an Oracle Platinum Partner specialized in providing Oracle Fusion Middleware, Mobile & Oracle Cloud solutions to clients in the Government, Education and Utilities sectors, as well as the mid-market commercial space. Sofbang provides customers with dynamic business process extensions, enterprise mobility and cloud solutions which extend, integrate and simplify enterprise business applications across an organization. We design solutions with Scalability, Flexibility and Extendibility in mind. We call this concept Designed for Change. Our solutions help organizations reduce costs, increase revenue, enhance end-user experience, promote transparency and improve productivity. Our company was founded with the passion that comes from seeing clients achieve strategic success.

Sofbang has received awards and recognitions for developing innovative solutions and delivering outstanding value, including being recognized by CIO Review as one of the 20 Most Promising Campus Technology Providers of 2014 and the winner of the BPM.com and WfMC 2014 Global Award for Excellence in Business Process Management and Workflow for the Chicago Park District.

As an Oracle Platinum Partner, Sofbang is proud to have achieved the following specialization designations from the Oracle Partner Network, recognizing Sofbang’s continued focus in the Oracle Fusion Middleware stack for over a decade, beginning with BEA Systems Inc. Specializations are achieved through competency development, business results, expertise and proven success.

Oracle Service-Oriented Architecture
Oracle WebLogic Server 12c
Oracle Unified Business Process Management 11g
Oracle Enterprise Manager 12c
Oracle Application Grid

Sofbang is headquartered in Chicago, Illinois and is minority owned. To find out more visit: www.sofbang.com.


Register Now

Live Webcast: April 28, 2015
11:30 am CST | 12:30 pm EST

Featured Speakers:

Danny Asnani – Sofbang
Vivek Ahuja – Sofbang
Mitchell Palski – Oracle

Ellucian Buys Helix LMS, But Will It Matter?

Michael Feldstein - Wed, 2015-04-15 09:14

By Phil Hill

At this year’s Ellucian users’ conference #elive15, one of the two big stories has been that Ellucian acquired the Helix LMS, including taking on the development team. I have previously described the Helix LMS in “Helix: View of an LMS designed for competency-based education” as well as the subsequent offer for sale in “Helix Education puts their competency-based LMS up for sale”. The emerging market for CBE-based learning platforms is quickly growing, at least in terms of pilot programs and long-term potential, and Helix is one of the most full-featured, well-designed systems out there.

The Announcement

From the announcement:

Ellucian has acquired Helix Education’s competency-based education LMS and introduced a 2015 development partner program to collaborate with customers on the next-generation, cloud-only solution.

As the non-traditional student stands to make up a significant majority of learners by 2019, Ellucian is investing in technologies that align with priorities of colleges and universities it serves. CBE programs offer a promising new way for institutions to reduce the cost and time of obtaining a high-quality degree that aligns with the skills required by today’s employers.

I had been surprised at the announcement of intent-to-sell in December, noting:

The other side of the market effect will be determined by which company buys the Helix LMS. Will a financial buyer (e.g. private equity) choose to create a standalone CBE platform company? Will a traditional LMS company buy the Helix LMS to broaden their reach in the quickly-growing CBE space (350 programs in development in the US)? Or will an online service provider and partial competitor of Helix Education buy the LMS? It will be interesting to see which companies bid on this product line and who wins.

And I am surprised at the answer – a private equity owned ERP vendor. Throughout the mid 2000s there was talk about the ERP vendors like SunGard Higher Education (SGHE) (which combined with Datatel in 2011 and renamed as Ellucian in 2012) and Oracle entering the LMS market by acquisition, yet this did not materialize beyond the dreaded strategic partnership . . . until perhaps this week. But the Helix LMS was designed specifically for CBE programs, not general usage, so is this really a move into the broader LMS market?

When I interviewed Helix Education about the LMS last summer, they stated several times that the system could be used for non-CBE programs, but there is no evidence that this has actually occurred. I’ll admit that it is more likely to expand a CBE system into general usage than it is to convert a course-based traditional LMS into a CBE system, but it is not clear that the end result of such an expansion would remain a compelling product with user experience appreciated by faculty and students. The path is not risk-free.

Based on briefings yesterday at #elive15, there is evidence that:

  • Ellucian plans to expand the Helix LMS (which will be renamed) beyond CBE; and
  • Ellucian understands that there is development still remaining for this broader usage[1].

Ellucian LMS

Courtesy Ryan Schwiebert:

Support for broad set of delivery models: CBE, Online, Hybrid, Blended, Traditional, CE/WFD

One Challenge: Strategy

But there are already signs that Ellucian is not committed to delivering an LMS with “support for broad set of delivery models”. As described at Inside Higher Ed:

At its user conference in New Orleans, Ellucian announced the acquisition of Helix Education’s learning management system. The company will “blend” the software, which supports nontraditional methods of tracking student progress, into its student information system, said Mark Jones, chief product officer at Ellucian. While he stressed that the company is not planning to become a major learning management system provider, Ellucian will make the system available to departments interested in offering competency-based education.

“The initial goal and focus is on enabling competency-based education programs to flourish,” Jones said. “In terms of being a broader L.M.S. solution, if our customers find value… we will certainly have that conversation.”

I asked Jim Ritchey, president of Delta Initiative and who is attending the conference, for his reaction to Ellucian’s strategy. Jim noted the reaction at the conference to the news “seemed to be more of a curiosity than interest”, and then added:

To me, one of the key questions is how Ellucian will “blend” the software. Do they mean that schools will be able to post the results of the competency based courses to the SIS, or are they talking about leveraging other products within the LMS? For example, some of the capabilities of Pilot could be leveraged to deliver additional capabilities to the LMS. The concern I would have is that tying the LMS to other products will cause the LMS development to be dependent on the roadmaps of the other products. Ellucian will need to find the right level of independence for the LMS so it can grow as a solution while using other products to enhance capabilities. Will the LMS get lost?

In addition, there is the differing nature of the products to consider. The Helix LMS is centered on the learner and the learner’s schedule, while Banner, Colleague, and PowerCampus are centered on academic terms and courses. These differing design concepts could cause the blending process to remove some of the unique value of the LMS.

Another Challenge: Execution

On paper, this deal seems significant. The company with arguably the greatest number of US higher ed clients now owns an LMS that not only has a modern design but also is targeted at the new wave of CBE programs. The real question, however, is whether Ellucian can pull this off based on their own track record.

Since the 2011 acquisition of SGHE by the private equity firm Hellman & Friedman, Ellucian has endured wave after wave of layoffs and cost cutting measures. I described in 2011 how the SGHE acquisition could pay for itself.

If Hellman &amp; Friedman can achieve reasonable efficiencies by combining SGHE with Datatel, this investment could potentially justify itself in 5 – 7 years by focusing on cash flow operating income, even without SGHE finding a way to reverse its decline in revenue.

Add to this Ellucian’s poor track record of delivering on major product upgrades. The transition from Banner 8 to Banner 9, or later to Banner XE, was described in 2008, promised in 2010, re-promised in 2011, and updated in 2012 / 2013. Banner XE is actually a strategy and not a product. To a degree, this is more a statement of the administrative systems / ERP market in general than just on Ellucian, but the point is that this is a company in a slow-moving market. Workday’s entry into the higher education ERP market has shaken up the current vendors – primarily Ellucian and Oracle / Peoplesoft – and I suspect that many of Ellucian’s changes are in direct response to Workday’s new market power.

Ellucian has bought itself a very good LMS and a solid development team. But will Ellucian have the management discipline to finish the product development and integration that hits the sweet spot for at least some customers? Furthermore, will the Ellucian sales staff sell effectively into the academic systems market?

A related question is why Ellucian is trying to expand into this adjacent market. It seems that Ellucian is suffering from having too many products, and an LMS addition that requires a new round of development from the outset could be a distraction. As Ritchey described after the 2012 conference (paraphrasing what he heard from other attendees):

The approach makes sense, but the hard decisions have not been made. Supporting every product is easy to say and not easy to deliver. At some point in time, they will finalize the strategy and that is when we will begin to learn the future.

In The End . . .

The best argument I have read for this acquisition was provided by Education Dive.

Ellucian is already one of the largest providers of cloud-based software and this latest shift with Banner and Colleague will allow its higher education clients to do even more remotely. Enterprise resource planning systems help colleges and universities increase efficiency with technology. Ellucian touts its ERPs as solutions for automating admissions, creating a student portal for services as well as a faculty portal for grades and institutional information, simplifying records management, managing records, and tracking institutional metrics. The LMS acquisition is expected to take the data analytics piece even further, giving clients more information about students to aid in retention and other initiatives.

But these benefits will matter if and only if Ellucian can overcome its history and deliver focused product improvements. The signals I’m getting so far are that Ellucian has not figured out its strategy and has not demonstrated its ability to execute in this area. Color me watchful but skeptical.

  1. See the “development partner program” part of the announcement.

The post Ellucian Buys Helix LMS, But Will It Matter? appeared first on e-Literate.

Monitoring Page Load Time on ADF UI Client Side

Andrejus Baranovski - Wed, 2015-04-15 03:26
In certain situations it might be useful to monitor ADF page load time. This is pretty easy to achieve with the Navigation Timing API and JavaScript. The Navigation Timing API is supported by modern browsers and allows us to retrieve client-side load time. It takes into account data transfer time and actual rendering in the browser - the real time it took for a user to see the content.

We can use an ADF clientListener with type load to identify when the ADF UI has loaded. This listener should be added to the ADF UI document tag, and it will be invoked at the end of page rendering. Through the clientListener we can invoke our custom JavaScript method, where we calculate the page load time on the ADF UI client side:
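The wiring described above might look roughly like this in the JSF page (a hypothetical sketch; component ids and the JavaScript method name are assumptions, not taken from the sample application):

```xml
<af:document title="Main Page" id="d1">
  <!-- type="load" invokes the method once the document finishes rendering -->
  <af:clientListener method="onPageLoad" type="load"/>
  <af:form id="f1">
    <!-- page content -->
  </af:form>
</af:document>
```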


The most important thing here is to get the page load start time. This value is retrieved from the Navigation Timing API (mentioned above) as performance.timing.navigationStart. The rest is easy - we can compute the load time:
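A minimal sketch of that calculation (the function name and the surrounding listener are assumptions, not the sample’s actual code):

```javascript
// Hypothetical sketch: compute elapsed seconds from the browser's
// navigation start to a given "now" timestamp in milliseconds.
function pageLoadSeconds(timing, nowMs) {
  // timing is the browser's performance.timing object;
  // navigationStart marks the moment the page fetch began.
  return (nowMs - timing.navigationStart) / 1000;
}

// In the page, the af:clientListener would invoke something like:
//   function onPageLoad() {
//     var secs = pageLoadSeconds(performance.timing, Date.now());
//     console.log('ADF page rendered in ' + secs + ' s');
//   }
```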


This is how it looks at runtime. When I recompile the ADF application and redeploy it on the server, the first load is obviously slower. The ADF UI is rendered on the client side (starting from the page load request) in 10 seconds (see the top right corner):


The second access is much faster - the page loads on the client side in 1 second:


You can test it yourself, download sample application (redsam/welcome1) - ADFAltaApp_v2.zip.