
Feed aggregator

Monitoring PPR Request Time on ADF UI Client Side

Andrejus Baranovski - Fri, 2015-04-17 12:55
We can measure how long it takes to process a request on the server side, but it is equally important to measure how long a PPR request takes on the client side, mainly because this is a key factor in the application performance the end user actually experiences. There is a relatively easy way in JSF 2.0 to measure PPR request time on the client side - with a special ajax tag. ADF 11g R2 and ADF 12c are based on JSF 2.0, which means we can use this tag to measure request performance. Read my previous post on how to monitor page load time in the ADF UI client - Monitoring Page Load Time on ADF UI Client Side.

Here is an example of the ajax tag. It provides a special property called onevent, which points to a custom JavaScript method that the framework invokes when the PPR request starts and ends:
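The code screenshot is not reproduced in this feed, but a minimal sketch of such a tag could look like this (the monitor method name is illustrative):

<f:ajax execute="@this" render="@all" onevent="monitor"/>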


The ajax tag can be used with various ADF UI components that initiate requests. Below is an example of an ADF Faces button configured with onevent monitoring, pointing to a custom JavaScript monitor method:
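Again as a hedged sketch, the button configuration might look roughly like this (component id and label are illustrative):

<af:commandButton text="Team Hierarchy" id="b1">
    <f:ajax execute="@this" render="@all" onevent="monitor"/>
</af:commandButton>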


The JavaScript monitor method is invoked automatically when the request starts and when it succeeds. This means we can capture the start and end times and calculate the total time taken to process the PPR request, from click to rendered response:
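A minimal sketch of such a monitor method, using the status values JSF 2.0 passes to the onevent callback:

var startTime;
function monitor(event) {
    if (event.status === 'begin') {
        // PPR request is starting - remember the start time
        startTime = new Date().getTime();
    } else if (event.status === 'success') {
        // response processed and rendered - compute total PPR time
        console.log('PPR request time: ' + (new Date().getTime() - startTime) + ' ms');
    }
}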


I would like to emphasise the importance of this approach using the example of opening an ADF TF. A Task Flow is a server-side concept; at runtime it is all converted to HTML and rendered in the browser. When the user clicks a button to render ADF TF content, he waits until it is initialised on the server side, business logic is executed and, finally, the response is rendered. My example intentionally contains a Method Call activity with delay code, to demonstrate how PPR request time measurement works:


The ExecuteDelay method call invokes a Java method, where the thread is paused for 5 seconds:
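The delay method presumably looks roughly like this (the method name follows the Method Call activity above):

public void executeDelay() {
    try {
        Thread.sleep(5000); // pause for 5 seconds to simulate slow business logic
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}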


Let's see how it works at runtime. The home page contains a list of employees, with a team hierarchy link available for each employee. On user click, it loads an ADF TF with a Hierarchy viewer (the ADF TF explained above, with the thread delay):


The PPR request time starts when the user clicks the link and ends when the ADF TF UI fragment content is rendered. This gets close to 6 seconds (including the 5-second delay added by the TF method call). We can measure how long it really takes for the user to see the content, starting from the first click:


As soon as the PPR request completes, the Hierarchy viewer component renders the team structure:


Navigation back to the list is measured as well; it takes under 1 second:


PPR request times for the Save/Cancel/Back buttons in the edit screen are measured in the same way:


Download the sample application with an implementation of the described approach - ADFAltaApp_v3.zip.

Log Buffer #419: A Carnival of the Vanities for DBAs

Pythian Group - Fri, 2015-04-17 10:51

This Log Buffer Edition covers Oracle, MySQL, and SQL Server blog posts from around the world.

Oracle:

  • Why the Internet of Things should matter to you
  • Modifying Sales Behavior Using Oracle SPM – Written by Tyrice Johnson
  • SQLcl: Run a Query Over and Over, Refresh the Screen
  • Data Integration Tips: ODI 12.1.3 – Convert to Flow
  • JRE 1.8.0_45 Certified with Oracle E-Business Suite

SQL Server:

  • What’s this, a conditional WHERE clause that doesn’t use dynamic SQL?
  • The job of a DBA requires a fusion of skill and knowledge. To acquire this requires a craftsman mindset. Craftsmen find that the better they get at the work, the more enjoyable the work gets, and the more successful they become.
  • Using SQL to perform cluster analysis to gain insight into data with unknown groups
  • There are times when you don’t want to return a complete set of records. When you have this kind of requirement to select only the TOP X number of items, Transact SQL (TSQL) has the TOP clause to meet your needs.
  • Spatial Data in SQL Server has special indexing because it has to perform specialised functions.

MySQL:

  • Profiling MySQL queries from Performance Schema
  • How to Easily Identify Tables With Temporal Types in Old Format!
  • The Perfect Server – CentOS 7.1 with Apache2, Postfix, Dovecot, Pure-FTPD, BIND and ISPConfig 3
  • Database Security – How to fully SSL-encrypt MySQL Galera Cluster and ClusterControl
  • MDX: retrieving the entire hierarchy path with Ancestors()

Categories: DBA Blogs

JSON for APEX Developers (part 3): Querying JSON in Oracle DB 12c

Dimitri Gielis - Fri, 2015-04-17 09:34
In previous blog posts I talked about JSON for APEX Developers.
In this post I want to show how to store and query JSON data straight in the database.

To start, create a table to store the JSON object:

CREATE TABLE ssn_json
   (id          NUMBER GENERATED ALWAYS AS IDENTITY NOT NULL,
    date_loaded DATE,
    json_document CLOB
    CONSTRAINT ensure_json CHECK (json_document IS JSON));

I inserted a couple of records in the table:
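As a sketch of such an insert (the JSON structure is illustrative, modelled on the analyses array queried below):

INSERT INTO ssn_json (date_loaded, json_document)
VALUES (SYSDATE,
        '{"analyses":[{"id":1,"result":"OK"},{"id":2,"result":"NOK"}]}');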


Here's the JSON in the json_document column in more detail:

Now, to query the JSON data I can do:


You basically say j (table) . json_document (column) . analyses (first entry in the JSON) . ... (other fields of the hierarchy you want to navigate to).
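As a sketch, such a dot-notation query could look like this (note that dot notation requires a table alias):

SELECT j.json_document.analyses
  FROM ssn_json j;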
 
The data I get back is actually two records, because the query returns the whole JSON array.

To go into the array and see for example the first record, I can use json_value:


Note that the array index starts at 0.
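For example, a json_value call along these lines returns a single scalar from the first array entry (the result field name is illustrative):

SELECT json_value(json_document, '$.analyses[0].result')
  FROM ssn_json;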

And finally to see both records that are in the JSON array I can make use of json_table like this:
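A minimal json_table sketch along those lines (column names and paths are illustrative):

SELECT jt.*
  FROM ssn_json s,
       json_table(s.json_document, '$.analyses[*]'
         COLUMNS (id     NUMBER       PATH '$.id',
                  result VARCHAR2(20) PATH '$.result')) jt;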


The above is just a grasp of what you can do with JSON directly in the database (12c) by using SQL.
If you're interested in reading more about how to manipulate JSON in the database, have a look at the documentation; it contains a wealth of information with great examples.

As you can basically query the JSON with SQL, you can use this SQL in your APEX reports... in the coming days I'll show you a real case where I used the above techniques to do some interesting searching in data.
Categories: Development

run sudo, ssh, password, su in simulated interactive mode

Laurent Schneider - Fri, 2015-04-17 07:58

Some commands do not like non-interactive mode


$ passwd <<EOF
> oldpassword
> newpassword
> newpassword
> EOF
Changing password for user lsc.
Current password for lsc@example.com: passwd: Authentication token manipulation error
$ echo oraclepassword | su - oracle
standard in must be a tty
$ echo sudopassword | sudo su - oracle
[sudo] password for lsc:
sudo: sorry, you must have a tty to run sudo

But OK, if you really want to run those in a script, you have plenty of clean (no hacks there) ways of doing this.

For instance, let’s use a screen called BAR.


$ xterm -e "screen -S BAR" &
[1]     31732

Now we have an interactive terminal. It could be redirected to a framebuffer device if no X server is started.

Not really a human device, but an interactive terminal.

Now let’s send stuff


$ CR="$(echo '\r')"
$ screen -S BAR -X stuff "sudo su - oracle$CR"
$ screen -S BAR -X stuff "sudopassword$CR"
$ screen -S BAR -X stuff "id > /tmp/xxx$CR"
$ screen -S BAR -X stuff "exit$CR"
$ screen -S BAR -X stuff "exit$CR"
[1] +  Done                    xterm -e "screen -S BAR" &
$ cat /tmp/xxx
uid=100(oracle) gid=100(dba) groups=100(dba)

Usual disclaimer: it is a bad security practice to hardcode your passwords in scripts. Use this only if you really understand security. Read man openssl to learn how to use openssl to encrypt your password. Ask your security friends before trying this.

Repeating another SQLCL EA release

Kris Rice - Fri, 2015-04-17 06:57
It's been a month, and in that time a lot of bugs have been fixed and new features have been added. That adds up to time for another EA of our new sqlcl. The third sqlcl EA is out now and can be found here on the SQL Developer 4.1 download page. Not sure what this sqlcl thing is? Check here.

LOAD: Making CSV loading to the db very easy. Here's a very quick example of the possibilities.
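The quick example itself was an embedded image; as a rough, unverified sketch of the idea, loading a CSV into a table comes down to a single command (table and file names are illustrative, and the exact syntax may differ in the EA build):

SQL> load emp /tmp/emp.csv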

Notes on indexes and index-like structures

DBMS2 - Thu, 2015-04-16 16:42

Indexes are central to database management.

  • My first-ever stock analyst report, in 1982, correctly predicted that index-based DBMS would supplant linked-list ones …
  • … and to this day, if one wants to retrieve a small fraction of a database, indexes are generally the most efficient way to go.
  • Recently, I’ve had numerous conversations in which indexing strategies played a central role.

Perhaps it’s time for a round-up post on indexing. :)

1. First, let’s review some basics. Classically:

  • An index is a DBMS data structure that you probe to discover where to find the data you really want.
  • Indexes make data retrieval much more selective and hence faster.
  • While indexes make queries cheaper, they make writes more expensive — because when you write data, you need to update your index as well.
  • Indexes also induce costs in database size and administrative efforts. (Manual index management is often the biggest hurdle for “zero-DBA” RDBMS installations.)

2. Further: 

  • A DBMS or other system can index data it doesn’t control.
    • This is common in the case of text indexing, and not just in public search engines like Google. Performance design might speak against recopying text documents. So might security.
    • This capability overlaps with but isn’t exactly the same thing as an “external tables” feature in an RDBMS.
  • Indexes can be updated in batch mode, rather than real time.
    • Most famously, this is why Google invented MapReduce.
    • Indeed, in cases where you index external data, it’s almost mandatory.
  • Indexes written in real-time are often cleaned up in batch, or at least asynchronously with the writes.
    • The most famous example is probably the rebalancing of B-trees.
    • Append-only index writes call for later clean-up as well.

3. There are numerous short-request RDBMS indexing strategies, with various advantages and drawbacks. But better indexing, as a general rule, does not a major DBMS product make.

  • The latest example is my former clients at Tokutek, who just got sold to Percona in a presumably small deal — regrettably without having yet paid me all the money I’m owed. (By the way, the press release for that acquisition highlights TokuDB’s advantages in compression much more than it mentions straight performance.)
  • In a recent conversation with my clients at MemSQL, I basically heard from Nikita Shamgunov that:
    • He felt that lockless indexes were essential to scale-out, and to that end …
    • … he picked skip lists, not because they were the optimal lockless index, but because they were good enough and a lot easier to implement than the alternatives.
  • Red-black trees are said to be better than B-trees. But they come up so rarely that I don’t really understand how they work.
  • solidDB did something cool with Patricia tries years ago. McObject and ScaleDB tried them too. Few people noticed or cared.

I’ll try to explain this paradox below.

4. The analytic RDBMS vendors who arose in the previous decade were generally index-averse. Netezza famously does not use indexes at all. Neither does Vertica, although the columns themselves played some of the role of indexes, especially given the flexibility in their sort orders. Others got by with much less indexing than was common in, for example, Oracle data warehouses.

Some of the reason was indexes’ drawbacks in terms of storage space and administrative overhead. Also, sequential scans can be much faster from spinning disk than more selective retrieval, so table scans often outperformed index-driven retrieval.

5. It is worth remembering that almost any data access method brings back more data than you really need, at least as an intermediate step. For starters, data is usually retrieved in whole pages, whether you need all their contents or not. But some indexing and index-alternative technologies go well beyond that.

  • To avoid doing true full table scans, Netezza relies on “zone maps”. These are a prominent example of what is now often called data skipping.
  • Bloom filters in essence hash data into a short string of bits. If there’s a hash collision, excess data is returned.
  • Geospatial queries often want to return data for regions that have no simple representation in the database. So instead they bring back data for a superset of the desired region, which the DBMS does know how to return.

6. Geospatial indexing is actually one of the examples that gave me the urge to write this post. There are two main geospatial indexing strategies I hear about. One is the R-tree, which basically divides things up into rectangles, rectangles within those rectangles, rectangles within those smaller rectangles, and so on. A query initially brings back the data within a set of rectangles whose union contains the desired region; that intermediate result is then checked row by row for whether it belongs in the final result set.

The other main approach to geospatial indexing is the space-filling curve. The idea behind this form of geospatial indexing is roughly:

  • For computational purposes, a geographic region is of course a lattice of points rather than a true 2-dimensional continuum.
  • So you take a lattice — perhaps in the overall shape of a square — and arrange its points in a sequence, so that each point is adjacent in some way to its predecessor.
  • Then regions on a plane are covered by subsequences (or unions of same).

The idea gets its name because, if you trace a path through the sequence of points, what you get is an approximation to a true space-filling curve.
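One classic instance is the Z-order (Morton) curve, which sequences lattice points by interleaving the bits of their coordinates. A rough Python sketch, purely to illustrate the idea:

def morton(x, y, bits=16):
    # Interleave the bits of x and y into a single Z-order key.
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)      # even bit positions come from x
        key |= ((y >> i) & 1) << (2 * i + 1)  # odd bit positions come from y
    return key

# Nearby points tend to receive nearby keys, so a 2-D region can be
# approximated by a small number of 1-D key ranges.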

7. And finally — mature DBMS use multiple indexing strategies. One of the best examples of a DBMS winning largely on the basis of its indexing approach is Sybase IQ, which popularized bitmap indexing. But when last I asked, some years ago, Sybase IQ actually used 9 different kinds of indexing. Oracle surely has yet more. This illustrates that different kinds of indexes are good in different use cases, which in turn suggests obvious reasons why clever indexing rarely gives a great competitive advantage.

Categories: Other

Want to Outperform Your Competitors? 4 Ways to Serve Up a Cloud Advantage

Linda Fishman Hoyle - Thu, 2015-04-16 13:42

A Guest Post by Rod Johnson, Group Vice President, Applications Business Group, Oracle

With more and more companies moving towards the cloud, we wanted to find out how cloud adoption impacted the success of these businesses. To do this, Oracle sponsored a global study, “Cloud Computing Comes of Age,” conducted by Harvard Business Review Analytic Services.

What did we find out?

  1. IT is not your opponent. IT and LOB share equal responsibility for cloud in high performing organizations.
  2. Move further, faster. Cloud leaders are not only more likely to use cloud across the top five functions (recruiting, marketing, sales, training, travel/expense management), but are also much more likely to be pushing cloud into more core business functions including procurement, supply chain and accounting.
  3. Be a role model. More than twice as many cloud leader respondents said that their CIOs had taken a leadership role in the move to the cloud compared to cloud novices (62 percent to 31 percent). These CIOs value the agility and efficiency cloud provides and they’ve made it a part of every conversation.
  4. Play to win. Over the past 3 years, 72 percent of cloud leaders launched new products, 62 percent entered new markets, 55 percent expanded geographically, and 39 percent launched new businesses.

The survey shows the clear business benefits of adopting cloud computing, but also highlights important insights for organizations that are looking to capitalize on the opportunities presented by the cloud.  IT and business leaders must work together to promote a more holistic cloud strategy if their organizations are to benefit from the next wave of cloud computing.

 Find out more about the study here.

Oracle Priority Support Infogram for 16-APR-2015

Oracle Infogram - Thu, 2015-04-16 13:36

Time to Patch!
From the Oracle E-Business Suite Technology blog:
Critical Patch Update for April 2015 Now Available
APEX
From OTN DBA/DEV Watercooler: Oracle APEX 5 Is Here
SOA
SOA Suite 12c: Topology Suggestions, from the SOA & BPM Partner Community Blog.
From the AMIS Technology Blog: Use Oracle Stream Explorer and the Service Execution Reporter policy to analyze service behavior – find too-late-closing flights on Saibot Airport
Data Integrator
Oracle Data Integrator Enterprise Edition Advanced Big Data Option Part 1- Overview and 12.1.3.0.1 install, from RittmanMead.
Java
YouTube: WADL-Driven REST Client Generation, from Geertjan’s Blog.
At The Aquarium everyone seems to be getting ready for Java EE @ GIDS 2015
From the same blog:
JSON-P 1.1 (JSR 374) update: JSON Pointer & JSON Patch
Last Call to Submit to the JavaOne Java EE Track
Java EE Security API (JSR 375) Update
WebLogic
WLS Tip: Installing WebLogic with Chef, from The WebLogic Server Blog.
Analytics
Oracle Business Analytics on YouTube.
Data Warehouse
Statement of Direction -- Big Data Management System, from The Data Warehouse Insider.
MAF
From WebLogic Partner Community EMEA:
Reading MAF iOS Simulator Logging Output
MAF 2.0: Using Local Database
Coherence
From Making Software Work: Warming a Coherence Cache using HotCache
IT Opinion
On Choosing Mature Technologies, from Oracle's MySQL Blog.
EBS
From the Oracle E-Business Suite Support blog:
OM Suite 12.1.3+ Recommended Patch Collections Now Available!
A Brief Discussion of Globalization Profile Options
New to the Procurement Accounting Space - Introducing the EBS Procurement Accrual Accounting Analyzer
Functional Impact Assessment Documents List for E-Business Suite Release 12.1 & 12.2
From the Oracle E-Business Suite Technology blog:
Critical Patch Update for April 2015 Now Available

EBS OA Extensions for JDeveloper Certified with Windows 8.1

Data Integration Tips: ODI 12.1.3 – Convert to Flow

Rittman Mead Consulting - Thu, 2015-04-16 13:23

Many who have already upgraded Oracle Data Integrator from the 11g version to 12c probably know about this great feature called “convert to flow”. If not, well…here you go!

First, a little background on why I think this is an excellent bit of functionality. The ODI Upgrade Assistant converts objects from 11g to 12c and it does a pretty decent job of it. When converting Interfaces, the upgrade process creates a Mapping in ODI 12c by taking the logical “mapping” layout and loading it into a Dataset object. I assume the reason is that it isn’t easy to convert an Interface directly to a fully flow-based mapping, which is what you would typically develop in ODI 12.1.3 rather than using the limited Dataset (only joins, filters, and lookups allowed). After the upgrade, you are then stuck with loads of mappings that do not use the latest flow-based features and components.


Now, in ODI 12.1.3, we have the ability to convert our Dataset into the standard ODI 12c flow based components within the Mapping. With a right-click on the Dataset component, we can see the “Convert to Flow” option.


Select Convert to Flow and accept the warning that our Mapping will be changed forever…and boom! No more Dataset!

This is great for my individual Mappings, but now I want to convert my migrated Reusable Mapping Datasets to flow based components.

reusable-convert-to-flow-missing-ds

Wait, what? No option to Convert to Flow! It looks like the Reusable Mappings (which were upgraded from my ODI 11g Temporary Interfaces) cannot be converted to flow for some reason. Hmm… Well, let’s finish converting my Datasets to flow based components for the rest of my 342 upgraded Mappings…one-by-one. Yikes! Actually, we can find a better way to do this. Time to get Groovy with the ODI SDK!

Using Groovy, I can create a simple script to loop through all of my mappings, find the dataset, and call the convertToFlow function on that dataset component. Here’s a look at the guts of the script.

// mappingsList is assumed to have been populated earlier in the script,
// e.g. via an ODI SDK mapping finder against the work repository.
for (mapping in mappingsList) {
  // find every Dataset component in this mapping
  def componentsList = mapping.getAllComponentsOfType("DATASET")

  for (component in componentsList) {
    // convertToFlow fills this list with any conversion issues it finds
    java.util.List convertIssues = new ArrayList()

    try {
      def blnConvert = component.convertToFlow(convertIssues)

      if (blnConvert) {
        for (item in convertIssues) {
          out.println item.toString()
        }
      }
    } catch (Exception e) {
      out.println e
    }

    out.println mapping.getName() + " had a dataset converted to flow."
  }
}

Just remember to pass the results list object as a parameter to the convertToFlow call (and make sure the List object is properly instantiated as an ArrayList – as I was humbly reminded by David Allan via Twitter!). Once completed, you should be able to open each mapping and see that the dataset has been removed and only flow-based components exist.

Excellent, now we’ve completed our conversion in no time at all. But wait, what about those Reusable Mappings? Remember, we don’t have the right-click menu option to convert to flow as we did with the standard Mapping. Well, let’s see what our friend the ODI SDK has to say about that!

With a slight tweak to the code, replacing the Mapping classes with ReusableMapping classes, we can batch convert our Reusable Mapping dataset components to flow-based components in an instant. The reason it works via the API is the inheritance of the ReusableMapping class: it inherits the same component methods from the interface oracle.odi.domain.mapping.IMapComponentOwner, which in turn provides the same methods and functions, such as convertToFlow, as we had available in the Mapping class. I’m not quite sure why ODI Studio doesn’t expose “Convert to Flow” for Reusable Mappings, but I’m sure it’s a simple fix we’ll see in an ODI 12c release down the road.
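As a hedged sketch of that tweak (reusableMappingsList is assumed to have been fetched earlier via the corresponding Reusable Mapping finder):

for (rm in reusableMappingsList) {
  for (component in rm.getAllComponentsOfType("DATASET")) {
    java.util.List convertIssues = new ArrayList()
    // same convertToFlow API, inherited via IMapComponentOwner
    component.convertToFlow(convertIssues)
    out.println rm.getName() + " had a dataset converted to flow."
  }
}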

So there you have it, another Data Integration Tip from Rittman Mead – this time, a little help post-migration from ODI 11g to ODI 12c. If you would like more details on how Rittman Mead can help your migration of Oracle Data Integrator to the latest version, send us a note at info@rittmanmead.com. We’d love to help!

 

Categories: BI & Warehousing

OAUX Emerging Technologies RD&D Strategy

Oracle AppsLab - Thu, 2015-04-16 07:46

Speaking of strategies, Misha (@mishavaughan) asked me to write up an article–not a post, there’s a difference–describing how this team goes about its business, i.e. researching, designing and developing solutions for the emerging technologies that will affect our users in the near and not-so-near future.


You can, and should, read the resulting article over at the mothership, Usableapps (@usableapps). Check it out:

New emphasis on emerging technology shapes Oracle’s user experience strategy

Floyd (@fteter) read it, and so should you because why not?


Surprise, there’s method to the madness. It may look like we just play with toys, and while that’s partially true, we’ve always played with purpose.

Thinking back on the eight years I’ve been doing this, I don’t recall ever outlining and presenting a strategy at this level, and the whole exercise of putting the strategy I have in my head into words and slides was enlightening.

Point of fact, we’ve always had a strategy, and it hasn’t changed much, although the technologies we investigate have.

Serious h/t to Paul (@ppedrazzi) in the early years, and Jeremy (@jrwashley) more recently, for shaping, advancing, and fostering the AppsLab vision.

Anyway, now you know where we invest our time and why; or if you knew that already, you now have a handy article to refer to, should you need a refresher or should you be enlightening someone new to the party.

Enjoy.

Interesting Comment on Pearson’s LMS Plans From Customer

Michael Feldstein - Thu, 2015-04-16 06:20

By Phil Hill

On April 1, long-time eCollege (aka Pearson’s LearningStudio) customer Texas Christian University (TCU) gave an update on their LMS selection process to the student newspaper TCU360. In this article there was an interesting statement[1] worth exploring [emphasis added].

“eCollege” will soon be a thing of the past.

TCU has narrowed its search for a Learning Management System to two platforms, Blackboard and Desire2Learn (D2L).

“We’ve had feedback, from faculty specifically, that it’s time for change,” Assistant Provost of Educational Technology and Faculty Development Romy Hughes said.

TCU has used Pearson’s Learning Studio system since 1999.

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.”

That last comment got my attention. The eCollege / LearningStudio platform has been around for a long time, and there have been questions about where Pearson was going in the LMS market based on 2011’s introduction of OpenClass. Would OpenClass replace LearningStudio over time, and would it strongly change the LMS market? Would both OpenClass and LearningStudio continue as standalone LMS products? It is quite clear by now that OpenClass itself has not changed the market, but LearningStudio has a long-time customer base of fully online programs – many in the for-profit sector.

Furthermore, with Pearson’s reorganization around efficacy, their core document states (p. 14):

The overarching idea was that our investments should be driven towards those products which deliver the highest impact for learners while sustaining us financially so we can continue to invest in new models and improvements.

There is a question of whether Pearson’s internal reviews around LearningStudio and OpenClass are leading to strategic changes around their position in the LMS market.

I asked for Pearson to provide official comment, and David Daniels, president of Pearson Education, responded with the following clarification.

Pearson has not left the LMS space and will continue to invest in our current generation MyLabs and support our many customers on LearningStudio into the future. Pearson’s Learning Studio still powers over 3 Million enrollments annually in the fully remote, online learning space. Our commitment to servicing these students and their institutions is unwavering. Our focus has been and will be on how we support these students within the learning environment. Our range of support services includes learning design and assessment support, integration, data and analytics, student retention, tutoring, and technical support.

This statement is quite clear that there is no imminent end-of-life for LearningStudio, and it is also quite clear about their focus on the “fully remote, online learning space”. This system is primarily used by fully online programs, but there have been a handful of campus-wide clients such as TCU still using the system from the early days. That Pearson LearningStudio would not be appropriate for TCU’s future is partially explained by this focus on full online.

The statement does make an interesting distinction, however, between investing in MyLabs and supporting LearningStudio. My read is that Pearson is not investing in LearningStudio in terms of major product advances and next generation plans but is continuing to fully support current customers. My read is also that Pearson would add new customers to LearningStudio if part of a broader deal tied to content or online “enabling” services (such as Embanet), but that there is no plan for the company to compete in pure LMS competitions.

To help back up this reading, I discovered that the TCU360 article was updated as follows:

“Pearson is out of the learning management system game,” Hughes said. “We need something to evolve with the Academy of Tomorrow and where we’re moving to at TCU.” Hughes said Pearson withdrew from the LMS search process for TCU but remains an LMS provider.

At TCU, at least, the competition is down to Blackboard and D2L, with D2L in the driver’s seat. This competition is also notable by Canvas not being one of the finalists (haven’t seen this situation lately).

One final note on TCU’s selection process described in the article.

These percentages were based on a 214-item questionnaire called the Review Request for Information (RFI) document. These questions were used to assess whether or not a system had the features that TCU was looking for.

“Most LMS vendors told us it took them exactly three months to complete [the questionnaire] because there were so many specific details we were looking for,” Hughes said.

I’ve said it before and I’ll say it again – making a strategic platform selection by a laundry list of hundreds of detailed feature requirements is not a healthy process. I would not brag that it took vendors three full months to complete a questionnaire. But we have one more example to clarify Michael’s classic “Dammit, the LMS” post.

Do you want to know why the LMS has barely evolved at all over the last twenty years and will probably barely evolve at all over the next twenty years? It’s not because the terrible, horrible, no-good LMS vendors are trying to suck the blood out of the poor universities. It’s not because the terrible, horrible, no-good university administrators are trying to build a panopticon in which they can oppress the faculty. The reason that we get more of the same year after year is that, year after year, when faculty are given an opportunity to ask for what they want, they ask for more of the same.

I’d be willing to bet that the vast majority of those 214 items in the RFI are detailed features or direct derivatives of what TCU already has. Even if I’m wrong, it makes little sense for a school to specify the future with detailed requirements; they’re selecting a vendor, not specifying a new design. I wish TCU the best in their LMS selection process, but I would recommend that they put more emphasis on strategic analysis and less on counting check-boxes.

  1. Statement from the original article before it was updated.

The post Interesting Comment on Pearson’s LMS Plans From Customer appeared first on e-Literate.

Case Study: Oracle Forms Migration to Formspider

Gerger Consulting - Thu, 2015-04-16 01:55
TEAM GmbH, a Formspider customer from Germany, has successfully migrated their product ProStore from Oracle Forms to Formspider. We recently did a joint webinar with TEAM GmbH about their project. In particular we talked about the following topics:

- What other products did TEAM look at?

- Why did TEAM choose Formspider?

- What are the benefits of using Formspider?

- What is it like to work with the company behind Formspider?

- What was TEAM’s approach to Forms Migration?

- A demo of the application

The webinar was quite popular with over 50 attendees. TEAM Division Manager Frank Zscherlich and Product Manager Michael Wibberg answered many questions from the audience.

Below is the recording of the webinar. If you are looking for a way to migrate your Oracle Forms applications to a newer technology, I highly recommend you watch it.


Oracle Forms Migration to Formspider from Yalim K. Gerger on Vimeo.

You can also watch a short demo of the migrated application below.


Demo of an Oracle Forms Application Migrated to Formspider from Yalim K. Gerger on Vimeo.
Categories: Development

C15LV: Nothing to BLOG About - Think Again

Yann Neuhaus - Wed, 2015-04-15 19:58

Before going to the airport to come back to Europe, the last session I attended at IOUG Collaborate was a panel: Nothing to BLOG About - Think Again.

Some famous bloggers were there to help, motivate, and inspire people who are afraid to blog. The idea is good. The session was good. The panelists shared ideas, tips, and experience. It was very good.

I'm already a blogger, but that inspired me anyway. In the same spirit, I'll explain how I started blogging, just in case it gives someone ideas.

Note that tomorrow I speak about participating in the community (read, share, write, talk, blog, tweet, link,...) at our internal event, dbi xChange. Thus, waiting at the airport, it's a good occasion to explain how I started blogging, in case it gives some ideas to people who want to start.

You Are Trying To Access a Page That Is No Longer Active.The Referring Page May Have Come From a Previous Session. Please Select Home To Proceed

Vikram Das - Wed, 2015-04-15 16:06
Shahed pinged me about this error. It was coming up after logging in. This R12.1.3 instance had just been migrated from an old server to a new one. Once you logged in, this error would be displayed:

You Are Trying To Access a Page That Is No Longer Active.The Referring Page May Have Come From a Previous Session. Please Select Home To Proceed

The hits on support.oracle.com were not helpful, but they gave a clue that it may have something to do with the session cookie. So I used Firefox to check the HTTP headers. If you press Ctrl+Shift+K, you get a panel at the bottom of the browser. Click on the Network tab, click on AppsLocalLogin.jsp, and on the right side of the pane you'll see a cookie tab.

The domain appearing in the cookie tab was from the old server.  So I checked:

select session_cookie_domain from icx_parameters;
olddomain.justanexample.com

So I nullified it:

update icx_parameters set session_cookie_domain=null;

commit;

Restarted Apache

cd $ADMIN_SCRIPTS_HOME
adapcctl.sh stop
adapcctl.sh start

No more error.  I was able to log in and so was Shahed.
Categories: APPS Blogs

Mobile - Get Proactive!

Joshua Solomin - Wed, 2015-04-15 14:39
Get Proactive!

SQL Monitor report little improvement in 12c

Yann Neuhaus - Wed, 2015-04-15 14:35

This is a very short post about something I thought I had already blogged about.

I like SQL Monitoring active reports. They are graphical, yet still very light. There is only one thing I don't like about them in 11g, and it has been improved in 12c.

Oracle APEX 5.0 released today

Dimitri Gielis - Wed, 2015-04-15 14:34
After 2.5 years of development, today is the day APEX 5.0 is publicly released and ready to be downloaded and installed in your own environment.

In my view it's the best release ever. Not so much because of the new Page Designer - although that is definitely a piece of art and it increases productivity even further - but because it's the first time the whole of APEX got refreshed and every piece was put under the radar to see how it could be improved. All the small changes and the new UI, together with the Page Designer, make it a whole new development tool, without losing its strengths from before.

Also note that APEX 5.0 enables many new features built on top of the Oracle Database 12c features, so if you're on that database, you'll see even more nice features.

If you're wondering whether you should wait to upgrade to APEX 5.0 because you're afraid your current APEX applications will break, I can only share that I upgraded many of my applications as part of EA/beta and most of my apps kept running without issues. As always, you have to try your applications yourself, but the APEX development team spent a lot of time trying to keep things backwards compatible. Make sure to have a look at the APEX 5.0 release notes and known issues, as they contain important information about changes, expected behaviour and workarounds.

You can develop online on apex.oracle.com or you can download APEX 5.0 and install into your own environment.
Categories: Development

Oracle Application Express 5 - The Unofficial Announcement

Joel Kallman - Wed, 2015-04-15 13:35
What started on a whiteboard in New York City more than 2 years ago is now finally realized.  I and the other members of the Oracle Application Express team proudly announce the release of Oracle Application Express 5.

The official blog posting and announcement is HERE.  But this is my personal blog, and the thoughts and words are my own, so I can be a bit more free.

Firstly, I don't ever want to see a release of Oracle Application Express take 2.5 years again, ever.  It's not good for Oracle, not good for Oracle Application Express, and certainly not good for the vast Oracle Application Express community.  We're going to strive, going forward, for a cadence of annual release cycles.  But with this said, I'm not about to apologize for the duration of the APEX 5 release cycle either.  It's broader and more ambitious than anything we've ever approached, and it happened the way it was supposed to happen.  Rather than say "redesigned", I'd prefer to use Shakeeb's words of "reimagined", because that's really what has transpired.  Not only has every one of the 1,945 pages that make up "internal APEX" (like the Application Builder) been visited, redesigned, and modernized, but the Page Designer is a radically different yet productive way to build and maintain your applications.  It takes time to iterate to this high level of quality.

At the end of the day, what matters most for developers is what they can produce with Oracle Application Express.  They'd gladly suffer through the non-Page Designer world and click the mouse all day, as long as what they produced and delivered made them a hero.  And I believe we have delivered on this goal of focusing on high-quality results in the applications you create.  I've seen my share of bad-looking APEX applications over the years, and with prior releases of APEX, we've essentially enabled the creation of these rather poor examples of APEX.  Not everyone is a Shakeeb or Marc.  I'm not.  But we've harnessed the talents of some of the brightest minds in the UI world, who also happen to be on the APEX development team, and delivered a framework that makes it easy for ordinary people like me to deliver beautiful, responsive and accessible applications, out-of-the-box.

What I'm most happy about is what this does for the Oracle Database.  I believe APEX 5 will make superheroes out of our Oracle Database and Oracle Database Cloud customers.  There is a massive wealth of functionality for application developers and data architects and citizen developers and everyone in-between, in the Oracle Database.  And all of it is a simple SQL or PL/SQL call away!  The Oracle Database is extraordinarily good at managing large amounts of data and helping people turn data into information.  And now, for customers to be able to easily create elegant UI and be able to beautifully visualize this information using Oracle Application Express 5, well...it's just an awesome combination.

I am blessed to work with some of the brightest, most focused, professional, talented, and yet humble people on the planet.  As my wife likes to say, they're all "quality people".  It truly takes an array of people who are deep in very different technologies to pull this off - Oracle Database design, data modeling, PL/SQL programming, database security, performance tuning, JavaScript programming, accessibility, Web security, HTML 5 design, CSS layout, graphic artistry, globalization, integration, documentation, testing, and on and on.  Both the breadth and depth of the talent to pull this off is staggering.

You might think that we get to take a breath now.  In fact, the fun only begins now and plenty of hard work is ahead for all of us.  But we look forward to the great successes of our many Oracle customers.  The #orclapex community is unrivaled.  And we are committed to making heroes out of every one of them.  That's the least we could do for the #orclapex community, such an amazingly passionate and vibrant collection of professionals and enthusiasts.

When anyone asks about the "watershed event" for Oracle Application Express, you can tell them that the day was April 15, 2015 - when Oracle Application Express 5 was released.

Joel

P.S.  #letswreckthistogether

Chrome and E-Business Suite

Vikram Das - Wed, 2015-04-15 12:23
Dhananjay came to me today.  He said that his users were complaining about forms not launching after upgrading to the latest version of Chrome. On launching forms they got this error:

/dev60cgi/oracle forms engine Main was not found on this server

I recalled that the Google Chrome team had announced that they would not support Java going forward. Googling with the keywords chrome java brought up this page:

https://java.com/en/download/faq/chrome.xml#npapichrome

It states that:

NPAPI support by Chrome

The Java plug-in for web browsers relies on the cross-platform plugin architecture NPAPI, which has long been, and currently is, supported by all major web browsers. Google announced in September 2013 plans to remove NPAPI support from Chrome by "the end of 2014", thus effectively dropping support for Silverlight, Java, Facebook Video and other similar NPAPI based plugins. Recently, Google has revised their plans and now state that they plan to completely remove NPAPI by late 2015. As it is unclear if these dates will be further extended or not, we strongly recommend Java users consider alternatives to Chrome as soon as possible. Instead, we recommend Firefox, Internet Explorer and Safari as longer-term options. As of April 2015, starting with Chrome Version 42, Google has added an additional step to configuring NPAPI based plugins like Java to run — see the section Enabling NPAPI in Chrome Version 42 and later below.

Enabling NPAPI in Chrome Version 42 and later

As of Chrome Version 42, an additional configuration step is required to continue using NPAPI plugins.

  1. In your URL bar, enter:
    chrome://flags/#enable-npapi
  2. Click the Enable link for the Enable NPAPI configuration option.
  3. Click the Relaunch button that now appears at the bottom of the configuration page.

Developers and system administrators looking for alternative ways to support users of Chrome should see this blog, in particular the "Running Web Start applications outside of a browser" and "Additional Deployment Options" sections.

Once Dhananjay did the above steps, Chrome started launching forms again. He quickly gave these steps to all his users who had upgraded to the latest version of Chrome (version 42), and it started working for them too. Oracle doesn't certify E-Business Suite forms on Chrome. Only the self-service pages of E-Business Suite are certified on Google Chrome.
Categories: APPS Blogs