Feed aggregator

China May Lead Patenting, How About Innovation?

Khanderao Kand - Wed, 2010-10-06 19:08
The title is self-explanatory. My blog post below links to the stories:

http://texploration.wordpress.com/2010/10/06/china-may-lead-patenting-how-about-innovation/

Larry, Fusion Apps, and SOA Middleware Technology

Khanderao Kand - Wed, 2010-10-06 17:36
At Oracle Open World 2010, in his eagerly watched keynote, Larry Ellison referred glowingly to Oracle Fusion Apps (OFA) as Oracle's largest engineering project in recent times. He covered some important aspects of the Oracle Fusion Apps. He said that it is the first ERP application completely based on standards. While terming it "never done before", he specifically mentioned BPEL! He mentioned that Fusion Apps is all about intelligence and not just process automation. Per Larry, Fusion Apps has a wonderful web 2.0-based, easy-to-use UI and search capabilities. Most importantly, he specifically mentioned that Fusion Applications is based on SOA technology. In fact, this was on his slide deck. All of us who worked on the SOA-BPM stack must have been proud to hear Larry say that.

At the end of his keynote, Larry invited Steve Miranda, who is in charge of Fusion Apps, to demonstrate the product. You must have seen the Worklist apps and might have immediately recognized the SOA Suite serving the business process. After years of intense work, hearing such great comments is a rejuvenating, memorable moment for all of us who worked on the product and helped make Fusion Apps happen!

Resolving timeouts while using ejb beans for SOA invocation - deployment

Khanderao Kand - Tue, 2010-10-05 16:41
If you are using AS11 BPEL's APIs for deployment or invocation and you are facing timeouts, and the task really does need more time, then you may need to increase the EJB timeout:


1. Find ejb_ob_engine_wls.jar in your deployment.
2. Modify META-INF/weblogic-ejb-jar.xml.
3. Increase the timeout (a sketch follows below).
4. Re-jar and replace the original jar.
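
As an illustration only, the edit in step 3 might look something like the fragment below inside the existing descriptor; the ejb-name is a placeholder (use the bean names you actually find in ejb_ob_engine_wls.jar), and 600 seconds is just an example value:

<weblogic-enterprise-bean>
  <!-- the name below is hypothetical; repeat for each bean whose calls time out -->
  <ejb-name>SomeEngineBean</ejb-name>
  <transaction-descriptor>
    <trans-timeout-seconds>600</trans-timeout-seconds>
  </transaction-descriptor>
</weblogic-enterprise-bean>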

Selectively turning off the service engines in Oracle SOA AS11

Khanderao Kand - Tue, 2010-10-05 16:05
Thanks to the Spring-based implementation of Oracle SOA AS11, if you are not using B2B, the mediator, the decision service, or any other service engine, you can turn it off by removing its reference bean in the fabric's Spring config (fabric-config.xml); an illustrative sketch follows below. Just be careful though! You may accidentally remove something that is needed :-) This can bring memory consumption down by hundreds of MBs.
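
Purely as an illustrative sketch (the actual bean ids and classes in fabric-config.xml vary by SOA Suite version, so every name below is hypothetical), turning off an unused engine amounts to commenting out its reference bean in the Spring config:

<!-- fabric-config.xml fragment; ids and classes are placeholders, not the real ones -->
<beans xmlns="http://www.springframework.org/schema/beans">

  <!-- engines you actually use stay as they are -->
  <bean id="bpelServiceEngine" class="..."/>

  <!-- engines you do not use, e.g. B2B or the decision service, can be commented out -->
  <!--
  <bean id="b2bServiceEngine" class="..."/>
  <bean id="decisionServiceEngine" class="..."/>
  -->

</beans>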

Integration technologies for Cloud

Khanderao Kand - Tue, 2010-10-05 15:20
2010 is seeing cloud computing and mobile computing at their 'tipping points'. Many startups are being launched in these areas. With many applications being SaaS offerings hosted in the cloud, the next requirement will be how to connect them securely and reliably. What would be the right technologies for this? This need will certainly drive cloud-based integration technologies - SOA technologies.

Back to Presenting...

Lisa Dobson - Tue, 2010-10-05 13:02
The UKOUG Technology and E-Business Suite Conference 2010 is getting closer and the agenda is now live. I was lucky enough to get a paper accepted, so I will be presenting on the Monday afternoon. My work situation last year meant that I was unable to commit to submitting a paper, although I was still able to attend as a delegate in the end, so I'm really looking forward to getting back to presenting.

ADF Faces: How to Prevent Closing of Dialog When Pressing Enter Key

JHeadstart - Tue, 2010-10-05 00:05

The af:dialog element supports a number of built-in buttons that can be displayed at the bottom of the dialog. The type property governs which buttons are displayed, like OK/Cancel or Yes/No. If the type property is not set, OK and Cancel buttons are displayed by default.
When using the built-in buttons, the first button is the default button. If you press the Enter key, the dialog will close because the default button action is executed.

If your dialog window contains an af:subform with a default command button, and the cursor is in an input component inside the af:subform, pressing Enter closes the dialog instead of executing the default command associated with the subform.

To solve this, you can set the type property to "none" and use the buttonBar facet to include custom OK and Cancel buttons.
Here is an example:

<af:popup id="p0" contentDelivery="lazyUncached" eventContext="launcher">
  <af:dialog modal="true" title="My Dialog" type="none" id="dialog2"
             resize="on" stretchChildren="first">
    <af:region value="#{bindings.myTaskflowWithSubForm.regionModel}" id="region2"/>
    <f:facet name="buttonBar">
      <af:group id="g1">
        <af:commandButton text="OK" actionListener="#{myDialogBean.handleOK}" id="cb1"/>
        <af:commandButton text="Cancel" actionListener="#{myDialogBean.handleCancel}" id="cb2"/>
      </af:group>
    </f:facet>
  </af:dialog>
</af:popup>
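
The example references a myDialogBean with handleOK and handleCancel methods. A minimal sketch of such a bean is shown below, under two assumptions that are not part of the original example: the af:popup gets a binding="#{myDialogBean.popup}" attribute so the bean can close it, and your ADF 11g version exposes RichPopup.hide(). The actual OK/Cancel logic is of course up to you:

package view;   // hypothetical package and class, not part of the original example

import javax.faces.event.ActionEvent;
import oracle.adf.view.rich.component.rich.RichPopup;

public class MyDialogBean
{
  // populated via binding="#{myDialogBean.popup}" on the af:popup (an assumption)
  private RichPopup popup;

  public void handleOK(ActionEvent event)
  {
    // perform your OK logic here (validation, commit, ...), then close the dialog
    popup.hide();
  }

  public void handleCancel(ActionEvent event)
  {
    // perform your cancel logic here (rollback, cleanup, ...), then close the dialog
    popup.hide();
  }

  public void setPopup(RichPopup popup)
  {
    this.popup = popup;
  }

  public RichPopup getPopup()
  {
    return popup;
  }
}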

JHeadstart users who use the JHeadstart-generated List of Values will see this enhanced behavior in the upcoming 11.1.1.3 release, which means you can use the Enter key to perform a quick search in the LOV dialog window.

Categories: Development

ruby-plsql-spec gem and code coverage reporting

Raimonds Simanovskis - Mon, 2010-10-04 16:00

During the recent Oracle OpenWorld conference I presented the session "PL/SQL unit testing can be fun!", where I demonstrated how to do PL/SQL unit testing with Ruby.

The audience was quite interested and had a lot of questions, which motivated me to make some more improvements to ruby-plsql-spec to make it easier for newcomers.

ruby-plsql-spec gem and plsql-spec command line utility

Initially ruby-plsql-spec was just a repository of sample tests, and if you wanted to start using it in your project you had to manually pick the necessary files and copy them to your project directory.

Now ruby-plsql-spec is released as a gem which includes all the necessary dependencies (except ruby-oci8, which you should install separately if you are using the MRI Ruby implementation), and you can install it with

gem install ruby-plsql-spec

See more information about installation in the README file, or see the specific installation instructions for Windows.

When you have installed the ruby-plsql-spec gem and want to start using it in your existing project, go to your project directory and from the command line execute

plsql-spec init

It will create a spec subdirectory in the current directory where all the initial supporting files will be created. The main configuration file which should be updated is spec/database.yml, where you specify the username, password and database connection string that should be used when running tests:

default:
  username: hr
  password: hr
  database: orcl

If you specify just the database: name then it will be used as a TNS connection string (and the TNS_ADMIN environment variable should point to the directory where the tnsnames.ora file is located). Alternatively, you can provide hostname: and, if necessary, port: parameters, and then you can connect to the database without a tnsnames.ora file.
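
For example, a database.yml entry that connects without a tnsnames.ora file might look like this (the hostname and port values are just placeholders for your own environment):

default:
  username: hr
  password: hr
  database: orcl
  hostname: localhost
  port: 1521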

Now you can start to create your tests in the spec directory; your test file names should end with _spec.rb. You can see some examples in the ruby-plsql-spec examples directory.

To validate your installation you can create a simple dummy test in spec/dummy_spec.rb:

require "spec_helper"

describe "test installation" do
  it "should get SYSDATE" do
    plsql.sysdate.should_not be_nil
  end
end

And now from the command line you can try to run your test with:

plsql-spec run

If everything is fine you should see something like this:

Running all specs from spec/
.

Finished in 0.033782 seconds

1 example, 0 failures

Code coverage reporting

During my Oracle OpenWorld presentation I also showed how to get a PL/SQL code coverage report (which shows which PL/SQL code lines were executed during the test run). It might be useful when you want to identify which existing PL/SQL code is not yet covered by unit tests.

Now code coverage reporting is even easier with the new ruby-plsql-spec gem. It uses the Oracle database DBMS_PROFILER package to collect code coverage information, and I took the rcov report HTML and CSS files to present the results (so that they look very similar to Ruby code coverage reports).

To try code coverage reporting, let's create a simple PL/SQL function:

CREATE OR REPLACE FUNCTION test_profiler RETURN VARCHAR2 IS
BEGIN
  RETURN 'test_profiler';
EXCEPTION
  WHEN OTHERS THEN
    RETURN 'others';
END;

and a simple test to verify code coverage reporting:

require "spec_helper"

describe "test code coverage" do
  it "should get result" do
    plsql.test_profiler.should == 'test_profiler'
  end
end

And now you can run the tests with the --coverage option, which will produce a code coverage report:

plsql-spec run --coverage

As a result, code coverage reports are created in the coverage/ subdirectory. Open coverage/index.html in your browser, click on the TEST_PROFILER function, and you should see a report something like this:

You can see that the RETURN 'test_profiler'; line (with a green background) was executed by the test, but the RETURN 'others'; line (with a red background) was not. Lines with a light background are ignored by DBMS_PROFILER, and I do not take them into account when calculating the code coverage percentage (but they are taken into account when calculating the total coverage percentage).

Questions or feedback

If you have any other questions about using ruby-plsql-spec for PL/SQL unit testing then please post comments here, or if you find any issues when using ruby-plsql-spec then please report them on the GitHub issues page.

Categories: Development

ADF Faces: Avoid Use of &lt;jsp:include&gt; Tag!

JHeadstart - Wed, 2010-09-29 00:50

When reviewing ADF applications, I frequently see ADF Faces pages that use the <jsp:include/> tag to include another ADF Faces page fragment.

This is really old-school JSP programming and should be avoided:

  • Component ids in the included page fragment might conflict with component ids in the base page.
  • You cannot use it as a partial source or partial target in partial page refresh.
  • You cannot use JSF expression language in the <jsp:param> tag, since it was designed for JSP, not JSF.
  • You cannot customize this tag using MDS.

It is much better to use the native ADF Faces <af:declarativeComponent> tag, which is a first-class UI Component:
  • It is a naming container with its own id property, so you will never have id conflicts with the page using the component.
  • It has a partialTriggers property and can be specified as a partial target component by other UI components.
  • You can pass parameters using custom properties (see below) or the <f:attribute> tag, and you can use JSF expression language to specify parameter values.
  • You can customize the component using MDS.

So, replace code like this:

<jsp:include page="/includes/myIncludeFragment.jsff">
  <jsp:param name="myParam" value="myValue"/>
 </jsp:include>

with this:

<af:declarativeComponent id="dc0" viewId="/includes/myIncludeFragment.jsff" myParam="myValue"/>

Note the use of the custom property "myParam" inside the tag; this property should be defined as an attribute in the declarative component definition, as shown below:

<?xml version='1.0' encoding='UTF-8'?>
<jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" xmlns:f="http://java.sun.com/jsf/core"
          xmlns:af="http://xmlns.oracle.com/adf/faces/rich" version="2.1">
    <af:componentDef var="attrs">
        <af:xmlContent>
            <component xmlns="http://xmlns.oracle.com/adf/faces/rich/component">
                <description>my fragment</description>
                <attribute>
                    <attribute-name>myParam</attribute-name>
                    <attribute-class>java.lang.String</attribute-class>
                    <required>true</required>
                </attribute>
            </component>
        </af:xmlContent>
        <!-- Actual content goes here-->
        <af:outputText value="#{attrs.myParam}"/>
    </af:componentDef>
</jsp:root>

Categories: Development

PwC 2011 Global State of Information Security Survey

Simon Thorpe - Tue, 2010-09-28 06:57

PwC has just released the findings of an information security survey conducted by PricewaterhouseCoopers, CIO Magazine and CSO Magazine. The survey contains responses from more than 12,840 CEOs, CFOs, CIOs, CSOs, vice presidents and directors of IT and information security in 135 countries. Quite a wide audience. The report focuses on the business drivers for information security spending and reveals that, in general, spending on security has flatlined or even dropped in the past 12 months, mostly due to a lack of funds in a wildly unpredictable economic climate. There were some elements of the report I found intriguing given my knowledge of IRM and the problems it solves.

While the impacts of the downturn linger, risks associated with weaker partners have increased
So whilst organizations are not spending money on security, they do recognize that the risks of sharing information externally with partners are increasing, because their partners are also not investing in adequate security. It is an obvious point to make: no one is investing adequately in security, and yet there is a growing trend towards outsourcing, where more and more of your information is shared beyond your existing security perimeter. There is now a much higher risk in relying more on external partners for your business to be effective, but it is a necessary evil. What if that partner is your cloud storage provider and you are about to undertake a migration of your content onto their platform? Will it be secure?

Visibility of security breaches/incidents is increasing, as are the costs
The report also finds a healthy increase in awareness of security incidents. I would guess this is primarily an effect of regulatory requirements forcing the issue. More and more companies have to report data loss incidents, and they are therefore deploying technologies and processes to gain more visibility of these events.

[Chart: security incidents detected over the past 12 months (PWC-2010-SecurityIncidients12Months.png)]

Yet growing in the other direction is the cost awareness of data loss. In three years this number has doubled. So it's a simple summary: people know a lot more about the loss or breach of important information, and it is costing them more. The graph below shows the significant increase both in financial loss to the business and in the loss of critical intellectual property. These results tally with the issues we've seen in the news over the past year: GM losing masses of hybrid research, and Ford also losing lots of intellectual property. The health care industry is also reporting data loss incidents at an alarming rate.

[Chart: impacts to the organization (PWC-2010-ImpactsToOrganization.png)]

Another main area this risk is coming from is, to quote the report, "traced to employees and former employees, in line with the higher risks to security associated with salary freezes, job instability, layoffs and terminations and other HR challenges that rise during economic stress." The technology presenting the greatest risk is the social network. The channels of communication into and out of your business environment are increasing dramatically. It is no longer enough to monitor just email and the firewall: you also have to worry about USB devices, web-based storage and social networks, and a lot of this activity happens outside the office whilst people are at home, in a hotel or on the move with their cell phones.

How does IRM help?
So where does a document security solution like IRM play into this? First, let me summarize what I think all the research is telling us.

Companies are more aware of security incidents, and the threat is moving to partners who are not spending enough to secure your information. The costs of losing information are increasing, both in the impact to the business and in the technology you need to buy to defend against the loss in the first place. More and more ways to lose information are invading the enterprise, and often they are beyond your control.

So consider the following advantages of a document security solution like Oracle IRM.

  • IRM moves your security perimeter to the information itself. Instead of buying and deploying DLP, hard disk encryption and encrypted USB devices, simply deploy IRM: no matter where your sensitive documents and emails end up, they are only accessible by authorized persons and remain encrypted wherever they are stored.
  • IRM can allow users to open, edit and review documents but prevent them from copying information from the document into an untrusted environment such as Facebook, LinkedIn, or unprotected Word and Excel documents. Of course it may not take much for a user to retype the information, but one of the biggest issues around security is awareness. If a user can't easily copy information from a document, they know the information must be confidential.
  • Every single time an IRM-protected document is created, opened, printed or saved, it is audited. This dramatically increases the visibility of who is doing what with your information. Also, when end users know that by opening IRM documents they are leaving a trail of access, it decreases the likelihood that they will misuse that information.
  • IRM is easy to deploy. The biggest advantage of IRM by far is that once a document has been secured, you have total control over who can open it. So the simplest deployment, where you create one single classification for your entire business and secure all your confidential documents to it for use only by internal employees, is quick and easy to do. Right now most organizations have millions, nay billions, of documents floating around on partner file shares, employee laptops and the internet. IRM, in one simple deployment, brings a massive amount of value.
  • IRM does not suddenly impact your business effectiveness. Core to its design is a usable and scalable rights and classification model that puts the decision making on user access into the business. Enormous effort has been invested in making the use of Oracle IRM-protected documents simple and easy for authorized users.

Have a look at some of the videos on our YouTube channel, or get in touch if you'd like to know more about how this solution works.

Configure Oracle dbconsole...

Bas Klaassen - Tue, 2010-09-28 04:38
For a 10g database I wanted to check something in the database console. When trying to start this console, I received the following error: OC4J Configuration issue. Oracle/ora102/oc4j/j2ee/OC4J_DBConsole_.local_ not found. The ORACLE_HOME is a shared one, meaning different databases use the same ORACLE_HOME. So, because of this, there were some other directories, but not the one I needed.
Categories: APPS Blogs

UIShell with Dynamic Tabs: Marking the Current Tab Dirty

JHeadstart - Mon, 2010-09-27 20:26

THIS POST IS OBSOLETE, THE TECHNIQUE DESCRIBED BELOW HAS SOME ISSUES. A NEW AND BETTER IMPLEMENTATION IS DISCUSSED HERE:  http://blogs.oracle.com/jheadstart/entry/core_adf11_uishell_with_dynamic

The Dynamic Tabs UI Shell Template includes an API to mark the current tab as dirty or clean.
By marking a tab as dirty, the tab label will appear in italics, and when closing the tab, the end user will get an alert that pending changes will be lost. Nice functionality, but it is left to the ADF developer to write custom code to call this API.

In the upcoming 11.1.1.3 release of JHeadstart, we added support to automatically call this API by checking the data control state of the current tab. This is done using a custom phase listener that contains the following code:

package oracle.jheadstart.controller.jsf.listener;

import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;

import oracle.adf.controller.internal.binding.DCTaskFlowBinding;
import oracle.adf.model.BindingContext;
import oracle.adf.model.binding.DCBindingContainer;
import oracle.adf.model.binding.DCDataControl;

import oracle.adf.view.rich.context.AdfFacesContext;

import oracle.binding.BindingContainer;

import oracle.ui.pattern.dynamicShell.Tab;
import oracle.ui.pattern.dynamicShell.TabContext;

import org.apache.log4j.Logger;

/**
 * JHeadstart JSF PhaseListener.
 * When using the Dynamic Tabs template, the current tab state is set to dirty or clean,
 * depending on the state of the taskflow data control of the current tab.
 */
public class JhsPhaseListener
  implements PhaseListener
{

  private static Logger sLog = Logger.getLogger(JhsPhaseListener.class);

  public static final String PAGE_TEMPLATE_BINDING = "pageTemplateBinding";

  /**
   * While refreshing the current tab seems more effective to do only just before render response,
   * it turns out that when closing a tab, and then returning to a dirty tab, the data control
   * of the dirty tab is no longer seen as dirty, so we do it after the three phases that can
   * change the state of the current tab: apply request values, update model values and invoke application.
   */
  @Override
  public void afterPhase(PhaseEvent event)
  {
    if (event.getPhaseId() == PhaseId.APPLY_REQUEST_VALUES
        || event.getPhaseId() == PhaseId.UPDATE_MODEL_VALUES
        || event.getPhaseId() == PhaseId.INVOKE_APPLICATION)
    {
      checkCurrentTabDirtyState();
    }
  }

  @Override
  public void beforePhase(PhaseEvent event)
  {
  }

  @Override
  public PhaseId getPhaseId()
  {
    return PhaseId.ANY_PHASE;
  }

  public void checkCurrentTabDirtyState()
  {
    TabContext tabContext = TabContext.getCurrentInstance();
    if (tabContext == null || tabContext.getSelectedTabIndex() < 0)
    {
      return;
    }
    BindingContainer bc = BindingContext.getCurrent().getCurrentBindingsEntry();
    if (bc == null)
    {
      return;
    }
    DCBindingContainer pageTemplateBc =
      (DCBindingContainer) bc.get(PAGE_TEMPLATE_BINDING);
    if (pageTemplateBc == null)
    {
      return;
    }
    DCTaskFlowBinding tfb =
      (DCTaskFlowBinding) pageTemplateBc.get("r" + tabContext.getSelectedTabIndex());
    if (tfb == null || tfb.getExecutableBindings() == null ||
        tfb.getExecutableBindings().size() == 0)
    {
      return;
    }
    DCBindingContainer taskFlowBc =
      (DCBindingContainer) tfb.getExecutableBindings().get(0);
    DCDataControl dc = taskFlowBc.getDataControl();
    if (dc == null)
    {
      // no data control, we cannot detect pending changes
      return;
    }
    boolean isDirty = dc.isTransactionDirty() || dc.isTransactionModified();
    // Calling the convenience method markCurrentTabDirty adds the content area as a partial target,
    // causing any popups currently displayed to be hidden again.
    // Therefore retrieve the current tab instance and call setDirty directly.
    // tabContext.markCurrentTabDirty(isDirty);
    Tab tab = tabContext.getTabs().get(tabContext.getSelectedTabIndex());
    if (tab.isDirty() != isDirty)
    {
      sLog.debug("Setting dirty state of dynamic tab with index " +
                 tab.getIndex() + " to " + isDirty);
      tab.setDirty(isDirty);
      AdfFacesContext.getCurrentInstance().addPartialTarget(tabContext.getTabsNavigationPane());
    }
  }
}

Just copy and paste this code to create your own phase listener class, and register the phase listener in faces-config.xml like this (substituting your own class name):


<lifecycle>
<phase-listener>oracle.jheadstart.controller.jsf.listener.JhsPhaseListener</phase-listener>
</lifecycle>

And make sure that the taskflows that are accessed through the dynamic tabs have the data-control-scope element set to shared.
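
For reference, a minimal sketch of what that setting looks like in the task flow definition XML (the task flow name and default activity below are just illustrative placeholders):

<task-flow-definition id="my-taskflow">
  <default-activity>MyPage</default-activity>
  <data-control-scope>
    <shared/>
  </data-control-scope>
  <!-- view activities, control flow rules, etc. -->
</task-flow-definition>
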
That's all you need to do to enjoy this nice functionality.

Categories: Development

New release of Oracle IRM Wrapper version 1.4.0

Simon Thorpe - Mon, 2010-09-27 16:03

Yet another release of a highly useful tool in the Oracle IRM kit bag. The Oracle IRM Wrapper is a Java-based client which demonstrates some of the powerful ways in which you can build extensions on top of the Oracle IRM technology to extend the protection of files beyond the out-of-the-box features. The IRM Wrapper uses the IRM API to allow the encryption of any file, similar in functionality to PGP, but with the difference that the rights controlling the decryption of files come from the same powerful classification system that is used across the usual gamut of IRM files.

In this release support for existing sealed extensions has been added. This is a significant feature because it means that files wrapped by Oracle IRM Wrapper can be opened by the Oracle IRM desktop, and files sealed by the Oracle IRM desktop can be unwrapped by Oracle IRM Wrapper. In a mixed community of end users, where most have the Oracle IRM desktop installed but some do not (they may be on MacOS or Linux), no users need be excluded from workflows - they can all use the same sealed files! It is only necessary to add the Wrapper users to a special group assigned a role with unrestricted export rights.

Download this latest version from here.

  • NEW! Support for sealed extensions, e.g. .sdoc, .spdf
  • Installation scripts for easy installation on Windows and MacOS X
  • Written in 100% pure Java so runs on any Java-compatible operating system
  • Internationalized and localized into English, Japanese and (bad) schoolboy French
  • Right-click wrapping and unwrapping
  • Easily configure per-context drag-n-drop wrapping icons on your desktop
  • Automatically checks that you have the rights to wrap and unwrap files
  • Automatically select a default context
  • More robust configuration handling (ignores leading or trailing whitespace)

 

And a few screen shots of the tool running in Windows and Linux.

[Screenshot: IRM Wrapper running on Linux (IRM_Wrapper_On_Linux.png)]
[Screenshot: IRM Wrapper running on Windows (IRM_Wrapper_On_Windows.png)]

Security in the Enterprise 2.0 World: Conflicts of collaboration

Simon Thorpe - Mon, 2010-09-27 10:03

I was recently asked to be a guest author on CMSwire.com to comment on the challenges of security in the Enterprise 2.0 (E2.0) world. Having worked in both E2.0 and security, I have a good perspective.

As E2.0 brings web 2.0 into the enterprise, it runs directly into the issues of security, compliance and regulation. It's a big challenge and a big contradiction. The business wants to use all these amazing new ways to share content, but the same business also needs to ensure that only the right people can get access to it.

And What About the Cloud?

Then there is the cloud. Cloud, cloud, cloud, it's on every webcast, in every article. The cloud has many advantages. Why wouldn't you want to outsource all your costs of network management, storage, system administration? The cloud makes perfect sense but has one massive concern... security. Wouldn't it be nice if someone else could host your content, provide the search functionality, upgrade the systems, manage backups and the network access and yet you could have persistent control over the actual information itself?

Read the full article over on cmswire.com.

New: Reusable ADF Taskflows Generator!

JHeadstart - Sun, 2010-09-26 11:11

Last week I attended Oracle Open World 2010. I was pleased to see how much focus there was on the Oracle Application Development Framework. With about 40 sessions, ADF was a major topic at this conference, as it should be!
Overall, the ADF sessions were of high quality and well attended. The drawback of so many sessions is overlap; sometimes there were two or even three ADF sessions at the same time. For my session on JHeadstart, I had to compete with a presentation on functional patterns in ADF and an ADF hands-on session. Thanks to a bit of Twitter marketing and retweeting by JDeveloper, I nevertheless had about 35 people, while only 20 people had preregistered for my session.
This time, I slightly changed the way I presented and positioned JHeadstart. In previous sessions I usually demoed for about 50 minutes, hurrying through the intro and wrap-up slides. With earlier versions of ADF, my motto "seeing is believing" was enough, and the audience left mumbling things like "What you just showed me in 1 hour has taken me 6 months to build".
Now, ADF 11 is so impressive by itself that I noticed people somehow start to think there is less of a need for JHeadstart. And of course, you can perfectly well build ADF 11 applications without JHeadstart. But hey, don't you want to get things right the first time, and as fast as possible? Well, I do, and I am convinced that the above statement about saving 6 months is more true than ever with ADF 11. ADF 11 is immensely powerful when used in the proper way, following best practices, proven design patterns and advanced techniques. And this is the crux: despite more than 2000 pages of official documentation, and a couple of books on ADF, most of the information you need to really get it right is scattered around the web, in OTN articles and blogs of various ADF experts, or simply not yet documented.
It takes a couple of months before you have figured it all out. Unfortunately, the typical project manager does not give you that much time, so you don't get it right the first time, and you'll discover along the way that you should have chosen a different approach, applying other, more advanced techniques. And then you have a nasty choice to make: leave it as it is, a less functional, less flexible and costlier application to maintain, or go for a major refactoring.
Well, with JHeadstart you will get it right the first time! Which brings me to the title of this blog entry: I made a "major announcement" during my session, introducing the "Reusable ADF Taskflows Generator" (RATG). And the good news is, you don't have to wait for the next JHeadstart release to use it; it is already there in the current release! Only the name is slightly different: "JHeadstart Application Generator" (JAG). We chose the name JAG 8 years ago, when we launched JHeadstart, and ADF did not even exist. But RATG is probably a better name right now. Sure, JHeadstart still generates a complete and fully functional ADF web application, but the core of the application consists of highly reusable and highly configurable task flows.

The generated taskflows can be (re-)used

  • in a "traditional" tabbed menu (also generated with JHeadstart)
  • in the more modern "UI Shell with dynamic tabs". I demoed the excellent JHeadstart support for this pattern in the upcoming JHeadstart 11.1.1.3 release. (You can already use the pattern with the current release; it requires a bit of coding.)
  • in hand-built drag-and-drop pages, see the JHeadstart tutorial for an example
  • as a child in another parent region, minimizing duplication of UI logic. Again, see the JHeadstart tutorial for an example.
  • in combination with the ADF Human Workflow task list, deep-linking to the task-specific object instance within the taskflow
  • in pop up windows. And again, see the JHeadstart tutorial for an example.
  • as WebCenter portlets
  • in WebCenter Composer allowing the end user to add taskflows at runtime

To enable all these reuse cases, the generated taskflows are highly configurable. They can be configured
  • to start in create mode (displaying a new row)
  • to show one specific row (deep-linking)
  • to initially launch a summary page or a specific deep-link detail page
  • to hide action buttons (like Save, Cancel, Browse)
  • to hide the search region
  • to show in read-only mode

I also emphasized the concept of the generator templates, which allow you to customize anything you want. For example, you can easily customize the generator to add additional taskflow configuration options.
And if you don't care for 100% generation, you have the freedom to skip generation at any time, and proceed with the visual design editor in JDeveloper to finish off your pages.

I believe that this message of advanced design patterns, reuse, and superior productivity without compromising on flexibility was well understood by the crowd. And the demo, although shorter, was still impressive enough, showing off many of the built-in features that will save you additional months of development time. The crowd was "sold", as a couple of attendees said at the end, and as was made clear in tweets from respected ADF experts like Chris Muir and Wilfred van der Deijl!

Yes, OOW10 was inspiring and provided me with new energy to blog about JHeadstart.
OOW10 made me realize we should communicate more about the many smart ADF techniques used in JHeadstart. There are still way too many people not using it, unaware of all the good stuff. So, keep an eye on this blog over the next few weeks!

Slides of my presentation can be viewed and downloaded here.

Categories: Development

Generating A Gap-free Series Of Numbers — Not Always A Problem

David Aldridge - Fri, 2010-09-24 07:25
Well, it’s happened again. Someone has asked how to implement a requirement to generate a gap-free series of numbers, and a swarm of nay-sayers have descended on them to say (and here I paraphrase slightly) that this will kill system performance, that it’s rarely a valid requirement, that whoever wrote the requirement is an idiot […]
Categories: BI & Warehousing

Finding the Cost of an Index

Andries Hanekom - Thu, 2010-09-23 05:41
Building and maintaining an index structure can be expensive, and it can consume resources such as disk space, CPU, and I/O capacity.

According to the Oracle Database Performance Tuning Guide 10g Release 2 (10.2): "...each index maintained by an INSERT, DELETE, or UPDATE of the indexed keys requires about three times as much resource as the actual DML operation on the table. What this means is that if you INSERT into a table with three indexes, then it will be approximately 10 times slower than an INSERT into a table with no indexes". The arithmetic behind that estimate: one unit of work for the base insert plus roughly three units per index gives 1 + 3 × 3 = 10.

So let’s attempt to demonstrate this estimate. Let’s start out with one of the largest tables in the Oracle E-Business Suite, ONT.OE_ORDER_LINES_ALL, which stores sales order lines for the Order Management module. I have recreated the table and associated indexes from an 11.5.10.2 installation on an Oracle 10.2 XE database.

Some facts on my testing:
  • The ONT.OE_ORDER_LINES_ALL table contains 340 columns.
  • The standard indexes consist of 1 unique reverse-key index and 17 non-unique indexes; together they index 26 columns.
  • There are 14 custom non-unique indexes covering 35 columns (9 of them function based).
Test 1: Insert 20000 rows using an INSERT with a SELECT subquery.

With no indexes: Avg 2.5 Seconds.
With standard indexes: Avg 6 Seconds.
With standard and custom indexes: Avg 18 Seconds.

Test 2: Insert 20000 rows using a BULK COLLECT cursor and a FOR loop with an INSERT:

With standard indexes: Avg 8 Seconds.
With standard and custom indexes: Avg 22 Seconds.
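
For context, the two tests boil down to something like the sketch below; the source and target table names and the column list are simplified placeholders (the real test inserts into the recreated OE_ORDER_LINES_ALL with its 340 columns), so treat this purely as an illustration of the two insert styles:

-- Test 1: a single INSERT ... SELECT of 20000 rows, timed as one statement
INSERT INTO oe_order_lines_copy (line_id, header_id, ordered_quantity)
SELECT line_id, header_id, ordered_quantity
  FROM oe_order_lines_source
 WHERE ROWNUM <= 20000;

-- Test 2: BULK COLLECT into a collection, then insert row by row in a FOR loop
DECLARE
  TYPE t_lines IS TABLE OF oe_order_lines_source%ROWTYPE;
  l_lines t_lines;
BEGIN
  SELECT * BULK COLLECT INTO l_lines
    FROM oe_order_lines_source
   WHERE ROWNUM <= 20000;

  FOR i IN 1 .. l_lines.COUNT LOOP
    INSERT INTO oe_order_lines_copy (line_id, header_id, ordered_quantity)
    VALUES (l_lines(i).line_id, l_lines(i).header_id, l_lines(i).ordered_quantity);
  END LOOP;

  COMMIT;
END;
/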

I did not experience the 10 times slower estimate made by the Oracle Performance Tuning Guide, but the custom indexes did significantly degrade insert performance.

So what is custom indexing doing to your Oracle EBS performance?

Working Around an Internal Error Based on Source Mnemonics

David Aldridge - Thu, 2010-09-23 05:16
My frenzied hacking through the unexplored territory of Expression Filters was derailed (to mix my metaphors) by an internal error when using the Evaluate() function in a rather complex query that uses an inline view, analytic functions, etc. The following was extracted from the alert log: ORA-07445: exception encountered: core dump [kkqsCkLegalEqvExpCB()+199] [SIGSEGV] [Address not […]
Categories: BI & Warehousing

finally back!

Nuno Souto - Wed, 2010-09-22 07:31
A thousand apologies for the lack of news. Well, it's been a wild ride since the last post, way back in March! On the work front: We finally finished the new DR site. And it's all working fine. Not just that, though: a LOT more! If you go here, you'll see in the files area a presentation on how we do SAN-driven DR database replication. It works with SE Oracle, EE, or any other kind of Oracle.

xml:lang versus lang

Ramkumar Menon - Tue, 2010-09-21 13:58

I was following an email thread when I hit this interesting link.

http://www.w3.org/TR/i18n-html-tech-lang/#ri20040429.092928424

It talks about best practices for specifying the language of content served as HTML/XHTML/XML pages.
Quoting:

"For HTML use the lang attribute only, for XHTML 1.0 served as text/html use the lang and xml:lang attributes, and for XHTML served as XML use the xml:lang attribute only."

e.g.

1. HTML

<helloWorld lang="en-US"/>

2. XHTML served as text/html
<helloWorld lang="en-US" xml:lang="en-US"/>

3. XHTML served as XML
<helloWorld xml:lang="en-US"/>

Bottom line:
Browsers understand only "lang" for text/html content.
Don't use "lang" with XHTML 1.1; it's not in the spec. (It's okay in 1.0.)
