Feed aggregator

Changing Salary Basis

RameshKumar Shanmugam - Sun, 2007-10-07 23:37
An interesting scenario came up in one of my current projects with a customer who has been using Oracle HRMS for the past three years.
They wanted to change the salary basis from Monthly to Period (Biweekly). It is common for a company to change its salary basis from monthly to biweekly or bimonthly, but the interesting thing in this project is that they wanted to modify the historical salary information for all employees from Monthly to Biweekly.

The first thing that came to mind was the auditing issue we might run into, since we would be touching the most sensitive piece of employee data (SALARY). But the HR users clearly informed us they could provide the exact data that needed to be loaded.
Another advantage was that the customer is not using Oracle Payroll, so the Oracle HRMS system is not a transactional system.

We followed this approach:
  • Remove the salary proposals for the employees whose salary basis we need to modify.
  • Update the assignments with the new salary basis in Correction mode.
  • Reload the salary proposals with the same change date as the ones we unloaded.

We used WebADI for unloading the salary information, updating the assignments with the new salary basis, and loading the salary proposals for the new salary basis.
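
Before unloading, it helps to capture the existing proposals with their change dates so they can be reloaded against the same dates later. A rough query for that might look like the following; the table and column names are from memory of the HRMS schema (per_pay_proposals, per_all_assignments_f, per_pay_bases) and should be verified in your environment:

-- Sketch only: list current salary proposals with their salary basis and change date
SELECT asg.assignment_id,
       ppb.name              AS salary_basis,
       ppp.change_date,
       ppp.proposed_salary_n AS proposed_salary
  FROM per_pay_proposals     ppp,
       per_all_assignments_f asg,
       per_pay_bases         ppb
 WHERE ppp.assignment_id = asg.assignment_id
   AND asg.pay_basis_id  = ppb.pay_basis_id
   AND TRUNC(SYSDATE) BETWEEN asg.effective_start_date AND asg.effective_end_date
 ORDER BY asg.assignment_id, ppp.change_date;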

We used a WebADI custom integrator with the following APIs:

  • HR_MAINTAIN_PROPOSAL_API.DELETE_SALARY_PROPOSAL
  • HR_ASSIGNMENT_API.UPDATE_EMP_ASG_CRITERIA
  • HR_MAINTAIN_PROPOSAL_API.UPDATE_SALARY_PROPOSAL

Try this out!!!

Categories: APPS Blogs

Statement Cache causes OutOfMemoryError (JDBC)

Vlad Sadilovskiy - Sun, 2007-10-07 22:39

How efficient can parse once – execute many get in JDBC? The obvious recommendation to keep a PreparedStatement open as long as possible is not as easy to follow as it sounds. Testing a custom-built statement cache solution proved it not to be very scalable: increasing the number of concurrent connections or the size of the cache of open statements led to inevitable OutOfMemoryError exceptions.

I noticed during my testing, by fetching from tables of different “widths” and playing with the prefetch row count, that the wider the row or the higher the prefetch count, the faster my cache grew. I debugged my code for many hours checking whether I was closing result sets and looking for anything else very obvious. Nope, no luck – everything was correct. And yet the OOME was consistent.

In the end it appeared that the JDBC OracleStatement kept a reference to the so-called accessor arrays (something like a cache of prefetched row data) for the result set that had last been executed, even when that result set was already closed. The size of that data was proportional to the product of the fetched row width (in my tests, the table width) and the prefetch row count. I then tried JDBC implicit and explicit caching, and both showed precisely the same memory footprint.

I opened an SR and after a few weeks was advised by a nice Oracle TS lady that there is one workaround. It is to use the property FreeMemoryOnEnterImplicitCache, so instead of this:

conn = (OracleConnection) DriverManager.getConnection(url, user, password);

use this

// requires java.util.Properties and oracle.jdbc.pool.OracleDataSource
OracleDataSource ods = new OracleDataSource();
Properties props = new Properties();
props.setProperty("user", user);
props.setProperty("password", password);
// use a String value; java.util.Properties only exposes String values via getProperty()
props.setProperty("oracle.jdbc.FreeMemoryOnEnterImplicitCache", "true");
ods.setConnectionProperties(props);
ods.setURL(url);
conn = ods.getConnection();

During testing I didn’t find a way a custom or explicit cache with a key could be used effectively, especially if they are LRU caches (the latter appeared to be an LRU type of cache) – that residual data is going to get into the Old Generation before the statement is reused and the arrays are repopulated. I really hope the “bug” is going to be fixed in JDBC 10.2.0.4.

Metalink bug reference: 6082980


What sort of twisted fuck does something like this?

Fake Larry Ellison - Sun, 2007-10-07 14:05

Much love to dear reader John who sent in this snapshot of my defaced Wikipedia page (above). Look, people. I know some of you dislike my products. Maybe your Oracle database isn't working correctly. Maybe you don't understand our tech support personnel in Hong Kong or Nairobi or wherever. But that's not my fault. Oracle makes the best databases in the world and all of our tech support personnel take a rigorous course in speaking English. And anyway, nothing gives you a license to call me a Jew right in the subtitle of my Wikipedia page. It's antisemitism. It's hatred. So knock it off, will you? If you want to pick on somebody you should log onto Facebook and virtually knee your friends in the nuts.

"You're not going to believe this, but we're giving away servers now too."

Fake Larry Ellison - Fri, 2007-10-05 13:15

Peter "you don't always have to be right" Burrows over at Business Week has hit another grand slam with his newest article about Jonathan Schwartz and Sun Microsystems. The best quote actually comes at the end of the article when Jim Zemlin at the Linux Foundation talks about Sun's recent moves: "It's like back in high school, when I'd throw these big parties and I'd think I was so cool, only no one would come." Which actually says a lot about both Jonathan and Jim when you think about it. Anyway, my question for Schwartz is this: How the fuck are you supposed to make money when you just give everything away? Peter Burrows seems to think that we're living in some new kind of economy where you don't really have to sell anything to make any money. In other words, if you keep throwing free software to the crowd and jumping and dancing around shiny servers long enough, good things are bound to happen. Um, yeah.

Pete, I want you to do something for me, bro. Just humor me, okay? Surf on over to Google and search for "google servers." Then take a good look at those pictures. See how Google is duck-taping hard drives onto motherboards? See how they're using velcro to strap the motherboards to the racks? Those are Google's servers, dude. Those are what power the world's biggest search engine. Google does that to save money and make components hot-swappable. I'm not sure, but it might have something to do with their stock price these days. So, yeah. Not sure where that leaves Sun, especially when you consider that hundreds of companies are now building servers Google-style.

MERGE and Database Link Problem

Mike Moore - Fri, 2007-10-05 11:27
Using 10g R2 database.

When trying to do a MERGE where the target table is on the remote database and the source table is a sub-select on the local database, the following error was raised:

ORA-01008: not all variables bound

The MERGE statement in question had no bind variables.

Furthermore, if I change the target table to an identical local table it works just fine, so it's not a syntax problem.

I searched all over the web and all over technet and oracle.com but found only a few scant references to this problem. I was unable to find any restriction documented under DISTRIBUTED TRANSACTIONS in the Oracle documentation. I now believe that MERGE will only work over a DB Link when the target is on the local database and the source table is on the remote database.
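
For illustration, here is a minimal sketch of the direction that does work, with made-up names: emp_local is a local table, and emp_stage sits on the remote database reached through the database link remote_db.

-- Local target, remote source: this combination works
MERGE INTO emp_local t
USING (SELECT employee_id, salary FROM emp_stage@remote_db) s
ON (t.employee_id = s.employee_id)
WHEN MATCHED THEN
  UPDATE SET t.salary = s.salary
WHEN NOT MATCHED THEN
  INSERT (employee_id, salary) VALUES (s.employee_id, s.salary);
-- Reversing it (MERGE INTO emp_local@remote_db ... USING a local sub-select)
-- is the case that raised ORA-01008 for me.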

If you have information about this restriction, please leave a comment.

Shit for Brains is splashing into SaaS

Fake Larry Ellison - Fri, 2007-10-05 11:24

See here. Yes, I know this is old news. What makes it interesting now is that Microsoft is competing with a growing number of companies in this market, including SAP, Salesforce, and NetSuite. Don't laugh at the Microturds yet, though. When the tough get going, the going gets tough. I know SFB is just another butthead fighting over the SaaS table scraps, but Silverlight is actually pretty good. It's nothing to sneeze at anyway.

And I'll tell you this. The only thing worse than Google-envy is Salesforce.com-envy. Don't worry: Stevie B. will want Salesforce.com. When he starts making his little mating call and doing that thing with his tongue, we'll start dropping the hints that we're looking into buying Salesforce. It's our jack-the-price up strategy. We don't really want Salesforce, but we do want to fuck Microsoft. So we'll pretend to want them and plant a bunch of articles in the press and drop some outrageous figure like $10 billion. Then SFB will get all huffy and try to outbid us. It works every time. Just like it did with Red Hat and JBoss. And just like JBoss, Salesforce.com would drag Microsoft down. It would be a terrible buy for Microsoft. But that's Shit for Brains for you. I swear the dude's exactly like the teenage girls I see shopping at the mall. He's always buying a bunch of shit he doesn't really want or need.

Oracle Enterprise Content Management (ECM)

Bex Huff - Thu, 2007-10-04 19:49

This section of the blog contains articles about the Oracle suite of Enterprise Content Management applications. This includes Universal Content Management (UCM), Web Content Management (WCM), Universal Records Management (URM), and a little bit of Information Rights Management (IRM). I helped create several of these products, and thus am very opinionated about how they should be used... I also cover technologies and topics relevant to content management in general, such as enterprise search, and identity management.

Besides the articles in this section, you may also benefit from the following sources:

Primary Oracle ECM Sources
Oracle ECM Blogs
  • Fusion ECM: the official Oracle blog on content management, starring Billy Cripe, and occasionally Raoul Miller.
  • Oracle IRM: the official Oracle blog for Information Rights Management, starring Simon Thorpe.
  • Kyle's Blog On UCM: another Oracle "Best Practices" blog, starring Kyle Hatlestad.
  • ECM Alerts: Oracle ECM product marketing blog, for official news about events and new product launches
  • Content On Content Management: David Roe's blog on Oracle Content Management.
  • webmonkeymagic: mikeyc7m's tips and rants on web content management, Site Studio, SSPU, and the like.
  • John Sim's blog: web content management, and web design blog. Lots of Site Studio tips.
  • Jason Stortz's blog: IdocScript, Components, Site Studio, and general tips and tricks.
  • Ryan Sullivan's blog: the usual ECM stuff!
Additional Oracle ECM Resources

If you know of another notable Oracle ECM site, send me an email! I'd define "notable" as any "official" site, or a site that posts useful information at least once per month...


Categories: Fusion Middleware

Revenge of the Nerds coming to Xbox

Fake Larry Ellison - Thu, 2007-10-04 14:07

Doesn't it seem like Microsoft takes a hit every time Red Hat releases their numbers? It sure seems that way to me. Red Hat announces record profits and Microsoft's stock goes down. Red Hat announces that they've gained market share and Microsoft fires a couple of their vice presidents. Don't ask me why it's like that. When you really look at the numbers you realize Microsoft has nothing to fear. But they're still afraid. It's like the elephant jumping out of its skin every time it sees the mouse. This last time Microsoft tried something new. Shit for Brains said, well, look. If we release Halo 3 on the same week Red Hat announces quarterly earnings, we'll probably be okay. And they were. In fact, Halo 3 actually did much better than they thought it would. Which is kind of surprising when you consider that the game sucks ass.

So the new plan is to release an Xbox game every time Red Hat announces quarterly earnings. That way, according to Shit for Brains, Microsoft can weather the storm. So SFB has directed Bungie to start working on a new game. This one is going to be a lot like the Revenge of the Nerds movie, except it's going to star younger Silicon Valley greats. Gates and Jobs are in there, of course. So am I. Larry made it in, people. Microsoft sent a couple developers down here to look at pictures of me when I was younger (above and to right) so they could get me just right. The evil frat boys are going to be the gentlemen from the SEC.

And you probably won't believe this, but Shit for Brains actually has hair in the game. I tripped out the first time I saw his character. I didn't even recognize the dude. Seriously. But here's the best part. I bribed the Microsoft guys to put in this cheat code that makes Shit for Brains go postal. Once they put it in the game, all you'll have to do is just punch in the code and Stevie B. will start throwing chairs and shit. Priceless.

These Alinghi guys are toast

Fake Larry Ellison - Thu, 2007-10-04 12:49

I think Swiss drug lord Ernesto Bertarelli has finally figured out that he picked on the wrong dude. He’s used to fucking with poor people who sail toy boats in places like Dubai and Kennebunkport. He’s won so many times over the years that it probably never occurred to him that he shouldn’t challenge a guy like Larry Ellison. He thought I was just going to roll over when he proposed new rules that would completely destroy the America’s Cup. Um, yeah. Fat chance, crack head. We're going to shut you down. You can cry and kick and scream and shoot up all you want, but it's not going to make any difference. We've already won. You know how I know? Because articles like this one keep appearing and they're making you look real bad. Check out this one quote: "Making American billionaire Larry Ellison 'look like the good guy is hard to do,' yet Swiss billionaire Ernesto Bertarelli has accomplished that, said Gary Jobson." You can say that again, Gary. Listen, Ernie. If you raise the white flag and give up now I'll let you sail in the America's Cup in 2009. Seriously. This isn't a joke, dude. Because everybody knows that I don't just win. I make everybody else lose. It's a little philosophy of mine. Just saying. You can do you whatever you want, Ernie. Okay? I just wanted you to know before we go to court.

Automotive Parts Company in Thailand Implements Oracle Flow Manufacturing to Enable Lean Manufacturing

Chris Grillone - Thu, 2007-10-04 10:48
Flow has a beachhead overseas with niche markets, demonstrating the bowling pin effect to gain adoption by the early majority.

Workflow Background Process errors with ORA-00600 internal error code, arguments: [4414]

Madhu Thatamsetty - Thu, 2007-10-04 10:12
The Workflow Background Process concurrent request fails with the following error message when the request is submitted with the Order Management item type.

Symptom:
ORA-00600: internal error code, arguments: [4414], [12226], [1], [12226], [1], [], []
ORA-01403: no data found

The Workflow Background Process concurrent request is a Workflow Background Engine.

Oracle Validated Configuration RPM for OEL5

Renaps' Blog - Wed, 2007-10-03 21:30

In the Oracle® Database Installation Guide for 11g there was a mention of the Oracle Validated Configuration RPM.

When I first installed OEL5, the RPMs for that OS version had not yet been released.

But since Oracle has now released the oracle-validated RPMs for OEL5 on the Unbreakable Linux Network, I decided to give it a try.

I installed OEL5 with the default installation. I then registered my machine with the up2date utility.

Do not forget to register to the proper channel on the Unbreakable Linux Network, or the following error will occur:

[root@oracletest ~]# up2date oracle-validated

Fetching Obsoletes list for channel: el5_i386_latest…

Fetching rpm headers…
########################################

Name Version Rel
———————————————————-

The following packages you requested were not found:
oracle-validated

Once registered, I validated the channel I was registered to:

[root@oracletest db_1]# up2date --nox --show-channels
el5_i386_oracle
el5_i386_latest

Here is my installation output:

[root@oracletest ~]# up2date oracle-validated

Fetching Obsoletes list for channel: el5_i386_oracle…
########################################

Fetching Obsoletes list for channel: el5_i386_latest…

Fetching rpm headers…
########################################

Name Version Rel
———————————————————-
oracle-validated 1.0.0 3.el5 i386

Testing package set / solving RPM inter-dependencies…
########################################
compat-db-4.2.52-5.1.i386.r ########################## Done.
compat-gcc-34-3.4.6-4.i386. ########################## Done.
compat-gcc-34-c++-3.4.6-4.i ########################## Done.
compat-libstdc++-33-3.2.3-6 ########################## Done.
elfutils-libelf-devel-0.125 ########################## Done.
gcc-4.1.1-52.el5.2.i386.rpm ########################## Done.
gcc-c++-4.1.1-52.el5.2.i386 ########################## Done.
glibc-devel-2.5-12.i386.rpm ########################## Done.
glibc-headers-2.5-12.i386.r ########################## Done.
libXp-1.0.0-8.i386.rpm: ########################## Done.
libaio-devel-0.3.106-3.2.i3 ########################## Done.
libstdc++-devel-4.1.1-52.el ########################## Done.
oracle-validated-1.0.0-3.el ########################## Done.
sysstat-7.0.0-3.el5.i386.rp ########################## Done.
unixODBC-2.2.11-7.1.i386.rp ########################## Done.
unixODBC-devel-2.2.11-7.1.i ########################## Done.
cpp-4.1.1-52.el5.2.i386.rpm ########################## Done.
elfutils-libelf-devel-stati ########################## Done.
libgcc-4.1.1-52.el5.2.i386. ########################## Done.
libgomp-4.1.1-52.el5.2.i386 ########################## Done.
libstdc++-4.1.1-52.el5.2.i3 ########################## Done.
Preparing ########################################### [100%]

Installing…
1:libgcc ########################################### [100%]
2:libstdc++ ########################################### [100%]
3:libstdc++-devel ########################################### [100%]
4:unixODBC ########################################### [100%]
5:glibc-headers ########################################### [100%]
6:glibc-devel ########################################### [100%]
7:compat-gcc-34 ########################################### [100%]
8:compat-gcc-34-c++ ########################################### [100%]
9:unixODBC-devel ########################################### [100%]
10:compat-db ########################################### [100%]
11:compat-libstdc++-33 ########################################### [100%]
12:libgomp ########################################### [100%]
13:cpp ########################################### [100%]
14:gcc ########################################### [100%]
15:gcc-c++ ########################################### [100%]
16:sysstat ########################################### [100%]
17:libaio-devel ########################################### [100%]
18:libXp ########################################### [100%]
19:elfutils-libelf-devel ########################################### [100%]
20:oracle-validated ########################################### [100%]
21:elfutils-libelf-devel-s########################################### [100%]
The following packages were added to your selection to satisfy dependencies:

Name Version Release
————————————————————–
compat-db 4.2.52 5.1
compat-gcc-34 3.4.6 4
compat-gcc-34-c++ 3.4.6 4
compat-libstdc++-33 3.2.3 61
elfutils-libelf-devel 0.125 3.el5
gcc 4.1.1 52.el5.2
gcc-c++ 4.1.1 52.el5.2
glibc-devel 2.5 12
glibc-headers 2.5 12
libXp 1.0.0 8
libaio-devel 0.3.106 3.2
libstdc++-devel 4.1.1 52.el5.2
sysstat 7.0.0 3.el5
unixODBC 2.2.11 7.1
unixODBC-devel 2.2.11 7.1
cpp 4.1.1 52.el5.2
elfutils-libelf-devel-static 0.125 3.el5
libgcc 4.1.1 52.el5.2
libgomp 4.1.1 52.el5.2
libstdc++ 4.1.1 52.el5.2

The Oracle Validated Configuration RPM not only installed the necessary RPMs, it also created the oracle user and group, along with a couple of the file modifications required by the Oracle installation.

I was able to see my system configuration changes in this log file:

/etc/sysconfig/oracle-validated/results/orakernel.log

Creating oracle user passed

Verifying kernel parameters as per Oracle recommendations…
fs.file-max 327679
kernel.msgmni 2878
kernel.msgmax 65536 8192
kernel.sem 250 32000 100 142
kernel.shmmni 4096
kernel.shmall 268435456 3279547
kernel.sysrq 0 1
net.core.rmem_default 262144
net.core.rmem_max 2097152
net.core.wmem_default 262144
net.core.wmem_max 262144
fs.aio-max-nr 3145728
net.ipv4.ip_local_port_range 1024 65000
Setting kernel parameters as per oracle recommendations…
Altered file /etc/sysctl.conf
Original file backed up at /etc/sysctl.conf.orabackup
Verifying & setting of kernel parameters passed

Verifying oracle user OS limits as per Oracle recommendations…
oracle soft nofile 131072
oracle hard nofile 131072
oracle soft nproc 131072
oracle hard nproc 131072
oracle soft core unlimited
oracle hard core unlimited
oracle soft memlock 3500000
oracle hard memlock 3500000
Setting oracle user OS limits as per Oracle recommendations…
Altered file /etc/security/limits.conf
Original file backed up at /etc/security/limits.conf.orabackup
Verifying & setting of user limits passed

Verifying kernel boot parameters as per Oracle recommendations…
Setting kernel boot parameters as per Oracle recommendations…
Boot parameters will be effected on next reboot
Altered file /boot/grub/grub.conf
Original file backed up at /boot/grub/grub.conf.orabackup
Verifying & setting of boot parameters passed

Verifying module parameters as per Oracle recommendations…
Setting module parameters as per Oracle recommendations…
Altered file /etc/modprobe.conf
Original file backed up at /etc/modprobe.conf.orabackup
e1000 settings modified to include FlowControl=1
insmod /lib/modules/2.6.18-8.el5/kernel/drivers/char/hangcheck-timer.ko hangcheck_reboot=1
hangcheck-timer module settings modified to include hangcheck_reboot=1
Oct 2 12:01:35 oracletest ntpd[3529]: time reset -0.771217 s
Oct 2 12:05:38 oracletest ntpd[3529]: synchronized to LOCAL(0), stratum 10

Note that every modified file is backed up by the installation.

All this automation is very neat and saves some time. However, some manual work still needs to be done afterwards.
The following files were not modified by the oracle-validated-1.0.0-3.el5.i386 RPM:

–/etc/pam.d/login

–/etc/profile

connected as user oracle:

–.bash_profile

The "Setting Shell Limits for the oracle User" and "Configuring the oracle User's Environment" sections in the Oracle® Database Installation Guide recommend configuring those files.
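
For reference, the additions those sections call for are roughly the following (a sketch from memory of the installation guide; confirm the exact lines and values for your release).

In /etc/pam.d/login (enables per-user limits at login):

session    required     pam_limits.so

In /etc/profile (raises the shell limits for the oracle user):

if [ $USER = "oracle" ]; then
        if [ $SHELL = "/bin/ksh" ]; then
              ulimit -p 16384
              ulimit -n 65536
        else
              ulimit -u 16384 -n 65536
        fi
fi

In the oracle user's .bash_profile (default file-creation mask):

umask 022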

Also, the Oracle 11g installer failed its prerequisite checks for 2 kernel parameters.

Checking operating system requirements …
Checking kernel parameters
Checking for rmem_default=4194304; rmem_default=262144. Failed <<<<
Checking for rmem_max=4194304; rmem_max=2097152. Failed <<<<

I had to manually change those 2 parameters in the /etc/sysctl.conf file.
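
For reference, the fix amounts to raising those two values in /etc/sysctl.conf to what the installer expects, then reloading them with sysctl -p:

net.core.rmem_default = 4194304
net.core.rmem_max = 4194304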

References:

Wim Coekaerts Blog

Kevin Closson’s Oracle Blog

Oracle and Linux on OTN


Categories: DBA Blogs

The one constant

Mike Keith - Wed, 2007-10-03 10:44

I like the common adage that “The only constant is change”. Those of us in the software industry are not only familiar with this concept, we live and breathe it.

The software business has got to be one of the most dynamic of careers, and I say that with the utmost of pride and contempt.

It is unquestionably a fun ride when the technology is constantly changing and there is a continuous stream of new approaches and options for solving problems. Even the problem set itself is not fixed. The ideas and innovation that arise in our world are among the factors that keep some of us doing it. Few days are the same as any other, and new opportunities for exploration continually present themselves.

But sometimes it is just plain detestable. You successfully ace a technology that you really enjoy and completely understand, and the next thing you know it is not cool anymore and the industry has moved on (Smalltalkers are cringing right now.) Huge applications consisting of thousands and sometimes even millions of lines of code are sometimes rewritten simply because they need to be in a more contemporary programming language in order to be maintained and extended. The cost of changing software trends is astronomical, and sometimes the reasons are more about change for change’s sake rather than any real need to do so. It’s almost like the industry feeds off of it, nourishing itself at the often huge expense of governments and large corporations.

Software designers and developers have a rather daunting task keeping up with all of the technologies that they are supposed to be using. In fact, when people learn that I am in the software industry, and the commitment that entails in terms of the extra-curricular time investment for learning and staying current, they look at me with that “funny-looking-animal-in-the-zoo” kind of curiosity. I just know they are wondering what would possess someone to want to be a part of something so demanding and consuming, and frankly, sometimes I wonder the same thing myself. Then I remember that I seem to be better at starting kitchen fires than putting them out, I could never be trusted to keep the peace if I can’t keep my own kids from squabbling, and would not even be very good at garbage pickup since I can’t remember to do it on a weekly basis at my home. I guess software was my last resort.

The problem is that our very jobs that demand we learn new things can actually be a hindrance to us doing it. The day-to-day crises that must be dealt with “ASAP” are always more important than learning something that is not immediately relevant. So one of the challenges is actually setting aside the time to investigate and try new technologies, and still be able to attend (or play) an occasional hockey game or participate in whatever other non-technical interest you may have. Sometimes I feel like this business has cheated me out of having hobbies.

I have found that one of the best ways to solve this problem and keep current with technology is to attend conferences. It puts you in a milieu that removes the distractions that are often a barrier to learning something new. It offers you a chance to do something that software people must do, but are often prevented from doing. Just as importantly, they give you the chance to learn these things from people that already really know them well, so you don’t have to waste time wading through resources trying to find the worthwhile ones. Add to this the fact that the experience is interactive and the value of going to conferences becomes pretty obvious.

Of course, not all conferences are created equal. I speak at a lot of them, and when I talk to attendees I find that they tend to favor the ones that are more technical and have a culture of speakers mixing in and socializing with the attendees. In fact, for those that are looking to attend a conference in the near future I would recommend the Colorado Software Summit, which is always held in late October. It is a week-long retreat in the mountains of Colorado, where you can learn about many of the new technologies and then throughout the week talk to the experts directly, or ask questions that you may have about the things that you have learned. I have spoken at this conference for the last 3 or 4 years and highly recommend it.

As far as change goes… I really love it. And I hate it. Enough said.

Package 'CSI_ITEM_INSTANCE_PVT_W' Invalid after 11.5.10.2 Upgrade

Madhu Thatamsetty - Wed, 2007-10-03 03:49
Symptom: If the existing .pls in your environment has the following version and you have just upgraded to Oracle Apps 11.5.10.2, then read further.

$ strings -a ./patch/115/sql/csiviiws.pls | grep -i Header
REM $Header: csiviiws.pls 115.4.1159.2 2006/03/10 10:04:55 abhgupta ship $
/* $Header: csiviiws.pls 115.4.1159.2 2006/03/10 10:04:55 abhgupta ship $ */

procedure construct_inst_header_rec(p_inst_id

Oracle Fights for its Share of Manufacturing - Friday, June 15, 2007 (AMR)

Chris Grillone - Tue, 2007-10-02 16:45
49% of the more than 200 respondents in these industries claim to be using Oracle for manufacturing execution, while only 36% indicate they’ve deployed SAP in this capacity.

Balance Adjustment

RameshKumar Shanmugam - Sat, 2007-09-29 16:29
When we process the payroll, there are times when users might have deducted excess tax or other deductions, so we may need to correct the balances to get the correct YTD figures for year-end reporting.

There are two methods of doing it:
by defining a Balance Adjustment element, or, if we know the exact amount to be corrected, by adjusting it using the Adjust Balance form. The latter is very simple and is an easy option for adjusting the balance.

Follow the steps below to adjust the balance using the Adjust Balance form when you know the exact amount to be adjusted:

Fast Path > Adjust Balance > select the assignment for which you need to adjust the balance
Set the effective date
  • Select the element whose pay value or input value needs to be adjusted
  • Select the consolidation set to control further post-run processing
  • Check the Costing check box if you need the balance adjustment to be costed
  • Save your work

Try this out!!!

Categories: APPS Blogs

The Battle for Relevance Rages On

Ken Pulverman - Fri, 2007-09-28 19:20
I’m following up my opening rant with some news from the front. There were two very interesting developments in the battle for your attention this week that play exactly into what I waxed on about last week.

First of all, the offensive move. ThePudding.com made some fanfare in the news this week. Their new solution offers people the opportunity to make calls over the Internet for free if you don’t mind big brother listening in and attempting to serve you relevant ads at the same time. Early reviews are that the creep factor is more funny than creepy as a computer tries to pick out key words of wisdom from your blather and serves up ads that pertain to what you are rapping about. I’ve signed up for the beta, so as soon as they let me in, I’ll be able to tell you more from a first-hand perspective.

The interesting thing about ThePudding is not what they are, but what they might be. They don’t plan to be a consumer-facing site in the medium term, but rather an offering for companies that provide VoIP/Internet telephony solutions. Free calls may soon be coming to Skype, Yahoo and others via a Pudding back end that enables these companies to make enough from advertising to more than cover the costs of Ma Bell’s connection charges and the thickened chocolate milk that makes it all possible. So are you willing to let a computer eavesdrop if its owner promises not to store anything you’ve said? These guys are hoping ThePudding is sweet enough to convince you to do so.

Next, the defensive move. If you tuned into NPR’s On the Media on Sunday, you were treated to a great story on Ad Block Plus, a plug-in for Firefox that blocks almost all the ads you see while surfing the Internet. This one I did get to try. It works so well it is scary. Why scary? You can just imagine an advertising World War III where advertisers come up with ever more ingenious ways to serve us ads while an underground resistance figures out ways to block every new approach. The problem I foresee is that we are going to be the innocent civilians caught in the middle of this crossfire.

Ad Block Plus works by keeping a list of all of the sites that serve ads to websites and blocks those feeds, much like a virus list. Virus is an apt analogy for at least 3/4ths of the stuff I see. The folk at Internet hegemon, Google, must be seriously pissed. You buy DoubleClick for a king’s ransom as part of your attempt to sew up the entire digital marketing landscape and some joker with a .org figures out a way to block your ad feed. Sure, you can sue him to the center of the earth, but the cat is now out of the bag.

In the On the Media show, The NPR crew broached the topic of whether or not blocking ads was a form of stealing. The logic went that it costs real money to serve up these sites. If you block off their revenue source, you are essentially taking something for free – their offering – be it news, maps, reviews, or e-mail. On the surface, I agree that we do have a real problem if we bite the Internet hands that feed us. On the other hand, I’ve had to look at the same horribly annoying flashing low interest rate banner above my Yahoo e-mail inbox for the better part of a year. Yahoo must be so proud of themselves knowing that I have a mortgage. Now if they just knew when it was time for me to refinance or how unlikely I was to click on a chartreuse banner that looks like it was designed by some ex-used car salesmen. Part of me says it serves them right. Violate our trust long enough and if we are given a way to opt out, we do. Just ask the churn department at your wireless carrier.

This particular battle will be fascinating to watch. I predict détente will come in the form of permission-based marketing on sites we frequent fueled by a whole series of new companies that get much much better at behavioral targeting. Truly relevant ads? What a novel idea. It’s so novel, it’s novious.

Perspective

Mary Ann Davidson - Thu, 2007-09-27 17:42

A few errata corrections on my last blog entry before I go any further: 1) my mother insists that I was closer to 3, not 4 years of age when I threw a fit and demanded (and got) my own library card* and 2) the name of the devastating fire in Ketchum, Idaho last month was the Castle Rock Fire, not the Castle Creek Fire. I can only plead brain fuzziness based on the amount of smoke I inhaled over the two weeks it was burning.

 

Now that the fire is over, I have a newfound appreciation for the beautiful, clean, cool and pristine air in Idaho. For the two weeks the 46,000-acre Castle Rock Fire was burning, dense smoke and haze clouded the sky to the point that I could see neither the ski runs at Sun Valley (just a hoot and holler from my house) nor the Boulder Mountains to the north of me. You will never know how beautiful clean air can be until you've lived through several weeks of smoke, ash, and debris falling around you. It's like living through the Apocalypse, particularly the experience of looking across the valley and seeing fire burn down the ridge so fast that it was as if it were being fanned by the Devil himself.

 

The fire has been hard on people, particularly businesses. It caused a cancellation of a lot of activities over Labor Day that were not only a lot of fun but that the local merchants depended on to bring in revenue. We are now officially in what is known as "slack season": hardly anybody comes here in fall, though heaven knows why. Fishing, hiking, camping and hunting are all great Idaho fall activities. I once went on a beautiful 6-mile hike to a pristine alpine lake and I did not see a single other soul during the hike, other than my hiking buddy and my dog. (Try that in California.) So come on up to Sun Valley, y'all. If there is anything better than terrific natural beauty, it's terrific natural beauty with no crowds.

 

My other change in perspective (besides a newfound appreciation for clean air) is the way I feel about firefighters. You hear all the time - and most of us believe it - that firefighters are heroes. I never doubted that. But it's one thing to think that in the abstract and another to have experienced it firsthand. I got to see a lot of them in Sun Valley in August, since we had 1600 firefighters in a town of 3000 people. My house was never in any real danger, for which I am grateful. Furthermore, there was no loss of life and no structural damage to anybody's houses or businesses. The critters even made out OK, too, though there are a lot of hungry bears wandering around looking for chow.  Pretty much every place in town now has a "thanks, firefighters" sign or banner displayed prominently. We really mean it: thank you, wildland firefighters, you saved our town.

 

Now that the fire is 100% contained, a lot of locals are saying that in the long run it is going to be healthy for the forests that we had a burn; in fact, we were overdue for one. The forest will recover; the wildlife will thrive (so long as cheat grass doesn't crowd out the sage that is a key habitat for many species). It's only been a couple of months since the Trail Creek Fire burned one of my favorite hikes in Sun Valley, but you can already see a sheen of green on the mountains and some new seedlings sprouting up through the blackened detritus. Forests recover, and a periodic burn gets rid of the underbrush that can otherwise build up and contribute to "crown fires" where the fire spreads not along the forest floor, but leaps from treetop to treetop. The difference between a disaster and a blessing in Ketchum was the skill of the firefighters, the grace of God and also the passage and perspective of time.

 

When you think about it, it's amazing how much of what you see really is based on your perspective. Perspective can include where you are as you look at The Big Picture, where you are in the picture and who else is in the picture.

 

I was reminded of this recently in a discussion with a state government struggling with open records issues. States keep a lot of data on their citizens to support, among other things, taxation (personal and property) and licensing (driver's, hunting, fishing, construction, "concealed carry"  permits and more). The question they were asking was how much of this data should be on-line and searchable?

 

I did not offer to write, critique or edit their state's open records laws, but I did point out to one of their legislators that a lot of concerns over privacy might depend very much on who is accessing the data and why they might want to access the data.

 

Most people are OK with some data being collected relevant to a transaction between parties. For example, to get a concealed carry permit in the state of Idaho, I needed to give the state some information so they could do a background check on me. I also expect the state of Idaho to keep records about the fact they gave me a concealed carry permit (so that a law enforcement official can independently verify that I have a valid license and not a fake one, for example).

 

Many people who provide information for a service or transaction become unhappy if that data is accessed or sold or otherwise used for some purpose they didn't agree to. If you are dealing with a government entity like a state, you expect that when you give information to the state (that they need for things like raising taxes and providing services to citizens) they are going to use it for those "stated" purposes (no pun intended) and not for three thousand other things. I would not expect that the Idaho gun permit database would be searchable, say, by a gun ownership organization (or, conversely, by an anti-gun ownership organization). "Taint none of their goldurn business."

 

When data suppliers' expectations on who accesses what and for what purpose do not match with data collectors' uses, it's a problem. For example, if you've ordered books from Amazon.com, the next time you log on, you might get a friendly message that says something like, "Hi, <Your Name>! Based on your last few book purchases, we think you might be interested in the following books..." (In my case, the book list will be on military history or the Hawaiian language.) Many people might think: "Wow! How cool that they know me and can recommend books I might like!"

 

Now imagine, if you will, the exact same message coming from the FBI**: "Hi, <Your Name>, based on your last five book purchases, we think you might be interested in ..." Many people would be outraged to think that the FBI (or another law enforcement entity) was looking at their book purchases. But, and here is the kicker: it is exactly the same data! Whether the above message is a "service" or an "invasion of privacy" depends on who had access to "my" data, who is doing the data analysis and why they are looking at the data. It's all about perspective.

 

In the private sector, these discussions take place in the realm of what a company collects, what they use the data for and who they can share the data with. Most companies have privacy policies that forbid collecting data for one stated purpose and using it or sharing it for another purpose that the "collectee" did not agree to, for example.

 

However, if data is public, or a public record, especially if it is Internet accessible and searchable, potentially anybody can access and analyze the data, for any purpose. My advice to the state was that they ought to hire someone to review the data they already have and figure out all the ways that data access could be misused by the evil-minded, like spear-phishers or stalkers. That is the place to start a legitimate public discussion about "open records;" specifically, how much the citizens of the state want to trade off convenience for privacy, and how much citizen data should be searchable and accessible by someone other than the state agency that collected it. It's all about perspective. 

 

People's perspectives on data collection can also be colored by the accuracy of the data that is kept. If someone made a mistake in doing a background check on me, that led to my being denied a carry permit, I should be able to get that "mistake" corrected. Otherwise, someone down the pike may find that I was once "denied" a carry permit and deny me something else. It's the second law of thermodynamics applied to data: entropy always increases. If data is inaccurate, inaccurate decisions will flow from use of that data.

 

Along those lines, there is another issue I've opined about a couple of times, and I'd be done with it except the topic keeps rearing its head in different forums, and that is the idea of "automated vulnerability testing your way to security." As much as I think that the use of automated tools can help deliver more security-worthy software and have said so, there are too many discussions of late dominated by the perspective that vendors are all evil, lazy and greedy slugs (ELGSs) that happily ship products with tons of security holes in them. The perspective of people who subscribe to the ELGS theory is that vendors must be forced to submit their code to multiple, random, unvetted tools to "validate" their security.

 

A differing perspective (mine) is that these tools are useful only to the extent they are used and work in development: they can't "prove" security, and vendors should license and use the tools that work well for them in development. The idea, after all, is to make products better, not have public "rat out" sessions after products have shipped. And I feel really strongly that anybody wanting to run a third party tool against a product should have to prove the tool works properly and accurately. It's only fair.

 

In fact, they ought to have to prove that the tool is accurate before it's used, otherwise the results may "taint" a vendor (just like a mistake in my background check could color people's perceptions of me forever if it is not corrected).

 

The idea of "burden of proof" is important for a couple of reasons. One of them is that we are still in the nascent stages of tool usage (if it were easy, everyone would already do it) and some of the tools don't work so well. The last thing industry needs when we are trying to promote and encourage tool usage in development is every customer, or every country, deciding that IT products need to be submitted to 348 different "tool tests."  Aside from annoyance and inefficiency, accepting tools' "vulnerability alarms" without question goes against the grain of how a lot of other things are supposed to and generally do work. For example:



  • People who are put on trial are assumed to be innocent until proven guilty. Hardly anybody gets thrown in jail for 25 years to life without someone (a prosecutor) validating the evidence, presenting it in court, and defending it (from defense challenges). The burden of proof in our court system is on the prosecution, and the standard of conviction is "beyond a reasonable doubt." (A 90% "false alarm rate" of evidence presented in a prosecution would not be "reasonable doubt.")


  • Journalists are expected to check facts before reporting that, for example, a celebrity was caught in a love nest with another celebrity. Furthermore, if journalists get the news wrong, they generally print a retraction or correction. (Of course, at that point, reputational damage may not be "retractable," which is one reason why good journalists are rigorous about fact checking.)


  • Gossip is called "gossip" and not "impartial fact exchange" because so much of it is not true and potentially hurtful or damaging. This is why your mom tells you not to do it. Mom is right, as she almost always is.

 

The ugly issue in the promise of automated vulnerability tools is that there is no standard for these tools: what they find, how well they find it. Which means anybody can create a tool, point it at a product, claim to find problems, and all the work is on the product vendor to prove their product does not have a problem instead of on a tools vendor to prove the tool is accurate. And let me tell you, having to go through hundreds or thousands of "potential vulnerability fire alarms"  to validate every one makes security worse, not better, because it takes a scarce resource (a security-aware developer) and puts him/her to work chasing phantoms instead of improving products.

 

Some tools vendors push the "evil vendor" perspective because to the extent they can convince IT vendors' customers that their products need to be scanned, they create fear, uncertainty, and doubt (FUD) and thus increase the demand for their scanning product. Can't blame them for that: it's capitalism at work. That said, I take the perspective that these tools offer promise, but they need to be validated to prove that they are accurate before anyone can be expected to use them. Only if they are accurate are they useful. If they are inaccurate, they are useless and harmful.  (Putting it differently, if IT vendors need to "prove" their products are secure, why shouldn't tools vendors need to "prove" their tools are accurate before anybody would even think of using them? What's sauce for the goose is sauce for the gander.)

 

Lastly, some of these tools are so "chattery" and "noisy" that it really is like gossip and, like gossip, the damage is done even if there is a retraction. A tool that has a lot of false alarms taints a vendor's brand just like tabloid journalists can print innuendo that damages someone's reputation unjustly. I shouldn't have to prove the coding equivalent of "I did not spend the weekend in a love nest with a celebrity,"  the vulnerability tool maker should have to prove that I did.

 

(Aside: one of my own amazingly wonderful ethical hacking team members just improved one of our internally-developed tools, a protocol fuzzer lovingly called BitRotter, to do more pernicious and nefarious code breaking in a good cause. He's just rechristened it ByteRotter. Thanks, Jeff.)

 

Clearly, my perspective isn't unbiased, because I work for an IT vendor. I believe in better security, doing more in secure development, and in industry "raising the bar" through better development practice. Automation (and automated tools) can definitely help.

 

I also believe in accuracy and fairness as basic principles of any business undertaking, because it is only when the haze and smoke and debris is swept away, that you can see - really see - what is there.

 

I climbed to the top of the ridge behind my house a few days after the Castle Rock Fire was declared 100% contained. The fall rains had come to help soothe the burns, and the winds that a few days prior had been fanning the fire were now whisking the few remaining puffs of smoke out of the valley. It's about a 600 foot climb through sage and scrub, but when I got to the top of the ridge, I could see the Boulder Mountains in the distance, and the ski runs at Sun Valley, still green and beautiful, and the aspens beginning to change color on the mountains that ring the Wood River Valley.

 

After two weeks of hellish smoke and ash and debris, I could see rightly - 'ike pono, as the Hawaiians say - for miles and miles and miles. There is no better perspective than that.

 

 

*  Mom also noted it was far from the last fit I would throw. What can I say? I learned useful business skills early.

 

** Disclaimer: I know several people who work for the FBI. They have difficult jobs that the rest of us don't understand and take for granted. I am quite sure they have more important things to do than check up on my latest book-buying binge. Ergo, no slight to them was intended nor should be inferred.

 

For more information:

 

Book of the week: I just read another book by James Hornfischer: Ship of Ghosts, about the USS Houston, sunk at the Battle of Sunda Strait in March 1942. Many of the survivors were forced to build the Burma Railway. An amazing story of survival and heroism. Definitely worth a read.

 

http://www.amazon.com/exec/obidos/ASIN/0553803905/bookstorenow69-20

 

About the Castle Rock Fire:

 

http://www.inciweb.org/incident/952/

 

http://www.sunvalleyonline.com/news/article.asp?ID_Article=3894

Transition period

Fadi Hasweh - Wed, 2007-09-26 04:46
Dear all,

I would like to inform you that I am currently in a transition period. I moved to a new country and am exploring my options there, and that’s why I am not able to post or reply on the forums and email list, but everything will be back to normal within a month.
Wish me luck
Fadi

What do you want to learn about?

Anthony Rayner - Mon, 2007-09-24 07:14
In the interest of trying to write good posts that the APEX community will find useful, I have added a new 'poll' feature to my blog. My first poll is entitled simply, 'What would you like to read more about on my blog?' and I have suggested a few titles to pick from. You can cast your vote by selecting from the check boxes on the right of the page, below 'About Me'.

If you have any other suggestions or areas you wish to learn about or understand better, then please add a comment to this post.

Look forward to hearing from you!
Anthony.

Categories: Development
