Feed aggregator

ISA Consulting Bought by E&Y

Look Smarter Than You Are - Wed, 2011-03-16 01:52
And so the consulting company acquisitions continue.  I haven't written about this in over a year, mostly because these acquisition entries take so many hours to research (cry me a river, Edward), so let's bury the lead by first covering all the major acquisitions that have occurred since my last entry:


November 24, 2009: PWC acquires Paragon
Those in the Oracle EPM areas in Europe & Asia knew of Paragon.  With close to 100 employees, they were a significant player in the UK, Turkey, and Singapore markets.  It's not known how many of Paragon's employees made the transition to PWC, but press releases seem to reflect around 40.


March 29, 2010: Perficient acquires Kerdock
Kerdock was a major, long-standing Oracle BI/EPM vendor dating back to roughly 2002.  Based out of Houston, they had close to 65 employees at their peak.  When they were bought last year by Perficient (a publicly traded company - NASDAQ: PRFT - with about 1,400 employees), they had roughly 45 employees and about $8MM in annual revenue.  They were bought for $6MM (of which $3.4MM was in cash and $2.6MM in PRFT stock).


May 4, 2010: Idhasoft acquires TLC Technologies
TLC is a long-time Oracle EPM partner based out of Pennsylvania.  Though they dated back to the late 90's, they were never that large.  Last year, a controlling interest in TLC was acquired by Idhasoft (through their Prism Informatica subsidiary) for an undisclosed sum.


Mid-2010: Edgewater acquires Meridian
If you hadn't heard of Meridian when Edgewater acquired them, you weren't alone.  They were only a few years old (and pretty small), but they had begun developing a reputation as a Hyperion Strategic Finance implementer able to compete with the focused expertise of BlueStone.  We'll never know if they would have fulfilled that promise, though, because Edgewater acquired them so early on.  They did have several former Alcar executives (Alcar being the company whose product became HSF) on their leadership team, including Alcar's former head of services, Ricardo Rasche, so their acquisition was significant.


August 31, 2010: E&Y acquires Global Analytics
Global Analytics, as you may recall, bought Narratus (the former "Data into Action") a couple of years ago, and in 2010 they were gobbled up themselves.  Largely through the strengths of Hyperion installation expert Bill Beach, Global Analytics had developed a reputation in the Hyperion infrastructure world.  For a time, they were one of only five companies (interRel was one of the others) with a significant infrastructure practice around Hyperion, which included subcontracting to other, larger global systems integrators.  They had several areas outside of Hyperion, and my guess is that's why E&Y bought them in 2010.  The small size of their Hyperion practice doesn't seem like it would have warranted E&Y's attention.  Though maybe this should have been a predictor of the acquisition of ISA?


October 21, 2010: IBM acquires Clarity
In my opinion, this was the most significant acquisition in the Oracle EPM, Hyperion, and Essbase world in 2010.  Clarity Systems out of Canada (same place my high school girlfriend lived, by the way) was the first substantial partner to build a pre-packaged budgeting solution on top of Essbase, one that pre-dated Hyperion Planning by years.  Originally a consulting partner at Arbor, Clarity eventually turned their spreadsheet-based Essbase front-end into a full-featured financial planning, consolidation, and reporting product.  What was once a fairly pleasant working relationship got contentious for a number of reasons, including alleged licensing violations and what later became direct competition between Clarity and Hyperion's own Planning and Financial Management products.  As Clarity began to score some competitive wins over Hyperion at companies like Southwest Airlines and Alcon Labs, the relationship took a turn for the downright hostile.


Eventually, Clarity started integrating with non-Hyperion products as they continued their expansion.  Interestingly, when IBM bought them last year, IBM made no secret of their intention to kill off most of the Clarity suite (including the planning and financial consolidation functionality).  This actually makes complete sense, since they already have the Cognos and TM1 products doing virtually the same things.  So why did they acquire them?  Consultant bodies to implement BI/EPM at IBM's consulting clients?  Clarity's client list?  Just to eliminate a competitor?  None of the above.  Apparently, IBM noticed a weakness in their XBRL reporting, and one component of Clarity handled this functionality.  Seems like overkill to me, but then I'm not a company the size of IBM.


Throughout 2010: Palladium founders leave to form other firms
As disastrous as the Hyperion/Arbor merger was back in 1998, there are many who feel that the merger of Balanced Scorecard Collaborative, Painted Word, and ThinkFast into Palladium was even worse.  While I'm not one to judge, it has definitely been true that Palladium has been bleeding talent (in the Hyperion/EPM world, at least) since its founding.  The last 15 months have been particularly harsh, with three major group departures:
  • Painted Word executives including Scot MacGillivray, Jim Leavitt, Chris Boulanger, and Peter Graham all left to found Cervello.  All of these people were founders and/or executives at Painted Word when it became part of Palladium.  They stuck it out for a few years and then left as a group to create Cervello which seems to be doing Oracle BI and EPM consulting.  I can't vouch for that personally, because I haven't run into them at all, but their departure from Palladium was definitely a blow.
  • Tom Phelps left Palladium to start up ClearLine Group.  Tom Phelps was the original founder of the company that later became ThinkFast (one of the three components of Palladium).  Tom and his brother, Marty, founded a company that appears to be doing Oracle EPM consulting (but again, like Cervello, I haven't run into them yet).  With Tom Phelps and the Painted Word executives gone, the only founders of the component companies still at Palladium are the Balanced Scorecard guys.
  • Palladium Pace team members including Dean Tarpley, Michael Wright, Carolyn Sieben, and a few others left to join Alvarez & Marsal in August 2010.  The Pace product hadn't been selling anywhere near what its creators expected, and this was the final nail in the product's coffin.  While Pace is still mentioned on Palladium's website, it doesn't seem that there's anyone left at Palladium still working on it.  Palladium had been shopping around for a buyer for its Pace business unit for a while, so it's unclear whether Palladium sold the developers to Alvarez or the developers were simply hired en masse.  Since there wasn't any sort of "predatory work practices" lawsuit, I'm concluding that it was a purchase of the talent and that Alvarez didn't want Pace at all.


March 15, 2011: Ernst & Young acquires ISA
Well, I'd love to point to a press release on this, but there isn't one simply because it's not been announced yet. [Editor's Note: it is now public.  Scroll to the end of the story for more.]  Normally, I wouldn't do a blog entry on this until it was official, but this is the least stealthy acquisition in history.  I have heard about it from no fewer than three sources at three different companies, and since offers have already been extended to the employees that are going to get them at ISA Consulting, the affected people already know.  Keep watching Ernst & Young and ISA's news pages and I'm sure something will be up in the next week or two.


ISA is based out of Pennsylvania and is a very large player in the Oracle BI and EPM space.  Though they work with other products, ISA is still considered by many to be primarily a Hyperion partner.  Based on what I've been told, E&Y is acquiring ISA primarily for their consulting expertise.  While they're letting almost all of the sales and back-office staff go (Mitch Rubin and Cliff Matthews being notable exceptions), almost all of the consultants seem to be getting offers to join E&Y.  The partners at ISA do seem to be coming on as partners, or something close to it, at E&Y.


Even though E&Y is one of the 10 largest privately held companies in the USA, this is a significant acquisition because ISA does appear to have well over 100 people focused on BI, EPM, and data warehousing.  Whether they end up putting ISA in the BI & Data Warehousing group or in financial transformation (or split them between the two), this acquisition will significantly increase the number of individuals in those areas.  If E&Y does manage to hold on to the talent from ISA, they will be able to compete much more directly with Deloitte on the BI & EPM front.


I haven't heard terms of the acquisition, but since E&Y doesn't need ISA's client list or sales expertise but rather just wants the consulting bodies, the dollars are presumably based on a multiple of EBITDA. Based on other similar deals in the last year, I expect the multiple is 6.5 times 12-month EBITDA (give or take a factor of 1.5).  If anyone knows any different, by all means, either shoot me an e-mail (I'll keep you anonymous) or post it in the comments to this entry.
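To put rough numbers on that multiple: a sketch of the arithmetic, with an invented EBITDA figure purely for illustration (ISA's actual financials were never disclosed).

```python
# Back-of-the-envelope deal pricing: price ~ multiple x trailing-12-month
# EBITDA, where the multiple is 6.5 give or take 1.5 (i.e., 5.0 to 8.0).
# The $2MM EBITDA below is a made-up placeholder, not ISA's real number.

def implied_price_range(ebitda, mid_multiple=6.5, spread=1.5):
    """Return (low, high) implied purchase price for a trailing EBITDA."""
    return ((mid_multiple - spread) * ebitda, (mid_multiple + spread) * ebitda)

low, high = implied_price_range(2_000_000)  # hypothetical $2MM EBITDA
print(f"${low:,.0f} to ${high:,.0f}")  # $10,000,000 to $16,000,000
```

So a consulting firm with $2MM in trailing EBITDA would, by this rule of thumb, fetch somewhere between $10MM and $16MM.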


Who's Next?
If you go way back to my posting from January 5, 2009, I offered up this list of potential targets for acquisition: 
One could speculate that it might be interRel, PII, Kerdock, Global Analytics, US-Analytics, Analytic Vision, HCG, TopDown, or even the Hyperion arm of Palladium, but it could just as likely be some other tiny Hyperion vendor that's not on anyone's radar screen right now. Heck, it might even expand beyond the consulting world to one of the Hyperion software partners like Applied OLAP or Star Analytics.

I then went on to say that interRel could be removed from the list.  Well, I was right on Kerdock, Global Analytics, and the Hyperion arm of Palladium, so that leaves PII, US-Analytics, Analytic Vision, HCG, TopDown, Applied OLAP, and Star Analytics.  I guess I would add MarketSphere to that list too, even though they're obviously in areas beyond Oracle EPM.  While many of these companies are too small to attract the attention of Deloitte, IBM, E&Y, and Oracle, don't be shocked if one or more of them is gobbled up in the next year by an off-shore consulting firm looking to fill in the EPM/BI gaps in their offerings.


It's now almost 2AM and I have to present to the HUG group in Minneapolis in a few hours, so I'm going to post and then sleep.  If I've stated anything incorrectly above, feel free to comment and please assume I wasn't trying to be malicious.  It's just been a long day and this entry (essay?) was almost 1,800 words.


UPDATE April 5, 2011: E&Y Officially Buys ISA Consulting
It took until a week into April, but E&Y finalized the ISA deal and announced it publicly.  The press release states that ISA had 130 employees (I'd speculated 100+) and that financial terms were not disclosed.  Read more about it here.
Categories: BI & Warehousing

Runtime error ORA-01031: insufficient privileges

Rob van Wijk - Tue, 2011-03-15 16:13
After a new version of software was installed in production, the end users reported a runtime error: ORA-01031: insufficient privileges, when selecting from a view. The developers of the code were investigating the problem and half way through, they asked me to have a look at the problem. I saw a function from schema3, which was used in a view in schema2, which was used by schema1. I had just ...

Anonymous exposes sensitive bank emails

Simon Thorpe - Mon, 2011-03-14 03:46

 

As expected for quite a while, emails purporting to reveal alleged naughtiness at a major bank have been released today. A bank spokesman says "We are confident that his extravagant assertions are untrue".

The BBC report concludes...  "Firms are increasingly concerned about the prospect of disgruntled staff taking caches of sensitive e-mails with them when they leave, said Rami Habal, of security firm Proofpoint.

"You can't do anything about people copying the content," he said.

But firms can put measures in place, such as revoking encryption keys, which means stolen e-mails become unreadable, he added."

Actually, there is something you can do to guard against copying. While traditional encryption lets authorised recipients make unprotected copies long before you revoke the keys, Oracle IRM provides encryption AND guards against unprotected copies being made. Recipients can be authorised to save protected copies, and cut-and-paste within the scope of a protected workflow or email thread - but can be prevented from saving unprotected copies or pasting to unprotected files and emails. 

The IRM audit trail would also help track down attempts to open the protected emails and documents by unauthorised individuals within or beyond your perimeter.

 

Personalized News Recommendations

Khanderao Kand - Sun, 2011-03-13 19:53
A couple of days back, on March 10, Barron's reported that "NYTimes.com Adds Recommendation Feature". Back in Nov 2010, the MyBantu-powered "Personalized News Recommendations" feature was launched for Samachar, the largest news portal about India. This personalized news recommendation, one of the first of its kind, not only increased visitor (reader) traffic to the Samachar site but also resulted in readers spending more time reading these personalized articles.  more ....

Explain this

Oracle WTF - Sat, 2011-03-12 09:04

On the subject of cryptic OTN posts, this one has to get an honorary mention as well:

explain this

hi,

write query to find out order detail of oder_date 2 year before (sorry i forget exact question)

No solutions so far.

Make Me One With Everything

Oracle WTF - Sat, 2011-03-12 08:52

Seen on OTN Forums recently (part of a question entitled "HTML not working in PL/SQL block", so I suppose we were warned):

l_col VARCHAR2(30) := to_number(to_char(to_date('01-feb-2011','dd-mon-yyyy'),'dd'));

So the string '01-feb-2011' becomes first a date, then a string again, then a number, before being assigned to a string variable. Much more interesting than boring old

l_col VARCHAR2(30) := extract (day from date '2011-02-01');

Or even,

l_col VARCHAR2(30) := '1';

IRM Item Codes – what are they for?

Simon Thorpe - Fri, 2011-03-11 07:51

 


A number of colleagues have been asking about IRM item codes recently - what are they for, when are they useful, how can you control them to meet some customer requirements? This is quite a big topic, but this article provides a few answers.

An item code is part of the metadata of every sealed document - unless you define a custom metadata model. The item code is defined when a file is sealed, and usually defaults to a timestamp/filename combination.

This time/name combo tends to make item codes unique for each new document, but actually item codes are not necessarily unique, as will become clear shortly.

In most scenarios, item codes are not relevant to the evaluation of a user's rights - the context name is the critical piece of metadata, as a user typically has a role that grants access to an entire classification of information regardless of item code. This is key to the simplicity and manageability of the Oracle IRM solution.

Item codes are occasionally exposed to users in the UI, but most users probably never notice and never care. Nevertheless, here is one example of where you can see an item code - when you hover the mouse pointer over a sealed file.

As you see, the item code for this freshly created file combines a timestamp with the file name.

But what are item codes for?

The first benefit of item codes is that they enable you to manage exceptions to the policy defined for a context. Thus, I might have access to all "oracle - internal" files except for 2011_03_11 13:33:29 Board Minutes.sdocx.

This simple mechanism enables Oracle IRM to provide file-by-file control where appropriate, whilst offering the scalability and manageability of classification-based control for the majority of users and content. You really don't want to be managing each file individually, but never say never.

Item codes can also be used for the opposite effect - to include a file in a user's rights when their role would ordinarily deny access. So, you can assign a role that allows access only to specified item codes. For example, my role might say that I have access to precisely one file - the one shown above.

So how are item codes set?

In the vast majority of scenarios, item codes are set automatically as part of the sealing process. The sealing API uses the timestamp and filename as shown, and the user need not even realise that this has happened. This automatically creates item codes that are for all practical purposes unique - and that are also intelligible to users who might want to refer to them when viewing or assigning rights in the management UI.
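As a sketch only: the default timestamp-plus-filename scheme could be modeled as below. The exact format string is inferred from the tooltip example shown earlier in this post ("2011_03_11 13:33:29 Board Minutes.sdocx"); the real sealing API may format it differently.

```python
# Hypothetical reconstruction of the default item-code scheme: timestamp
# followed by the original file name. Not the actual Oracle IRM sealing API.
from datetime import datetime

def default_item_code(filename, sealed_at):
    """Build an item code in the "YYYY_MM_DD HH:MM:SS filename" style."""
    return sealed_at.strftime("%Y_%m_%d %H:%M:%S") + " " + filename

code = default_item_code("Board Minutes.sdocx", datetime(2011, 3, 11, 13, 33, 29))
print(code)  # 2011_03_11 13:33:29 Board Minutes.sdocx
```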

It is also possible for suitably authorised users and applications to set the item code manually or programmatically if required.

Setting the item code manually using the IRM Desktop

The manual process is a simple extension of the sealing task. An authorised user can select the Advanced... sealing option, and will see a dialog that offers the option to specify the item code.


 

To see this option, the user's role needs the Set Item Code right - you don't want most users to give any thought at all to item codes, so by default the option is hidden.

Setting the item code programmatically

A more common scenario is that an application controls the item code programmatically. For example, a document management system that seals documents as part of a workflow might set the item code to match the document's unique identifier in its repository. This offers the option to tie IRM rights evaluation directly to the security model defined in the document management system. Again, the sealing application needs to be authorised to Set Item Code.

The Payslip Scenario

To give a concrete example of how item codes might be used in a real-world scenario, consider a Human Resources workflow such as payslips. The goal might be to allow the HR team to have access to all payslips, but each employee to have access only to their own.

To enable this, you might have an IRM classification called Payslips. The HR team have a role in the normal way that allows access to all payslips. However, each employee would have an Item Reader role that only allows them to access files that have a particular item code - and that item code might match the employee's payroll number. So, employee number 123123123 would have access to items with that code. This shows why item codes are not necessarily unique - you can deliberately set the same code on many files for ease of administration.

The employees might have the right to unseal or print their payslip, so the solution acts as a secure delivery mechanism that allows payslips to be distributed via corporate email without any fear that they might be accessed by IT administrators, or forwarded accidentally to anyone other than the intended recipient.

All that remains is to ensure that as each user's payslip is sealed, it is assigned the correct item code - something that is easily managed by a simple IRM sealing application. Each month, an employee's payslip is sealed with the same item code, so you do not need to keep amending the list of items that the user has access to - they have access to all documents that carry their employee code.
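The rights evaluation described above can be sketched as a toy model - this is NOT the Oracle IRM API, just an illustration of how context-wide roles and item-code-scoped roles combine, and of why item codes need not be unique (every payslip for one employee carries that employee's payroll number).

```python
# Toy model of IRM-style rights evaluation: a user may open a document if a
# role covers the whole classification (context), or if an item-level role
# grants the document's specific item code.

def can_open(user, doc):
    if doc["context"] in user.get("context_roles", set()):
        return True  # e.g. the HR team's role over the entire Payslips context
    return doc["item_code"] in user.get("item_roles", set())

hr_member = {"context_roles": {"Payslips"}}
employee = {"item_roles": {"123123123"}}  # Item Reader for their payroll number

march_payslip = {"context": "Payslips", "item_code": "123123123"}
someone_elses = {"context": "Payslips", "item_code": "999999999"}

print(can_open(hr_member, someone_elses))  # True: HR sees all payslips
print(can_open(employee, march_payslip))   # True: matches their payroll number
print(can_open(employee, someone_elses))   # False: item code doesn't match
```

Because next month's payslip is sealed with the same item code, the employee's rights never need amending - exactly the administrative simplification the article describes.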

 

Hospital fined $1m for Patient Data Breach

Simon Thorpe - Thu, 2011-03-10 22:14

 

As an illustration of the potential cost of accidental breaches, the US Dept of Health and Human Services recently fined a hospital $1m for losing documents relating to some of its patients. Allegedly, the documents were left on the subway by a hospital employee.

For incidents in the UK, several local government bodies have been fined between £60k and £100k. Evidently, the watchdogs are taking an increasingly firm position.

 

GUI or not GUI

alt.oracle - Thu, 2011-03-10 20:01
One of the longest and loudest controversies in the DBA world is that of the graphical user interface vs command line.  Some of the opinions sound like this…

“GUIs are for newbies who don’t know what they’re doing.”
“Why should I learn all the commands – there’s already a tool to do that.”
“GUIs are too slow.”
“Learning the command line takes too long.”
“I don’t need to learn a bunch of commands that I’ll never use – I just want to get my job done.”

My own feelings about this go back to my early days as a DBA.  I had this supervisor who was an absolute wizard when it came to Enterprise Manager.  Now, we're talking the early OEM that came with Oracle version 8.0, here.  Ancient stuff.  If it could be done with OEM, this guy could "git 'er done".  One day, though, some kind of devastating emergency happened.  As a newbie, I wasn't always trusted to handle the big issues, so I went to the supervisor and told him the situation. 

“Hey boss, we need to do so-and-so.” 
“Oh,” says Boss, “I don’t know how to do that with Enterprise Manager.” 
“Um,” I says, “I don’t think you *can* do that with Enterprise Manager.” 
“Oh,” says Boss, “Then what do we do?”

I remember the look of defeat on his face.  He was a nice guy, he wanted to help, he was responsible to help, but since Oracle hadn’t written that particular ability into his GUI tool, he had no idea as to how to do it.  It made an impression on me.  I decided then and there - that wasn’t going to be me.  I made a commitment that lasted for years – I will not use GUI tools.  No matter how much longer it takes me to do the job, with looking up commands and all, I will abstain from the evil of the GUI.  And so I did.

As a result, I learned the command line.  I REALLY learned the command line.  SQL*Plus was my home.  Not only did I learn a ton of data dictionary views by heart, over time, I sort of developed a “feel” for syntax even if I didn’t know it.  I could kinda intuit what might be in a certain v$ view or I could guess what the columns of a particular dba_* view should be.  It was and is incredibly useful and I don’t regret it.  I wrote and saved my little scripts to do things.  But, over time, I started to look down on my peers who used GUI tools, inwardly thinking they really couldn’t hack it from the command line.  You obviously don’t say something like that, but you joke about it, etc, just to let them know.  It probably didn’t help matters that in the ultimate GUI vs command line deathmatch, Windows vs Linux, I was (and am) squarely on the Linux side.

What started to change me was, ironically, Enterprise Manager.  Although I didn’t use it, I’d kept up with OEM, watching it get, for the most part, better and better.  But when 10g was released, it was like OEM had a bar mitzvah, sweet sixteen and a coming-out party all in one.  Re-christened as Grid/Database Control, you could do dang near EVERYTHING with OEM now.  OEM was finally a comprehensive tool.  It was so comprehensive, that it started to shake my “GUIs are for losers” mentality.  I thought, I could really do some damage with this OEM thing (in a good way).  I started to think in terms of what would be more efficient, OEM or command line, for different situations.  Command line was still winning in my mind, but not by as much as before.

The thing that finally “brought balance to the force” for me was a quote I read by a well-known Oracle consultant/author/blogger guy.  If I said his name, you’d probably recognize it.  I read something of his where he was consulting for a client and said this, almost verbatim, “I knew their DBAs were incompetent because they were using Enterprise Manager.”  Whoa.  Now it’s true that I didn’t want to be like my old boss, unable to do anything without a GUI, but I sure didn’t want to be like this arrogant bastard either.  Besides that, I had seen enough of Grid/Database Control to know that his reasoning was crap.

In the end, the command line versus GUI war boils down to a few principles for me.  A good DBA needs to be efficient.  If you’re more efficient using a GUI than command line, then go for it.  If, on the other hand, the only reason you use a GUI is that you’re just too lazy to learn the commands, then you get what you deserve.    I’m still heavily command line oriented, but, in truth, I know there are instances where it would just be faster to use a GUI tool.  Take, for instance, performance tuning.  Everybody has their own way of doing it, but Grid/Database Control really does a good job of pulling a lot of different metrics together.  It would take a lot of scripts to pull that much information into one place.  It’s not for everyone, but it shouldn’t just be written off without a second thought.  And when you decide which one's "faster", you have to take into consideration the amount of time it took for you to come up with that whiz-bang script of yours.

In the end, I think everyone should aspire to learn how to leverage the command line.  It’s powerful, open ended, versatile and doesn’t tie you down to any particular toolset.  A GUI will always be limited by its programming.  If the programmer didn't dream it, you probably can't do it.  But the point is to get the job done.  If Enterprise Manager helps you bust out your super ninja DBA skillz, I won’t stop you.

And if you're still a hardcore command liner, I'll try to change your mind next time.  What if you could make your own GUI?  Hmm?
Categories: DBA Blogs

Collaborative Filtering Vs Personal Preferences Based Recommendations

Khanderao Kand - Thu, 2011-03-10 04:19
From my other blogs:

http://www.mybantu.com/blog/2011/03/10/collaborative-vs-personalized-recommendations/

Lessons From OpenId, Cardspace and Facebook Connect

Mark Wilcox - Wed, 2011-03-09 23:12

Teach and Listen
(c) denise carbonell

I think Johannes Ernst summarized pretty well what happened in a broad sense in regards to OpenId, Cardspace and Facebook Connect.

However, I'm more interested in the lessons we can take away from this.

First - "Apple Lesson" - if user-centric identity is going to happen, it's going to require not only technology but also a strong marketing campaign. I'm calling this the "Apple Lesson" because it's very similar to how the Apple iPad found success where the earlier tablet market had not. The iPad is not only a very good technology product but was also backed by a very good marketing plan. I know most people do not want to think about marketing here - but the fact is that nobody could really articulate why user-centric identity mattered in a way that the average person cared about.

Second - "Facebook Lesson" - Facebook Connect solves a number of interesting problems in a way that is easy for both consumers and service providers. For a consumer, it's simple to log in without any redirects. And while Facebook isn't perfect on privacy, no other major consumer-focused service on the Internet provides as much control over sharing identity information. From a developer perspective, it is very easy to implement the SSO and fetch other identity information (if the user has given permission). This could only happen because a major company decided to make a singular, focused effort to make it happen.

Third - "Developers Lesson" - the Facebook Social Graph API is by far the simplest API for accessing identity information, which is another reason you're seeing such rapid growth in Facebook-enabled Websites. By using a combination of URLs and JavaScript, a single HTML page now gives a developer writing Web applications remarkable power. For example, it doesn't get much simpler than "http://api.facebook.com/mewilcox" for accessing identity. And while I can't yet share too much publicly about the specifics, the social graph API had a profound impact on me in designing our next-generation APIs. 

Posted via email from Virtual Identity Dialogue

Buzz Around Non-Relational DBs

Khanderao Kand - Tue, 2011-03-08 18:34
Reposting from my other blog http://texploration.wordpress.com/2011/03/09/buzz-around-nonrelational-db/


Last Saturday we (GITPRO – Global Indian Tech Professionals Association) arranged a tech talk on NoSQL (non-relational, actually) databases and scaling Hadoop. It was very well attended. In the general introduction session, many attendees described their interest in Hadoop and NoSQL databases. It was nice to see a good-sized crowd sacrificing their Saturday evening to attend this informative session, and more surprising still that many of them were already users of these technologies.

We at MyBantu are using MongoDB, which is a document-oriented database. We store XML documents (converted to BSON inside MongoDB), and queries use a scripting language for conditions. Another alternative in this class is CouchDB, which is more Web-like and offers REST-based access. Other famous non-relational databases (popularly called NoSQL) include, of course, Hadoop and Cassandra. Both are Apache projects with a few very good showcase implementations. However, when Digg recently had problems while using Cassandra, Cassandra got a bad name, which is not entirely fair. Anyway, Hadoop and its database, HBase, are generating more buzz. It was interesting news when Facebook moved their messaging system from Cassandra to HBase - interesting especially because Cassandra originally came from engineers at Facebook, who used it for their Inbox search. Some interesting work on Hadoop is happening at Facebook: they are the original contributors of Hive, a warehousing add-on targeted at bringing SQL-style querying to Hadoop. While MapReduce databases created a lot of buzz around NoSQL, it is notable that Hive's query language is SQL-like; so when folks say NoSQL, what they really mean is non-relational databases. Another warehousing-related add-on to Hadoop is Pig (Apache Pig), which originally came out of Yahoo.

Anyway, there is interestingly rapid development happening in this space, driven largely by the huge volumes of user-generated data handled by social networking giants like Facebook, Zynga, and LinkedIn. But the original credit for this concept goes to Google, whose BigTable and MapReduce papers inspired much of this work.
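To make the MapReduce idea mentioned above concrete, here is a minimal word count in plain Python - the canonical example from Google's paper and the Hadoop tutorials, stripped of all the distribution machinery.

```python
# Minimal map/reduce word count: the map phase emits (word, 1) pairs, the
# reduce phase sums the counts per word. Hadoop runs the same two phases
# across a cluster; here both run in one process purely for illustration.
from collections import defaultdict

def map_phase(docs):
    for doc in docs:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

counts = reduce_phase(map_phase(["Hadoop and HBase", "hadoop and Hive"]))
print(counts)  # {'hadoop': 2, 'and': 2, 'hbase': 1, 'hive': 1}
```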
