Re: Selection criteria for an in-memory analysis solution

From: joel garry <joel-garry_at_home.com>
Date: Wed, 12 May 2010 09:52:13 -0700 (PDT)
Message-ID: <36a17352-54c2-492f-9ad9-16a62ce5fe67_at_s4g2000prh.googlegroups.com>



On May 12, 4:13 am, JTP PR <janinepellet..._at_yahoo.ca> wrote:
>

> Because reporting data is potentially extracted from a source system
> or a data warehouse and then loaded into memory, data latency can be a
> concern. Front-line workers in a customer service center, for example,
> need near-real-time, highly granular (detailed) data. If an in-memory
> tool contains last week’s product inventory data, it’s probably not of
> use to customer service reps. Thus, the suitability of an in-memory
> tool and the success of the deployment may hinge on the degree to
> which the solution can automate scheduled incremental data loads.
> One of the criticisms of some in-memory analysis tools is their lack
> of incremental load.  This means that whenever a data refresh is
> required, the entire data set needs to be refreshed rather than just
> the changed or new transactions.  This increases load times and means
> that refreshes cannot be frequent enough to enable near-real-time
> reporting.  This is not the case with Yellowfin’s in-memory
> technology.
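
For concreteness, the full-refresh versus incremental-load distinction
described above amounts to roughly the following sketch (Python, using a
hypothetical "orders" table with a "last_modified" timestamp column; this
is a generic illustration of the technique, not any particular vendor's
implementation):

import sqlite3

def full_refresh(conn):
    # Full refresh: re-read every row on every cycle, changed or not.
    rows = conn.execute("SELECT id, amount, last_modified FROM orders")
    return {r[0]: r for r in rows}

def incremental_load(conn, cache, high_water_mark):
    # Incremental load: fetch only rows modified since the last watermark,
    # merge them into the existing in-memory cache, advance the watermark.
    rows = conn.execute(
        "SELECT id, amount, last_modified FROM orders WHERE last_modified > ?",
        (high_water_mark,),
    ).fetchall()
    for r in rows:
        cache[r[0]] = r
    if rows:
        high_water_mark = max(r[2] for r in rows)
    return cache, high_water_mark

The point being made is that the incremental path touches only rows newer
than the watermark, so refreshes can run much more often without
re-reading the whole source table.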

I'm having a bit of trouble visualizing a scenario where online customer service workers really need up-to-the-minute BI data. Could you give a more specific example? Everything you've posted so far screams in my memory "STAY THE HECK AWAY FROM MY SERVER!"

jg

--
_at_home.com is bogus.
“Popular votes on budgets is not a good thing to do.”
http://www.signonsandiego.com/news/2010/may/12/residents-at-workshop-vote-by-click/