Feed aggregator

BI tools for Data Driven Culture

Nilesh Jethwa - Wed, 2017-07-05 12:05

Modern-day technology has brought about new business practices that have made industries not only more efficient but more competitive as well. People these days live in what is known as a data-driven environment, where the business culture is run not by instinct or luck but by honest-to-goodness facts.

There was a time when business intelligence tools served the purpose of providing industry insights, information on consumer behavior, and even consumption patterns. These days, they do more than just provide tools for effective pre-operational business analysis.

BI tools are currently at the heart of many organizations, which use the technology to make better operating decisions. And what was once information reserved for the higher ranks is now accessible to all members of the staff, especially those working on the front lines. It is with this small change that businesses in a data-driven culture can truly differentiate the customer experience.

Some of the most important things business owners need to do are to identify which market forces will play a role in their success and to focus on which economic components will help them achieve their goals. Given that some components of modern commerce are unpredictable, BI tools help even out the playing field.

BI tools consist of a series of technologies that make it simpler and more efficient for the companies that use them to gather and analyze data. They can cover everything from sales metrics to consumer spending habits, and can even provide accurate measurements of the effectiveness of an operating strategy.

Read more at http://www.infocaptor.com/dashboard/what-is-data-driven-culture-and-how-can-bi-tools-help

In Memoriam

Jonathan Lewis - Wed, 2017-07-05 11:19

My mother died a few weeks ago after a couple of months in terminal care. One of my tasks while she was in care was to go through all her paperwork and while doing so I discovered a couple of stories from her past that she had typed out (remember type-writers?) about 30 years ago. I typed them up on my laptop and printed several copies to hand out for people to read at the tea-party we held for her old friends – of all ages, ranging from 15 to 99 – after the funeral; this seemed to give them a lot of pleasure and they found them so interesting that I decided to share them with a larger audience. So here’s the story, written by my mother in 1983, of her evacuation experience at the start of the 2nd world war when she was a little over 14 years old.

The Summer of 1939. Reminiscences of Dorothy Kathleen Lewis (1925 – 2017)

There had been a lot of talk about a war coming. Adolf Hitler and his armies had marched into Austria and were threatening Poland. We had all been issued with gas masks – just in case we would need them – and emergency plans had been made to evacuate all children from the big cities.

During the school holidays I was taken by my parents, with my sister, to my mother’s home village of Llangeitho in Cardiganshire. My mother had a cousin who was a retired school teacher and it was arranged with Auntie Jane that if war broke out Peggy and I would be sent to stay with her. I don’t think we were very pleased with the arrangement because to us she was very old-fashioned, not a bit like our mother. We ended our holiday and went back to London to wait for the school term to begin.

On the 1st September we heard that all children from our school whose parents wanted them to be evacuated should assemble at the school gates with a small suitcase and their gas masks. As we had already been told we were going to Llangeitho if the war broke out we stood and watched all our friends walking in crocodile fashion down the street and mothers and fathers crying watching them go. It was a very sad day, but I wished I was going with them. I didn’t like the idea of staying with Auntie Jane. None of these children knew where they were going, just somewhere in the countryside for safety, and they didn’t know who would be looking after them.

Well, on the morning of 3rd September Neville Chamberlain, our prime minister, spoke on the wireless (we now call it a radio) to say that we were at war with Germany. Immediately the sirens went and everyone ran to the shelters. My parents, Peggy, and I went to Baker Street Station, which has very deep platforms. There were hundreds of people with the same thing on their minds. We all took our gas masks with us. After a short time the all-clear went. My father sent a telegram to Auntie Jane to say Peggy and I would be leaving London on the train at 9:25pm that night. Trains did not travel as fast as they do today and we were due to arrive at Pont Llanio Station at 7:30am on Monday morning. Peggy’s friend and her mother (an Italian lady who did not speak very good English) were coming too, as was one of the young people from the village who was working in London.

Paddington Station had very dim lights and when we got on the train there were no lights at all. After a little while we children began to feel a bit less afraid and started to tell ghost stories and play memory games. It was fun going to the toilet on the train because there were people sitting in the corridor and so was their luggage. We could not see them and I don’t think we really tried – it was all a game. We were supposed to be sleeping, but we were too excited for that. When it came time to eat our sandwiches we had to taste them before we knew what we were eating. Can you imagine being in a train without any lights, and there were no lights in the streets or houses or on the station platforms that we passed. Names of stations had already been removed in case the country was invaded by the enemy. The belief was that the enemy would not know where he was if there were no road signs etc. No-one thought about them using maps and compasses as they would now. [ed: 1983]

We eventually arrived in a town called Carmarthen where we had to change trains and take a slow train to Pont Llanio where a car would meet us. Our train from Paddington was very late arriving and the slow train had gone. Someone telephoned Pont Llanio station to say we would be late and to send the car back. The train from Carmarthen was a very slow one and my father used to say “you could get out of the train and pick flowers on the bank and get back into the train again”. It really was very slow and chugged its way along the line. We arrived at last in Pont Llanio and then in Llangeitho after a journey of 16 hours. [ed: 4:30 to 5:00 hours driving time, now; 6 hours by public transport] I am sure we must have looked very dirty and untidy. The trains in those days were steam and there would be plenty of coal smuts flying around.

I did not think Auntie Jane would be very pleased to see us and I was soon to find out that I had thought rightly. The first thing she did was to take up the stair carpet in case we wore it out. I don’t know how she thought we would do that because once we came down in the morning we were not allowed to go back upstairs again until we went to bed. [ed: if you’ve read “Carrie’s War” you may recognise the behaviour]  She also did not know that children eat quite a lot too. For breakfast Auntie Jane would boil an egg and cut it in half, so Peggy and I had half each. And the same for our dinner, we would have two small potatoes – and this was before rationing and shortage of food. We had a lot of friends in the village and if it was not for them asking us out to tea and/or supper we would have been very hungry. Peggy went to school in the village, but I was too old [ed: at 14 yrs 4 months] and had nothing to do all day, but soon found a baby I could take out in the pram and that meant I would be asked if I would like a piece of cake and a drink. After a few weeks and a number of letters home things got a little better because my mother was sending parcels of food to Auntie Jane. I don’t know what arrangements were made money-wise, because we were not Government evacuees Auntie Jane would not have been paid by the authorities to keep us.

One of the things we used to do together with two of our friends was to help the local butcher clean out his slaughter-house after he had killed a beast. This meant he then asked us to supper in his old farm-house with a huge Inglenook fireplace. Another of my mother’s friends used to have us in early for a meal and say “don’t tell Auntie Jane or she will not give you anything else to eat”. I often think back on those days and wonder why she was so mean. She had never married and had children, but being a teacher I would have expected her to be more tolerant.

In December of 1939 Peggy wrote a letter home which was full of complaint and left it somewhere where Auntie Jane found it, and this letter was sent to my parents with a letter from Auntie Jane asking that we be sent back to London. A lot of the people in the village were very surprised that she should think to send us back to London when there were air-raids (these had not started at that time). People were saying we would be going home to be killed, but as for me I would rather take that chance than be left in Llangeitho.

Going back to London wasn’t much fun – the school was closed so once again we were at a loose end. We stayed in London over Christmas and again the government started evacuating school children and in February we joined a group who were leaving London – this time as London School Evacuees. We were sent to Buckingham to a family with a little girl of 2 years. This seemed to be alright and we went to school in the afternoons whilst the local children went to school in the mornings. It got rather uncomfortable there after a while because the man of the house, aged 24, lost his job (I don’t know why) and there were a lot of arguments in the house. His wife did not make herself look smart and he started to pay too much attention to me. Again a letter home that it was time we left there and one morning my father arrived and said: “pack your bags, you’re coming home”. What joy!

I don’t have much memory about this part of being an evacuee except to say I was relieved to be out of that house and back in the safety of my family. Whilst we were in Buckingham there had been bombing in London and Peggy and I were taken to see some of the damage that had been done. I think this was to frighten us so that we would be willing to go away again. I certainly did not like the sirens going and having to stop what we were doing and go to the shelter[1]. Once again we were on the move and this time I have a very detailed memory of the events.

We were assembled at the school playground all with our cases and gas masks – worried children and even more worried parents and teachers. No one knew where we were going except that we all piled into a double-decker bus. Lots of tears this time because we knew that people were being killed and injured. Would we see our parents again? What was going to happen to us if they were killed? Where would we go, who would look after us? Questions, questions!

We were taken to Marylebone station and put on an underground train. Where was it going? What were mum and dad doing now; were they still blowing their noses? We were not so bothered because we still knew where we were. Next stop Paddington Station and hundreds of children milling about. I remember I was in the Junior Red Cross at that time and a Red Cross nurse saw my badge and came to speak to me. Such a little thing but it meant such a lot and I have never forgotten her kind words, saying I was going somewhere safe and would be alright. Maybe I was crying at the time, I don’t know.

As the train pulled out of Paddington Station we were all trying to get to a window to wave, although we didn’t know anybody and we didn’t know where we were going. Well of all places – we arrived in Banbury. Now my Auntie Kit, my father’s sister, only lived about 2 miles out of Banbury in a village called Bodicote. I knew Banbury well because we had often visited Bodicote. If only she knew I was here. I know we could not stay with her because she had a very small house and she had 4 children already.

Again on another bus and somehow Peggy and I and two other girls from my class got separated from the rest of our school and we were taken to a village called Great Rollright. Peggy and I went to stay with a lady called Mrs. Robinson who kept the village shop, and my two class friends went to a farm.

Mrs. Robinson was a kind lady – she had asked for two boys but somewhere along the line she had two girls instead. It was very strange arriving in Great Rollright. We were all taken into the Church Hall and there were village people there who, to us, had a funny way of speaking. And one after the other they were saying how many children they would take. Mrs. Robinson wasn’t there, so maybe that is why she didn’t get two boys. I thought it was very embarrassing to be standing there with these quaint country people whilst they were deciding whether they wanted one, two, or three children.

Our time with Mrs. Robinson was very happy. Peggy went to the village school and I went to the county school in Chipping Norton, again on a part-time basis. Mrs. Robinson had a pet cockerel which was allowed to roam round the house – I hated that bird and I think it knew it. Its name was Cocky. Every time I wanted to go down the garden to the toilet this bird would follow me and stay outside until I came out again and peck my legs as I raced back up the garden.

There was certainly plenty to eat in this house and we really had an enjoyable time there. We were always sent to bed with a small glass of cider. I never knew then that cider was an alcoholic drink and I thought it was lovely. We didn’t seem any the worse for it anyway.

We got involved with the village church and would have been happy to have stayed there. But doom. One day I came home from school to find my head mistress from London sitting at the table. Unbeknown to us she had been frantically looking for these four girls she had lost at Banbury Station. I don’t know how she had found us, whether she had contacted our parents or hunted through the schools in the area. With a surname like Walklett I don’t think we would have been difficult to find. I don’t think she had been in touch with our parents – what an awful thing to say to them: “I’m very sorry but I may have lost your children”. No, she must have hunted through the school registers.

The upshot of this visit was that she had found somewhere more suitable for us and would we pack our things because she had found somewhere else for us to stay. More tears because we liked Mrs. Robinson, and the village people were all lovely to us and we were invited to their homes. Off we went with Miss Attride in the car to another village called Duns Tew. The strange thing was that none of our school were there, so why were we moved yet again?

This time we stayed with Mr. and Mrs. Beck, his name was Harry and hers was Daisy, but they were Mr. and Mrs. Beck to us. Mr. Beck was a farm hand and he worked with horses. He used to plough the fields of the farm across the road. He must have walked miles in the days he was ploughing. Although I had had many holidays in Wales and Shropshire at haymaking time I knew nothing about ploughing.

Mr. and Mrs. Beck had a young man living with them. He was like their son; although his family lived in the village he had lived with the Becks since he was a baby and they called him their son. His name was Walter. The village was a beautiful place and we lived in No. 73. There were no street names, every house had a name and a number so we were at No. 73 Duns Tew, the last house in the village, a lovely old thatched cottage. There was always a big wood fire in the grate and plenty on the table. Mr. and Mrs. Beck were the nicest people in the village.

Peggy now had to go to Steeple Aston School (since moving to Banbury in 1975 I have met the widow of her headmaster there), and I went to a Continuation College which had been evacuated from the East End of London. This was very odd to me – we were taught shorthand, typing, arithmetic, English grammar, French. This was obviously training us for the commercial world. I was much younger than the other girls there but my education was more advanced than theirs so I soon became top of the class. My English was always being complimented. What they didn’t know was that I had a Welsh mother and the Welsh language uses every letter in the word. My French was well in advance and my Maths took a flying leap.

I made friends in the class. The class was held in The Hall, North Aston – a country seat. The Hall was so large that there were 9 girls living there and they had servants. The school room was in the Grand Hall and it was so beautiful it seemed a pity to me that there were desks etc. on the polished floor.

In Duns Tew we had one of the masters of the school staying in The Nurseries (which is still there) and every Friday evening the family he stayed with invited those of us in the village to spend the evening in their house and they had a piano so all the war songs were being sung: “Roll out the Barrel”, “We’re going to hang out the washing on the Siegfried line” and many more.

Because the school at North Aston was a long walk I bought a bike, something I had always wanted, and I joined the cycling group. This meant on one day a week we would go for an outing to places like Blenheim Palace [ed: 10 miles away] etc. I became a good cyclist and had plenty of energy when others flagged behind. I certainly made use of my bike.

One particularly happy time was when it snowed in the winter. Yes, we did get snow in London, but not like this. It was white[2] and where the wind blew it was as high as the hedgerows; I couldn’t believe what I saw. Walter the Beck’s son had a sledge and he showed us where it was good to use it. It was a fantastic time.


[Banbury, 1983]


[1] One of the stories about my mother that I first heard at her funeral was about the time she persuaded her parents to let her stay at home overnight. At the time the family used to head for the air-raid shelter (i.e. the local underground station) at the end of the day and stay there all night long. My mother hated this and persuaded her parents to let her stay at home in her own bed provided she promised to join them at the air-raid shelter as soon as the air-raid sirens sounded. She was only allowed to do this once – because she managed to sleep through two bombing runs and without being woken by the sirens or the explosions.

[2] If you’re wondering why white snow is worth mentioning, you probably don’t know about the density of London smog at that time.

DA 7.2 UCF Transfer failing with SSL

Yann Neuhaus - Wed, 2017-07-05 08:14

This is a blog following the one I already wrote earlier: https://blog.dbi-services.com/documentum-administrator-ucf-troubleshooting/
The first one was about how to find the error, not how to resolve it. In this blog I will talk about a UCF error I got at a customer.

I got an error message using DA 7.2 where I couldn’t download documents; in fact, every transfer was failing due to UCF. By following my previous blog I found the specific error in the logs:
SSL Handshake failed.

You have probably hit this issue as well if you use SSL with DA. By default, when you configure SSL with DA, it tries to find the certificate in the Java CA certificate store, so you can add your certificate to that keystore to prevent the issue.
But in my case I had a keystore generated from certificates signed by the customer's own authority, so I had to find another way.

I found the solution in the documentation: https://support.emc.com/docu56531_Documentum-Web-Development-Kit-6.8-Development-Guide.pdf?language=en_US at page 58.

You can deactivate the Java validation as follows:
vi ./webapps/da/wdk/contentXfer/ucf.installer.config.xml

Add the following option:

<option name="https.host.validation" persistent="false">
    <value>false</value>
</option>

Now restart Tomcat (or your application server) and you will be able to transfer content.


This article, DA 7.2 UCF Transfer failing with SSL, appeared first on Blog dbi services.

Oracle Data Visualization Desktop v3

Rittman Mead Consulting - Wed, 2017-07-05 07:57

The ODTUG Kscope17 conference last week in San Antonio was a great event with plenty of very interesting sessions and networking opportunities. Rittman Mead participated in the Thursday BI deep-dive session and delivered three sessions, including a special "fishing" one.

pic.twitter.com/jC04r4RNvx

— Andrew Fomin (@fomin_andrew) 28 June 2017

In the meantime Oracle released a new version of Data Visualization Desktop, which was presented in detail during Philippe Lions' session and includes a set of new features and enhancements to existing functionality. From new data sources to new visualization options, in this post I'll go into detail on each of them.

Data Sources

The following new datasources have been introduced:

The latter two (still in beta) are very relevant, since they enable querying any product that directly exposes a JDBC or ODBC connector (like Presto) without having to wait for official support in the DVD list of sources.

Still, in DVD v3 there is no support for JSON or XML files. In an older blog post I wrote about how JSON (and XML) can be queried in DVD using Apache Drill; however, that solution requires a Drill installation, and knowledge of it, as prerequisites, which is not always achievable in the end-user environments where self-service BI happens. I believe future versions of DVD will address this problem by providing full support for both data sources.

Connection to OBIEE

One of the most requested new features is the new interface to connect to OBIEE: until DVD v2 only pre-built OBIEE analyses could be used as sources; with DVD v3, OBIEE Subject Areas are exposed, making them directly accessible. The set of columns and filters can't be retrieved on the fly during project creation but must be defined upfront in the datasource definition. This feature removes the need to move back and forth between OBIEE and DVD, creating an analysis in OBIEE as a datasource and then using it in DVD.


Another enhancement in the datasource definition is the possibility of changing the column delimiter in txt sources, useful if the datasource has an unusual delimiter.


Data Preparation

On the data-preparation side we have two main enhancements: the convert-to-date and the time grain level.
The convert-to-date feature improves column-to-date conversion, including the use of custom parsing strings. The feature still has some limits, such as not being able to parse dates like 04-January-2017, where the month name is written in full. For this date format a two-step approach, shortening the month name and then converting, is still required.
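The same two-step workaround can be sketched generically in Python (the parse string below is Python's strptime syntax, not DVD's, and is shown purely to illustrate the idea):

```python
from datetime import datetime

raw = "04-January-2017"

# Step 1: shorten the full month name to a three-letter abbreviation.
day, month, year = raw.split("-")
shortened = "-".join([day, month[:3], year])  # "04-Jan-2017"

# Step 2: convert the shortened string with a parse string the tool
# understands (%b matches abbreviated month names such as "Jan").
parsed = datetime.strptime(shortened, "%d-%b-%Y")
print(parsed.date())  # 2017-01-04
```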


The second enhancement on the data preparation side is the time grain level and format; these options simplify the extraction of attributes (e.g. Month, Week, Year) from date fields, which can now be done visually instead of by writing Logical SQL.
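The kind of attribute extraction these grain options make visual can be sketched generically in Python (outside DVD, purely to show what Month, Week and Year grains produce from a date field):

```python
from datetime import date

d = date(2017, 7, 5)

# The usual attributes a Year/Month/Week grain would expose from a date.
grains = {
    "Year": d.year,
    "Month": d.strftime("%B"),   # full month name, e.g. "July"
    "Week": d.isocalendar()[1],  # ISO week number within the year
}
print(grains)  # {'Year': 2017, 'Month': 'July', 'Week': 27}
```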


The Dataflow component in DVD v3 has an improved UI with new column merge and aggregation functionalities, which makes flow creation easier. Its output can now be saved as an Oracle database or Hive table, eliminating the need to store all the data locally.


It's worth mentioning that Dataflow is oriented to self-service data management: any parsing or transformation happens on the machine where DVD is installed, and its configuration options are limited. If more robust transformations are needed, then proper ETL software should be used.

New Visualization Options

There are several enhancements on the visualization side, the first being trendline confidence levels, which can be shown with fixed intervals (90%, 95% or 99%).

Top N and bottom N filtering has been added for each measure column, expanding the traditional "range" filter.

Two new visualizations have also been included: waterfall and boxplot are now default visualizations. Boxplots were available as a plugin in previous versions; however, the five-number summary had to be pre-calculated. In DVD v3 the summary is automatically calculated based on the definition of category (x-axis) and item (value within the category).


Other new options in the data visualization area include: the usage of logarithmic scale for graphs, the type of interpolation line to use (straight, curved, stepped ...), and the possibility to duplicate and reorder canvases (useful when creating a BI story).



The latest set of enhancements regards the console: this is a new menu allowing end users to perform tasks, like uploading a plugin, that previously were done manually on the file system.

The new Oracle Analytics Store lists add-ins divided into categories:

  • PlugIn: New visualizations or enhancement to existing ones (e.g. auto-refresh, providing a similar behaviour to OBIEE's slider)
  • Samples: Sample projects showing detailed DVD capabilities
  • Advanced Analytics: custom R scripts providing non-default functionalities
  • Map Layers: JSON shape files that can be used to render custom maps data.

The process to include a new plugin in DVD v3 is really simple: after downloading it from the store, I just need to open DVD's console and upload it. After a restart of the tool, the new plugin is available.


The same applies to Map Layers, while custom R scripts still need to be stored in the advanced_analytics\script_repository subfolder under the main DVD installation folder.

As we saw in this blog post, the new Data Visualization Desktop release includes several enhancements bringing more agility to data discovery, with improvements both in the connections to new sources (JDBC and ODBC) and in standard reporting, with OBIEE subject areas now accessible. The new visualizations, the Analytics Store, and the plugin management console make the end-user workflow extremely easy, even when non-default features need to be incorporated. If you are interested in Data Visualization Desktop and want to understand how it can be used proficiently against any data source, don't hesitate to contact us!

Categories: BI & Warehousing

Storage Server: Software - FreeNAS

Frank van Bortel - Wed, 2017-07-05 02:21
Software: FreeNAS. All hardware has been implemented; all 13 hard disks and one SSD are connected, with serial numbers, as well as physical and logical locations, noted. Cramming 4 2.5" disks in the CD bay. Time to add some software. I will install the latest and the greatest(?) FreeNAS software, V11. Installation: the installation, due to IPMI being capable of mounting remote images, is a...

Check Workflow's Health with Workflow Analyzer

Steven Chan - Wed, 2017-07-05 02:00

In addition to helping customers resolve issues via Service Requests, Oracle Support also builds diagnostic tools for Oracle E-Business Suite. Here's an index to all of these tools:

What do Support Analyzers do?

These Support Analyzers are non-invasive scripts that run health checks on your EBS environments. They look for common issues and generate standardized reports that summarize the findings, provide solutions for known issues, and offer recommendations on best practices.

What areas do they cover?

There are over 60 different analyzers for EBS 11i, 12.0, 12.1, and 12.2, covering the following product families:

  • E-Business Suite Core (Technology)
  • Financials
  • Manufacturing
  • Human Capital Management  
  • Customer Relationship Management  

Spotlight on Workflow Analyzer

If you haven't tried any of these tools, I'd strongly recommend getting started with the Workflow Analyzer:

The Workflow Analyzer reviews Workflow configurations and compares them against Oracle's best practices. The script reviews Workflow queues, runtime data, stuck activities, looping activities, and purging processes.

This tool can be run manually or configured to run as a concurrent request, so it can be scheduled to run periodically and included in regular Workflow maintenance cycles.

Can this script be run against Production?

Yes. There is no DML in the Workflow Analyzer script (workflow_analyzer.sql), so it is safe to run against Production instances to get a current, detailed overview of the Workflow environment for a specific instance. As always, it is recommended to test all suggestions against a TEST instance before applying them to Production.


Related Articles


Categories: APPS Blogs

report on form

Tom Kyte - Tue, 2017-07-04 22:06
Hi team, I created one table: create table books (id number, book_name varchar2(256), published_date date); insert into books values (1,'HARRY POTTER','09-01-2010'); insert into books values (2,'HARRY POTTER 2','09-01-2011'); now I created a report on...
Categories: DBA Blogs

Changing Schema and Schema Object names

Tom Kyte - Tue, 2017-07-04 22:06
We have a project where we are planning to change the schema name and the objects in the schema. Suppose Table Name is Schemaname_<TABLE_NAME> to have New Schema Suffixes same applies to packages Example Current Schema Name - SCOTT Objects in Sche...
Categories: DBA Blogs

Virtual Date Column as partition doesn't work

Tom Kyte - Tue, 2017-07-04 22:06
Dear All, Database Version : Oracle Database 12c Enterprise Edition Release - 64bit I have a table XX_FT with 2 column as below <code>REF_NO VARCHAR2(100) </code> This column can have two variations of data "XX17010XXXXXX" ...
Categories: DBA Blogs

Kill Session Revert / Miss some data on commit

Tom Kyte - Tue, 2017-07-04 22:06
Hi, I am new to Oracle. I have one Issue. Sometime user said that the application is hungup and they can't do anything. sometime they said they are unable to generate reports. For that I have studied your answers and I am using those to resolve th...
Categories: DBA Blogs


Tom Kyte - Tue, 2017-07-04 22:06
1) How does an insert statement work internally in the Oracle architecture, and how is the data stored in a data block? 2) How does an update statement work internally in the Oracle architecture, and how does it update a row?
Categories: DBA Blogs

Get max permutation

Tom Kyte - Tue, 2017-07-04 22:06
I have a table with 1 column. I need max permutation for each row values. COLUMN1 -------- a b c d For above table output should be: a_b a_c a_d b_c b_d c_d Your response in this regard is deeply appreciated.
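The requested "max permutation" is the set of unordered pairs of distinct values. A quick sketch outside the database, in Python, shows the expected output (a SQL answer would typically use a self-join with a `<` predicate on the column):

```python
from itertools import combinations

values = ["a", "b", "c", "d"]

# Every unordered pair of distinct values, joined with an underscore.
pairs = ["_".join(pair) for pair in combinations(values, 2)]
print(pairs)  # ['a_b', 'a_c', 'a_d', 'b_c', 'b_d', 'c_d']
```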
Categories: DBA Blogs

How to connect SQLPlus without tnsnames.ora

Tom Kyte - Tue, 2017-07-04 22:06
Hi I am trying to use SQLPlus to connect to database directly using just the connect string (ie without referencing to tnsnames.ora) I have this in my tnsnames.ora POD = (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(H...
Categories: DBA Blogs

Validate Performance Improvement Using Query Folding Feature in Power BI

Ittichai Chammavanijakul - Tue, 2017-07-04 19:27

I’ve been using Power BI for a couple of months now, not as a developer, but as a system architect. I may not deal with dashboard and report development on a daily basis; however, as an end user I use Power BI extensively to monitor Azure and Power BI usage, including audit and billing. I would like to learn this tool down to its nuts and bolts. The intention of this blog series is to document and share what I’ve learned in this journey.

The first area I’d like to explore is performance, as it has been the most talked-about topic within our developer circle and the Power BI community. One thing I’ve seen quite often when searching for Power BI performance improvements is to utilize Query Folding, which, basically, pushes query logic and filters down to the database. Let the database do what it does best: extracting and processing data before sending it back to the Power BI Desktop client. Done correctly, this reduces the workload Power BI has to do on the client side.
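Power BI specifics aside, the principle behind Query Folding can be illustrated with any database client. This hypothetical Python/sqlite3 sketch contrasts a filter pushed down to the database with the same filter applied client-side after pulling every row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100), ("APAC", 250), ("EMEA", 75), ("AMER", 300)])

# "Folded": the filter runs inside the database; only matching rows travel.
folded = conn.execute(
    "SELECT region, amount FROM sales WHERE region = 'EMEA'").fetchall()

# Not folded: every row is transferred, then filtered on the client.
all_rows = conn.execute("SELECT region, amount FROM sales").fetchall()
client_side = [row for row in all_rows if row[0] == "EMEA"]

assert folded == client_side  # same result, very different data movement
print(len(all_rows), len(folded))  # 4 2
```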

There are already a couple of well-written articles about Query Folding and its benefits, so there is no need for me to repeat them.

Part of this exercise here is to learn how to measure expected performance improvement when using Query Folding.


I’m connecting to a 1.3-million-row Oracle database table.



I created a series of transformations with and without support for Query Folding. The first two in the sample below, Filtered Rows and Uppercased Text, are the ones supporting Query Folding. We can confirm this by seeing that View Native Query is available when right-clicking the last step.

Or we can confirm it by viewing the query itself and seeing that the native SQL query has been rewritten to include those transformations.

For the next step, Split Column by Delimiter does not support Query Folding. Note that View Native Query is now disabled.

The plan is to run these transformations so that most of them utilize Query Folding, then take a measurement. Then move the unsupported one to the top so that the rest cannot utilize Query Folding and the transformations are processed on the client instead.

If I ran only this small set of transformations, we’d probably not see much difference in terms of runtime, so I added more transformations simply by modifying the M query manually, as seen below; in this case, just alternating upper-case and lower-case transformations.

To measure the total run time, I’m using Rui Romano’s Power BI Desktop Trace Logs Analyser, which reads and visualizes Power BI Desktop’s diagnostic trace files.

Before each run, the cache is cleared and tracing is enabled. Once each run is done, tracing is disabled.



A. 1st Run data refresh utilizing Query Folding

B. 1st Run data refresh without utilizing Query Folding


C. 2nd Run data refresh utilizing Query Folding

D. 2nd Run data refresh without utilizing Query Folding

The outcome clearly confirms our expectation. I think the difference would be even greater if more transformations, especially complex ones, were added.

In a future post, I’d like to explore further what these Action Details are, though I’m not confident that I will find much; it seems this information may not be shared with customers, according to the discussion here.


Categories: DBA Blogs

Fire an employee? It's that easy!

Flavio Casetta - Tue, 2017-07-04 15:45
Categories: DBA Blogs

KPI Dashboards for Retail Industry

Nilesh Jethwa - Tue, 2017-07-04 12:20

The retail industry is highly competitive. It is one of the fastest growing industries worldwide.

Based on the 2015 Global Powers of Retailing Report, the US has seventy-six of the world’s largest retailing companies. With the growing number of retailers, staying on top will be a tough job.

Retailers who want to rise above their competitors must know all the details about their company. This is where Key Performance Indicators (KPIs) come into play.

KPIs are helpful in evaluating and monitoring a company’s performance against certain targets or goals. KPIs can also help retailers make the right business decisions.

Having a KPI dashboard which contains the right data at the right time can help improve the productivity and performance of your company.

While there are many KPIs that a retailer can keep track of, we have compiled ten of the most helpful KPIs for any retail business.

Read more at http://www.infocaptor.com/dashboard/kpi-dashboards-for-retail-industry

SSL/TLS: How to choose your cipher suite

Amis Blog - Tue, 2017-07-04 11:00

For SSL/TLS connections, cipher suites determine to a large extent how secure the connection will be. A cipher suite is a named combination of authentication, encryption, message authentication code (MAC) and key exchange algorithms used to negotiate the security settings (here). But what does this mean, and how do you choose a secure cipher suite? The area of TLS is quite extensive and I cannot cover it in its entirety in a single blog post, but I will provide some general recommendations based on several articles researched online. At the end of the post I’ll provide some suggestions for strong cipher suites for JDK8.


First I’ll introduce what a cipher suite is and how it is agreed upon by the client and server. Next I’ll explain several of the considerations that can be relevant when choosing which cipher suites to use.

What does the name of a cipher suite mean?

The names of the cipher suites can be a bit confusing. You see for example a cipher suite called: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 in the SunJSSE list of supported cipher suites. You can break this name into several parts:

  • TLS: Transport Layer Security (see http://www.jscape.com/blog/cipher-suites).
  • ECDHE: The key exchange algorithm is ECDHE (Elliptic-curve Diffie–Hellman, ephemeral).
  • ECDSA: The authentication algorithm is ECDSA (Elliptic Curve Digital Signature Algorithm). The certificate authority uses an ECDH key to sign the public key. This is what for example Bitcoin uses.
  • WITH_AES_256_CBC: This is used to encrypt the message stream (AES = Advanced Encryption Standard, CBC = Cipher Block Chaining). The number 256 indicates the key size in bits; the AES block size itself is always 128 bits.
  • SHA384: This is the so-called message authentication code (MAC) algorithm (SHA = Secure Hash Algorithm). It is used to create a message digest, or hash, of a block of the message stream, which can be used to detect whether the message contents have been altered. The number indicates the size of the hash in bits; larger is generally more secure.

If the key exchange algorithm or the authentication algorithm is not explicitly specified, RSA is assumed. See for example here for a useful explanation of cipher suite naming.

What are your options

First it is a good idea to look at what your options are. This depends on the client and server technology used. If for example you are using Java 8, you can look here (SunJSSE) for supported cipher suites. If you want to enable the strongest ciphers available to JDK 8, you need to install the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files (here). You can find a large list of cipher suites and which JDK versions support them (up to Java 8 in the case of the Java 8 documentation). Node.js uses OpenSSL for cipher suite support, which supports a large array of cipher suites. See here.
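To see your options programmatically rather than from documentation alone, here is a minimal sketch (plain JDK, no third-party libraries) that lists the cipher suites the runtime supports and how many are enabled by default:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLParameters;

public class ListCipherSuites {
    public static void main(String[] args) throws Exception {
        SSLContext context = SSLContext.getDefault();
        SSLParameters supported = context.getSupportedSSLParameters();
        SSLParameters enabled = context.getDefaultSSLParameters();

        // Everything the JSSE provider can speak.
        for (String suite : supported.getCipherSuites()) {
            System.out.println(suite);
        }
        // Usually a smaller, safer subset is enabled by default.
        System.out.println("Enabled by default: " + enabled.getCipherSuites().length);
    }
}
```

The exact lists differ per JDK version and per installed policy files, which is why checking at runtime is more reliable than consulting a table.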

How determining a cipher suite works

Cipher suites are listed in preference order. How does that work? During the handshake phase of establishing a TLS/SSL connection, the client sends its supported cipher suites to the server. The server then chooses the cipher to use based on its preference order and what the client supports.

This works quite efficiently, but a problem can arise when:

  • There is no overlap in ciphers the client and server can speak
  • The only overlap between client and server supported cipher is a cipher which provides poor or no encryption

This is illustrated in the image below. The language represents the cipher suite; the order/preference indicates the encryption strength. In the first illustration, client and server both speak English, so the server chooses English. In the second, the only overlapping language is French. French might not be ideal, but the server has no choice other than to accept speaking French or to refuse talking to the client.

Thus it is good practice to have the server select only specific ciphers which conform to your security requirements, while of course taking client compatibility into account.
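In Java, this server-side restriction can be sketched as follows (the suite name is just an example; pick it from the supported list of your own JDK):

```java
import java.util.Arrays;
import javax.net.ssl.SSLServerSocket;
import javax.net.ssl.SSLServerSocketFactory;

public class RestrictServerCiphers {
    public static void main(String[] args) throws Exception {
        SSLServerSocketFactory factory =
                (SSLServerSocketFactory) SSLServerSocketFactory.getDefault();
        // Port 0 = any free port; good enough for a demo.
        try (SSLServerSocket server =
                     (SSLServerSocket) factory.createServerSocket(0)) {
            // Offer only TLS 1.2 and a single strong suite. Clients that
            // cannot speak it fail the handshake instead of silently
            // negotiating a weaker cipher.
            server.setEnabledProtocols(new String[] {"TLSv1.2"});
            server.setEnabledCipherSuites(
                    new String[] {"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384"});
            System.out.println(Arrays.toString(server.getEnabledCipherSuites()));
        }
    }
}
```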

How to choose a cipher suite

Basics

Check which cipher suites are supported

There are various mechanisms to check which ciphers are supported. For cloud services or websites you can use SSLLabs. For internal server checking, you can use various scripts available online such as this one or this one.

TLS 1.2

Of course you only want TLS 1.2 cipher suites since older TLS and SSL versions contain security liabilities. Within TLS 1.2 there is a lot to choose from. OWASP provides a good overview of which ciphers to choose here (‘Rule – Only Support Strong Cryptographic Ciphers’). Wikipedia provides a nice overview of (among other things) TLS 1.2 benefits such as GCM (Galois/Counter Mode) support which provides integrity checking.

Disable weak ciphers

As indicated before, if weak ciphers are enabled they might be used, making you vulnerable. You should disable weak ciphers: those with DSS, DSA, DES/3DES, RC4, MD5, SHA1, null or anon in the name. See for example here and here. For example, do not use DSA/DSS: they become very weak if a bad entropy source is used during signing (here). Similar liabilities can be looked up for the other weak ciphers.
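Such a blacklist can also be applied programmatically before configuring a socket or engine. A sketch in Java; the substrings below are my own selection based on the names listed above, not an official list:

```java
import javax.net.ssl.SSLContext;

public class FilterWeakCiphers {

    // Substrings that mark a suite as weak or broken (own selection).
    private static final String[] BLACKLIST = {
            "_DSS", "_DES", "3DES", "RC4", "MD5", "NULL", "ANON", "KRB5", "EXPORT"
    };

    static boolean isWeak(String suite) {
        String s = suite.toUpperCase();
        if (s.endsWith("_SHA")) return true; // a trailing _SHA means a SHA-1 MAC
        for (String bad : BLACKLIST) {
            if (s.contains(bad)) return true;
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // Print only the suites that survive the blacklist.
        for (String suite : SSLContext.getDefault()
                .getSupportedSSLParameters().getCipherSuites()) {
            if (!isWeak(suite)) {
                System.out.println(suite);
            }
        }
    }
}
```

The surviving list can then be passed to setEnabledCipherSuites on a server socket or SSLEngine.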

How to determine the key exchange algorithm

Types

There are several types of keys you can use. For example:

  • ECDHE: Elliptic-curve Diffie–Hellman key exchange, ephemeral. A new key is generated for every exchange, so it does not provide authentication like ECDH, which uses static keys.
  • RSA: RSA key exchange. Generating DH symmetric keys is faster than RSA symmetric keys, and DH also currently seems more popular. DH and RSA keys solve different challenges; see here.
  • ECDH: Elliptic-curve Diffie–Hellman key exchange. One key is used for the entire SSL session. The static key can be used for authentication.
  • DHE: Plain Diffie–Hellman key exchange. A new key is used for every exchange. Same as ECDHE, but a different algorithm is used for the calculation of shared secrets.

There are other key algorithms, but the above are the most popular. A single server can host multiple certificates, such as ECDSA and RSA certificates; Wikipedia is an example. Not all web servers support this. See here.

Forward secrecy

Forward secrecy means that if a private key is compromised, past messages which were sent cannot also be decrypted. Read here. Thus it is beneficial for your security to have perfect forward secrecy (PFS).

The difference between ECDHE/DHE and ECDH is that with ECDH one key is used for the duration of the SSL session (and can be used for authentication), while with ECDHE/DHE a distinct key is used for every exchange. Since this key is not a certificate/public key, no authentication can be performed; an attacker could use their own key (here). Thus when using ECDHE/DHE, you should also implement client certificate validation on your server (2-way SSL) to provide authentication.

ECDHE and DHE provide forward secrecy while ECDH does not (see here). ECDHE is significantly faster than DHE (here). There are rumors that the NSA can break DHE keys, making ECDHE keys preferable (here); other sites indicate that DHE is more secure (here). The calculation used for the keys also differs: DHE is prime-field Diffie–Hellman, while ECDHE is elliptic-curve Diffie–Hellman. ECDHE can be configured; ECDHE ciphers must not use weak curves, e.g. less than 256 bits (see here).

Certificate authority

The certificate authority you use to sign the key can have limitations. For example, RSA certificates are very common while ECDSA is only gaining popularity. If you use an internal certificate authority, you may want to check that it is able to generate ECDSA certificates and use them for signing. For compatibility, RSA is to be preferred.

How to determine the message encryption mechanism

As a rule of thumb: AES_256 or above is quite common and considered secure. 3DES, EDE and RC4 should be avoided.

The difference between CBC and GCM

GCM provides both encryption and integrity checking (using a nonce for hashing), while CBC only provides encryption (here). When using GCM you must never use the same nonce with the same key to encrypt twice; this protects against replay attacks. GCM is supported from TLS 1.2 onwards.
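As an illustration of the nonce rule outside of TLS, here is a minimal AES-GCM round trip using the JDK's own crypto API (a sketch, not production code):

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class GcmDemo {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256); // may require the JCE unlimited policy files on older JDK 8
        SecretKey key = kg.generateKey();

        // 96-bit nonce; must be unique per encryption under the same key.
        byte[] nonce = new byte[12];
        new SecureRandom().nextBytes(nonce);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, nonce));
        byte[] ciphertext = cipher.doFinal("secret message".getBytes("UTF-8"));

        // Decrypting also verifies the 128-bit authentication tag;
        // a tampered ciphertext throws AEADBadTagException.
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, nonce));
        System.out.println(new String(cipher.doFinal(ciphertext), "UTF-8"));
    }
}
```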

How to choose your hashing algorithm

MD5 (here) and SHA-1 (here) are old and should no longer be used. As a rule of thumb, SHA-256 or above can be considered secure.
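The same rule of thumb applies to hashing outside of TLS; a quick JDK sketch computing SHA-256 against the well-known test vector for "abc":

```java
import java.security.MessageDigest;

public class HashDemo {
    static String sha256Hex(byte[] data) throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-256").digest(data);
        StringBuilder sb = new StringBuilder();
        for (byte b : hash) {
            sb.append(String.format("%02x", b)); // lowercase hex, two digits per byte
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Known SHA-256 test vector for the string "abc":
        // ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
        System.out.println(sha256Hex("abc".getBytes("UTF-8")));
    }
}
```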

Finally

Considerations

Choosing a cipher suite can be a challenge. Several considerations play a role in making the correct choice here. Just to name a few:

  • Capabilities of server, client and certificate authority (required compatibility); you would choose a different cipher suite for an externally exposed website (which needs to be compatible with all major clients) than for internal security.

  • Encryption/decryption performance
  • Cryptographic strength; type and length of keys and hashes
  • Required encryption features; such as prevention of replay attacks, forward secrecy
  • Complexity of implementation; can developers and testers easily develop servers and clients supporting the cipher suite?

Sometimes even legislation plays a role, since some of the stronger encryption algorithms are not allowed in certain countries (we will not guess at the reason, but you can imagine).


Based on the above I can recommend some strong cipher suites to be used for JDK8 in preference order:


My personal preference would be to use TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 as it provides:

  • Integrity checking: GCM
  • Perfect forward secrecy: ECDHE
  • Uses strong encryption: AES_256
  • Uses a strong hashing algorithm: SHA384
  • It uses a key signed by an RSA certificate authority, which is supported by most internal certificate authorities.

Since ECDHE does not provide authentication, you should tell the server to verify client certificates (implement 2-way SSL).
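Putting the recommendation together, here is a sketch of how a server-side SSLEngine could pin the preferred suite and request client certificates (2-way SSL); a real deployment would also need key and trust stores configured:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLEngine;
import javax.net.ssl.SSLParameters;

public class TwoWaySslParams {
    public static void main(String[] args) throws Exception {
        SSLEngine engine = SSLContext.getDefault().createSSLEngine();
        engine.setUseClientMode(false); // act as the server side

        SSLParameters params = new SSLParameters(
                new String[] {"TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384"},
                new String[] {"TLSv1.2"});
        params.setNeedClientAuth(true); // 2-way SSL: require a client certificate
        engine.setSSLParameters(params);

        System.out.println(engine.getNeedClientAuth()); // prints true
    }
}
```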

The post SSL/TLS: How to choose your cipher suite appeared first on AMIS Oracle and Java Blog.

JDeveloper Patch for Transient Expression Compilation Infinite Loop

Andrejus Baranovski - Tue, 2017-07-04 10:26
If you are using JDeveloper or, you have probably run into the transient expression compilation infinite loop issue. The infinite loop happens when you open an ADF BC project and navigate to a VO which contains Groovy expressions. JDeveloper starts printing a repeating message in the log - compiling TransientExpression - and at some point the JDeveloper window closes down without any feedback:

For those of you who are not aware - there is a patch for this issue. The patch can be downloaded from Oracle Support; search for Patch 25218838 (there is one for JDEV and another one for JDEV

If it's your first time applying an Oracle patch - no worries, the process is very simple and smooth. First extract the downloaded patch zip archive. Next set the ORACLE_HOME environment variable to point to the root folder of the JDEV install:

Once the environment variable is set, run OPatch by executing opatch apply from the directory where the patch archive was extracted. You can reference OPatch by its direct path:

Hopefully the fix provided by this patch will be included in the next JDEV version by default.

Common Questions and Misconceptions in The Data Science Field

Rittman Mead Consulting - Tue, 2017-07-04 09:06

There are many types of scenarios in which data science could help your business. For example, customer retention, process automation, improving operational efficiency or user experience.

It is not however always initially clear which questions to concentrate on, or how to achieve your aims.

This post presents information about the type of questions you could address using your data and common forms of bias that may be encountered.

Types of Question
  • Descriptive: Describe the main features of the data, no implied meaning is inferred. This will almost always be the first kind of analysis performed on the data.

  • Exploratory: Exploring the data to find previously unknown relationships. Some of the found relationships may define future projects.

  • Inferential: Looking at trends in a small sample of a data set and extrapolating to the entire population. In this type of scenario you would end up with an estimation of the value and an associated error. Inference depends heavily on both the population and the sampling technique.

  • Predictive: Looking at current and historical trends to make predictions about future events. Even if x predicts y, x does not cause y. Accurate predictions are hard to achieve and depend heavily on having the correct predictors in the data set. Arguably, more data often leads to better results; however, large data sets are not always required.

  • Causal: To get the real relationship between variables you need to use randomised control trials and measure average effects, i.e. if you change x by this much, how does y change? Although this can be carried out on observed data, huge assumptions are required and large errors would be introduced into the results.

Biases in data collection or cleaning

It is very easy to introduce biases into your data or methods if you are not careful.
Here are some of the most frequent:

  • Selection/sampling bias: If the population selected does not represent the actual population, the results are skewed. This commonly occurs when data is selected subjectively rather than objectively or when non-random data has been selected.

  • Confirmation bias: Occurs when there is an intentional or unintentional desire to prove a hypothesis, assumption, or opinion.

  • Outliers: Extreme data values that are significantly outside the normal range of values can completely bias the results of an analysis. If such outliers are not removed, the results of the analysis can be misleading. These outliers are often interesting cases and ought to be investigated separately.

  • Simpson's Paradox: A trend that is indicated in the data can reverse when the data is split into comprising groups.

  • Overfitting: Involves an overly complex model which overestimates the effect/relevance of the examples in the training data and/or starts fitting to the noise in the training data.

  • Underfitting: Occurs when the underlying trend in the data is not found. Could occur if you try to fit a linear model to non linear data or if there is not enough data available to train the model.

  • Confounding Variables: Two variables may be assumed related when in fact they are both related to an omitted confounding variable. This is why correlation does not imply causation.

  • Non-Normality: If a distribution is assumed to be normal when it is not the results may be biased and misleading.

  • Data Dredging: This process involves testing huge numbers of hypotheses about a single data set until the desired outcome is found.

Comics from Dilbert Comics By Scott Adams.
Spurious Correlations from http://tylervigen.com/spurious-correlations.

Insights Lab

To learn more about the Rittman Mead Insights Lab please read my previous blog post about our methodology.

Or contact us at info@rittmanmead.com

Categories: BI & Warehousing

Full Table Scan With Partition Hint

Tom Kyte - Tue, 2017-07-04 03:46
Hi, My situation is this: 1.) I have a table that is partitioned on a given column (we'll call the partition p1). 2.) After loading the p1 partition, stats are run for that partition only (not the whole table). 3.) Once stats are run, a selec...
Categories: DBA Blogs

