Feed aggregator

Presidents of USA and their Birth Signs – Sankey Visualization

Nilesh Jethwa - Thu, 2015-10-08 14:00

In this analysis, we will visualize the relationship between age at presidency, state of birth, and birth sign.

Read more at: www.infocaptor.com/dashboard/presidents-of-usa-and-their-birth-signs-sankey-visualization

Amazon Quick Sight – BI on Cloud?

Dylan's BI Notes - Thu, 2015-10-08 09:08
In my post Data Warehouses on Cloud – Amazon Redshift, I mentioned that what would be really useful is providing BI on Cloud, not just Data Warehouse on Cloud. I felt that BICS makes more sense compared to Amazon Redshift. I discussed this with a couple of people last night at a meetup.  Some of them […]
Categories: BI & Warehousing

How to delete older emails from GMAIL

Arun Bavera - Wed, 2015-10-07 09:40


Other category:

category:social older_than:45d
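The same operator pattern extends to other categories and cutoffs. For example (category:, label:, older_than:, and before: are standard Gmail search operators; the specific labels and dates here are only illustrations):

```
category:promotions older_than:1y
category:updates older_than:90d
before:2014/01/01 label:newsletters
```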

Categories: Development

Why go to Oracle OpenWorld?

Duncan Davies - Wed, 2015-10-07 08:00

We’re a shade under a month away from the biggest event in the calendar for those that work in the Oracle marketplace – the Oracle OpenWorld Conference.

It runs every year in San Francisco and draws a massive 60,000 attendees from 145 countries (plus 2.1 million online attendees). That’s huge.

There are more than 2,500 sessions from ~3,600 speakers, approximately half of which are customers/partners and half are Oracle themselves. As well as the sessions there are the demo grounds and the exhibition hall, all great places for networking with people that you’ve either not met before or have only ever come across online. You get quality face-time with top developers and execs, who are normally hidden behind many levels of Oracle Support. These are the people who have designed and written the products and services that we’ll be using over the coming years, so meeting up with them is priceless.

If you register before the event, it’s $2,450 (about £1,600).

I’m lucky to have the chance to go again this year, and I know already that it’s going to have huge value for both me and Cedar. Both my colleague, Graham, and I were lucky enough to be selected to speak (his session is on Fluid, mine is on Selective Adoption – the two hottest topics in PeopleSoft right now).

Graham also produced this lively promo video:

This (above) is what we look like; it'd be great to say hello if you're around. Likewise, if you're coming to either of our sessions, let us know and we'll be sure to say hi.

As a nice bonus, we get to see Elton John and Beck at the Appreciation Event!

I’m really looking forward to seeing and hearing about the very latest from the PeopleSoft and Fusion/Taleo worlds. Look out for a Cedar event when we return where we can share everything with you.


About My Son, Chris Silva, Amazing Artist, Father and All-Around Human Being

FeuerThoughts - Tue, 2015-10-06 14:59
"For the record...."

Chris is the 2015 recipient of a 3arts grant, which makes me incredibly proud and also gives me the opportunity to share his professional art bio (I mostly experience him these days as Papa to my two wonderful granddaughters).

Born in Puerto Rico, Chris Silva has been a prominent figure in Chicago’s graffiti and skateboarding scenes since the 1980s, as well as an enthusiastic fan of a wide range of music genres which have resulted from the influence of metropolitan life. Building on his solid graffiti art foundation, Silva proceeded to play a significant role in the development of what is now commonly referred to as "street art." He now splits his time between working on large-scale commissions, producing gallery oriented work, and leading youth-involved public art projects. As a self-taught sound artist with roots in DJ culture, Silva also anchors a collaborative recording project known as This Mother Falcon, and has recently started integrating his audio compositions into his installation work.

In the early 90s, Silva worked on a mural with the Chicago Public Art Group and was eventually brought on board to help lead community art projects with other urban youth. As a result, the act of facilitating art experiences for young people has become an important part of his art practice, and he regularly includes students as collaborators on large-scale artwork that often leans heavily on improvisation. Over the years, Silva has helped orchestrate youth art projects both independently and in partnership with Chicago Public Art Group, Young Chicago Authors, Gallery 37, Yollocalli Arts Reach, After School Matters, and the School of The Art Institute of Chicago.

Silva was awarded a major public art commission by the Chicago Transit Authority to create a mosaic for the Pink Line California Station (2004); created block-long murals in Chicago's Loop “You Are Beautiful” (2006); created a sculpture for the Seattle Sound Transit System (2008); won the Juried Award for Best 3D Piece at Artprize (2012); and created large commissions for 1871 Chicago (2013), the City of Chicago, LinkedIn, CBRE (2014), OFS Brands, and The Prudential Building (2015). He has exhibited in Chicago, San Francisco, Los Angeles, New York City, Philadelphia, London, Melbourne, Copenhagen, and The International Space Station. In 2007 Silva received an Artist Fellowship Award from The Illinois Arts Council.
Categories: Development

Top 8 Strategies to Thrive at Oracle OpenWorld

VitalSoftTech - Tue, 2015-10-06 14:08
Yes! It’s that time of the year again when we start planning for the premier Oracle OpenWorld. So let’s get right into it! Read more here – Top 8 Strategies to Thrive at Oracle OpenWorld. Related article – Advanced Sessions at Oracle OpenWorld 2015
Categories: DBA Blogs

Fundamentals of SQL Writeback in Dodeca

Tim Tow - Mon, 2015-10-05 22:00
One of the features of Dodeca is read-write functionality to SQL databases.  We often get questions as to how to write data back to a relational database, so I thought I would post a quick blog entry for our customers to reference.

This example will use a simple table structure in SQL Server though the concepts are the same when using Oracle, DB2, and most other relational databases.  The example will use a simple Dodeca connection to a JDBC database.  Here is the Dodeca SQL Connection object used for the connection.

The table I will use for this example was created with the following CREATE TABLE  statement.

CREATE TABLE [dbo].[Test](
  [TestID] [int] IDENTITY(1,1) NOT NULL,
  [TestCode] [nvarchar](50) NULL,
  [TestName] [nvarchar](50) NULL,
  CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED
  ([TestID] ASC)
)

First, I used the Dodeca SQL Excel View Wizard to create a simple view in Dodeca to retrieve the data into a spreadsheet.  The view, before setting up writeback capabilities, looks like this.

To make this view writeable, follow these steps.
  1. Add the appropriate SQL insert, update, or delete statements to the Dodeca SQL Passthrough Dataset object.  The values to be replaced in the SQL statement must be specified using the notation @ColumnName where ColumnName is the column name, or column alias, of the column containing the data.
  2. Add the column names of the primary key for the table to the PrimaryKey property of the SQL Passthrough DataSet object.
  3. Depending on the database used, define the column names and their respective JDBC datatypes in the Columns property of the SQL Passthrough Dataset.  This mapping is optional for SQL Server because Dodeca can obtain the required information from the Microsoft JDBC driver; however, the Oracle and DB2 JDBC drivers do not provide this information, so it must be entered by the developer.
For insert, update, and delete operations, Dodeca parses the SQL statement to read the parameters that use the @ indicator and creates a JDBC prepared statement to execute the statements.  The prepared statement format is very efficient as it compiles the SQL statement once and then executes it multiple times.  Each inserted row is also passed to the server during the transaction.  The values from each row are then used in conjunction with the prepared statement to perform the operation.
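As an illustrative sketch of that parse-then-prepare flow (this is not Dodeca's actual code; Python's sqlite3 stands in for the JDBC layer, and the Test table mirrors the one created above):

```python
import re
import sqlite3

def to_prepared(sql):
    # Rewrite Dodeca-style @ColumnName parameters as '?' placeholders,
    # returning the rewritten SQL plus the parameter names in order.
    names = re.findall(r"@(\w+)", sql)
    return re.sub(r"@\w+", "?", sql), names

sql, names = to_prepared(
    "INSERT INTO Test (TestCode, TestName) VALUES (@TestCode, @TestName)")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Test (TestID INTEGER PRIMARY KEY, TestCode TEXT, TestName TEXT)")

# One prepared statement, executed once per modified row.
rows = [{"TestCode": "A1", "TestName": "First"},
        {"TestCode": "B2", "TestName": "Second"}]
conn.executemany(sql, [tuple(r[n] for n in names) for r in rows])
print(conn.execute("SELECT COUNT(*) FROM Test").fetchone()[0])  # 2
```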

Here is the completed Query definition.


Next, modify the DataSetRanges property of the Dodeca View object and, to enable insert operations, set the AllowAddRow property to True.  Note that if you added update and/or delete SQL to your SQL Passthrough Dataset object, be sure to enable those operations on the worksheet via the AllowDeleteRow and AllowModifyRow properties.

Once this step is complete, you can run the Dodeca View, add a row, and press the Save button to save the record to the relational database.



Insert, update, and delete functionality using plain SQL statements is limited to operations on a single table.  If you need to update multiple tables, you must use stored procedures.  You can call a stored procedure in Dodeca using syntax similar to the following example:

{call sp_InsertTest(@TestCode, @TestName)}

Dodeca customers can contact support for further information at support@appliedolap.com.
Categories: BI & Warehousing

IBM Bluemix - Specify only Liberty buildpack features you require

Pas Apicella - Mon, 2015-10-05 21:22
I am more often than not using Spring Boot applications on IBM Bluemix, and most of what I need is packaged with the application, from JPA or JDBC drivers to REST support. Of course, with IBM Bluemix we can specify which buildpack we wish to use, but by default Liberty is used for Java applications.

When a stand-alone application is deployed, a default Liberty configuration is provided for the application. The default configuration enables the following Liberty features:
  • beanValidation-1.1
  • cdi-1.2
  • ejbLite-3.2
  • el-3.0
  • jaxrs-2.0
  • jdbc-4.1
  • jndi-1.0
  • jpa-2.1
  • jsf-2.2
  • jsonp-1.0
  • jsp-2.3
  • managedBeans-1.0
  • servlet-3.1
  • websocket-1.1
  • icap:managementConnector-1.0
  • appstate-1.0
Here is how I strip out what isn't required in my Liberty runtime container, down to the bare minimum of what I need.

manifest.yml

applications:
 - name: pas-speedtest
   memory: 512M
   instances: 1
   path: ./demo-0.0.1-SNAPSHOT.jar
   host: pas-speedtest
   domain: mybluemix.net
   env:
     JBP_CONFIG_LIBERTY: "app_archive: {features: [jsp-2.3, websocket-1.1, servlet-3.1]}"


More Information

https://www.ng.bluemix.net/docs/starters/liberty/index.html#optionsforpushinglibertyapplications


Categories: Fusion Middleware

Uploading 26M StackOverflow Questions into Oracle 12c

Marcelo Ochoa - Mon, 2015-10-05 17:42
Just for fun or testing in-memory capabilities of Oracle 12c

Following the post Import 10M Stack Overflow Questions into Neo4j In Just 3 Minutes, I modified the Python script to include the foreign key columns that are not part of the graph database design but are required in a relational model.
The Python files to_csv.py and utils.py can be downloaded from my drive; basically, the change adds these two lines:
                el.get('parentid'),
                el.get('owneruserid'),
when generating the output file csvs/posts.csv. The idea is to convert the StackOverflow export files:
-rw-r--r-- 1 root root   37286997 ago 18 12:50 stackoverflow.com-PostLinks.7z
-rw-r--r-- 1 root root 7816218683 ago 18 13:52 stackoverflow.com-Posts.7z
-rw-r--r-- 1 root root     586861 ago 18 13:52 stackoverflow.com-Tags.7z
-rw-r--r-- 1 root root  160468734 ago 18 13:54 stackoverflow.com-Users.7z
-rw-r--r-- 1 root root  524354790 ago 18 13:58 stackoverflow.com-Votes.7z
-rw-r--r-- 1 root root 2379415989 sep  2 14:28 stackoverflow.com-Comments.7z
-rw-r--r-- 1 root root  112105812 sep  2 14:29 stackoverflow.com-Badges.7z
to a list of CSV files for quick import into Oracle 12c RDBMS using external tables. Here is the list of converted files and their sizes:
3.8G  posts.csv
287M  posts_rel.csv
524K  tags.csv
517M  tags_posts_rel.csv
355M  users.csv
427M  users_posts_rel.csv
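For reference, the attribute extraction behind those two added lines boils down to something like this (a minimal sketch; I am assuming the script lowercases the XML attribute keys, since the dump files use CamelCase names such as ParentId and OwnerUserId, and the column selection here is only illustrative):

```python
import csv
import io
import xml.etree.ElementTree as ET

# One row from Posts.xml in the Stack Overflow data dump (CamelCase attributes).
SAMPLE = '<posts><row Id="7" ParentId="4" OwnerUserId="9" Score="316" /></posts>'

def rows(xml_text):
    for el in ET.fromstring(xml_text):
        # Lowercase the attribute keys so lookups like 'parentid' match.
        attrs = {k.lower(): v for k, v in el.attrib.items()}
        yield (
            attrs.get('id'),
            attrs.get('parentid'),    # the two extra foreign-key columns
            attrs.get('owneruserid'),
            attrs.get('score'),
        )

buf = io.StringIO()
csv.writer(buf).writerows(rows(SAMPLE))
print(buf.getvalue().strip())  # 7,4,9,316
```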
With the above files and an Oracle 12c instance running in a Docker container, as described in my previous post On docker, Ubuntu and Oracle RDBMS, I executed these steps:
- logged in as SYS:
alter system set sga_max_size=4G scope=spfile;
alter system set sga_target=4G scope=spfile;
alter system set inmemory_size=2G scope=spfile;
create user sh identified by sh
   default tablespace ts_data
   temporary tablespace temp
   quota unlimited on ts_data;
grant connect,resource,luceneuser to sh;
create directory data_dir1 as '/mnt';
create directory tmp_dir as '/tmp';
grant all on directory data_dir1 to sh;
grant all on directory tmp_dir to sh;
This basically creates a new user and the directories to be used by the external tables. Note that the CSV files are available inside the Docker machine under the /mnt directory; I run my Docker image with:
docker run --privileged=true --ipc=host --volume=/var/lib/docker/dockerfiles/stackoverflow.com/csvs:/mnt --volume=/mnt/backup/db/ols:/u01/app/oracle/data --name ols --hostname ols --detach=true --publish=1521:1521 --publish=9099:9099 oracle-12102
Then, logged in as the SH user:
- Importing users
create table users_external
( user_id          NUMBER(10),
  display_name     VARCHAR2(4000),
  reputation       NUMBER(10),
  aboutme          VARCHAR2(4000),
  website_url      VARCHAR2(4000),
  location         VARCHAR2(4000),
  profileimage_url VARCHAR2(4000),
  views            NUMBER(10),
  upvotes          NUMBER(10),
  downvotes        NUMBER(10)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'users.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_users
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from users_external);
-- Elapsed: 00:00:22.76
ALTER TABLE so_users ADD PRIMARY KEY (user_id);
-- Elapsed: 00:00:13.08
create index so_users_display_name_idx on so_users(display_name);
-- Elapsed: 00:00:08.01
- Importing Posts
create table posts_external
( post_id   NUMBER(10),
  parent_id NUMBER(10),
  user_id   NUMBER(10),
  title     VARCHAR2(4000),
  body      CLOB,
  score     NUMBER(10),
  views     NUMBER(10),
  comments  NUMBER(10)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'posts.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_posts
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from posts_external);
-- Elapsed: 00:14:20.89
ALTER TABLE so_posts ADD PRIMARY KEY (post_id);
-- Elapsed: 00:02:35.86
-- purge posts associated to no imported users
delete from so_posts where user_id not in (select user_id from so_users);
-- Elapsed: 00:02:41.64
create index so_posts_user_id_idx on so_posts(user_id);
-- Elapsed: 00:01:34.87
ALTER TABLE so_posts ADD CONSTRAINT fk_so_user FOREIGN KEY (user_id) REFERENCES so_users(user_id);
-- Elapsed: 00:00:09.28
Note that 26 million posts were imported in about 14 minutes, which is not bad considering that the CSV source was on an external USB 2.0 drive and the Oracle 12c tablespaces were placed on a USB 3.0 drive. Here is a screenshot showing the IO bandwidth consumed on both drives.

Only 4.8 MB/s for reads from sdb (CSV) and 9.7 MB/s for writes to sdc1 (ts_data).
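Back-of-the-envelope, the 00:14:20.89 elapsed time for the 26-million-row CTAS works out to roughly 30K rows per second:

```python
rows = 26_000_000          # posts loaded by the CREATE TABLE ... AS SELECT
elapsed = 14 * 60 + 20.89  # 00:14:20.89 in seconds
rate = rows / elapsed
print(f"{rate:,.0f} rows/s")  # 30,201 rows/s
```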
- Importing tags
create table tags_external
( tag_id      VARCHAR2(4000)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'tags.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_tags
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from tags_external);
-- Elapsed: 00:00:00.55
create table tags_posts_external
( post_id      NUMBER(10),
  tag_id      VARCHAR2(4000)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'tags_posts_rel.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_tags_posts
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from tags_posts_external);
-- Elapsed: 00:00:43.75
-- purge tags associated to no imported posts
delete from so_tags_posts where post_id not in (select post_id from so_posts);
-- Elapsed: 00:02:42.00
create index so_tags_posts_post_id_idx on so_tags_posts(post_id);
-- Elapsed: 00:00:43.29
ALTER TABLE so_tags_posts ADD CONSTRAINT fk_so_posts FOREIGN KEY (post_id) REFERENCES so_posts(post_id);
-- Elapsed: 00:01:16.65
Note that, as with the one-to-many posts<->users relation, tags<->posts is also one-to-many, and some posts referenced by a few tags were not imported due to character-encoding errors.
To summarize the steps above: 26 million posts from 4.5 million registered users were imported; 41K distinct tags are used, with an average of 1.11 tags per post (29M tag/post rows).
Next blog post will be about using Oracle 12c in-memory features to query this corpus data.

OTN at Oracle OpenWorld Group - Join today!

OTN TechBlog - Mon, 2015-10-05 12:27

Join the OTN at Oracle OpenWorld group on the OTN Community Platform!  This group is designed to keep you in the know about all the GREAT activities and events that the Team OTN is planning/organizing for Oracle OpenWorld in San Francisco this October (24th to 28th).

Some of the events/activities to look forward to -

Community Events - RAC Attack and Blogger Meetup.

Networking Opportunities - Sunday Kick off Party, Cloud Hour

NEW activities! A Graffiti Wall and giant games, plus Make Your Own T-Shirt is back with NEW art!


We hope to see you there!

TEAM OTN


Do we really need semantic layer from OBIEE?

Dylan's BI Notes - Mon, 2015-10-05 10:22
Not all BI tools have a semantic layer. For example, Oracle Discoverer does not seem to have a strong one. This page summarizes what the OBIEE semantic layer can do for you… BI Platform Semantic Layer I think that if these features can be accomplished in other ways and can be proven that they are not necessary, […]
Categories: BI & Warehousing

Cedar’s Selective Adoption Event recap

Duncan Davies - Mon, 2015-10-05 05:00

A week or so back Cedar held a free Selective Adoption event for clients and friends. The idea behind the event was to help those on 9.2 already to make the most of what Selective Adoption can offer, and to show those that are yet to make the step to 9.2 what the future could look like.

The event went really well. Jeff Robbins opened the proceedings, giving an overview of the technology and what the roadmap looks like. Then Graham Smith and I did a couple of slots each on how the process works, what you need to get the technology up and running, the huge value it can bring, and the areas that you should do yourself versus the ones where it’s cheaper to get help.

Graham diving deep into the Tech

Covering the Options

After the event we all decamped to a nearby pub for a less formal chat. It was really great to see that some clients still wanted more, however. Happily, Graham was able to do a live demo from the middle of the pub, showing that we can ‘walk the walk’ as well as talk about it …

Live demo in the pub


Data Lake vs. Data Warehouse

Dylan's BI Notes - Sun, 2015-10-04 18:18
These are different concepts. Data Lake – Collect data from various sources in a central place.  The data are stored in the original form.  Big data technologies are used and thus the typical data storage is Hadoop HDFS. Data Warehouse – “Traditional” way of collecting data from various sources for reporting.  The data are consolidated […]
Categories: BI & Warehousing

Use DMZ to access BI from outside firewall

Dylan's BI Notes - Fri, 2015-10-02 14:12
A DMZ lets you configure part of your network to be accessible from outside the firewall. Some users may want to access corporate reports from mobile devices or their personal computers. While VPN and Citrix may be useful in these cases, a DMZ provides another option. A good article – OBIEE Security […]
Categories: BI & Warehousing

What I Wanted to Tell Terry Bradshaw

Cary Millsap - Thu, 2015-10-01 17:23
I met Terry Bradshaw one time. It was about ten years ago, in front of a movie theater near where I live.

When I was little, Terry Bradshaw was my enemy because, unforgivably to a young boy, he and his Pittsburgh Steelers kept beating my beloved Dallas Cowboys in Super Bowls. As I grew up, though, his personality on TV talk shows won me over, and I enjoy watching him to this day on Fox NFL Sunday. After learning a little bit about his life, I’ve grown to really admire and respect him.

I had heard that he owned a ranch not too far from where I live, and so I had it in mind that inevitably I would meet him someday, and I would say thank you. One day I had that chance.

I completely blew it.

My wife and I saw him there at the theater one day, standing by himself not far from us. It seemed like if I were to walk over and say hi, maybe it wouldn’t bother him. So I walked over, a little bit nervous. I shook his hand, and I said, “Mr. Bradshaw, hi, my name is Cary.” I would then say this:

I was a big Roger Staubach fan growing up. I watched Cowboys vs. Steelers like I was watching Good vs. Evil.

But as I’ve grown up, I have gained the deepest admiration and respect for you. You were a tremendous competitor, and you’re one of my favorite people to see on TV. Every time I see you, you bring a smile to my face. You’ve brought joy to a lot of people.

I just wanted to say thank you.
Yep, that’s what I would say to Terry Bradshaw if I got the chance. But that’s not how it would turn out. How it actually went was like this, …my big chance:

Me: I was a big Roger Staubach fan growing up.
TB: Hey, so was I!
Me: (stunned)
TB: (turns away)
The End
I was heartbroken. It bothers me still today. If you know Terry Bradshaw or someone who does, I wish you would please let him know. It would mean a lot to me.

…I did learn something that day about the elevator pitch.

Oracle Priority Support Infogram for 01-OCT-2015

Oracle Infogram - Thu, 2015-10-01 14:42

RDBMS


PL/SQL

A Surprising Program, from Oracle Database PL/SQL and EBR.

Data Warehouse

DOP Downgrades, or Avoid The Ceiling, from The Data Warehouse Insider blog.

WebLogic


Java


Creating Games with JavaFX 8: Case Study, from The Java Tutorials Blog.

OAG

The 10 most recently created notes for OAG as of 24 Sept. 2015, from Proactive Support - Java Development using Oracle Tools.

Ops Center

Changing an Asset's Name, from the Oracle Ops Center blog.

Data Integration


SOA

Top tweets SOA Partner Community – September 2015, from the SOA & BPM Partner Community Blog.

Real User Monitoring

How to Configure User ID Identification, from Real User Monitoring.

Solaris

Solaris: Identifying EFI disks, from Giri Mandalika's Repository.

EBS

From the Oracle E-Business Suite Support blog:




Finally Eliminate Those Duplicate WIP Transactions!


Rise of the Robots - Review

Abhinav Agarwal - Thu, 2015-10-01 11:08
Rise of the Robots: Technology and the Threat of a Jobless Future
Martin Ford

Part 1 of 3

"I'm smart; you're dumb. I'm big; you're small. I'm right; you're wrong. And there's nothing you can do about it."

Thus spake Harry Wormwood in the movie "Matilda". This well could be the message that robots will have for us in the not too distant future. The dramatic improvements in the speed, the accuracy, and the areas in which computers have begun to comprehensively outperform humans leads one to believe that while a so-called singularity may well be some ways off, the more immediate effects of this automation are already being felt in permanent job losses. In a country like India, which has used digital technologies quite effectively in the last decade and a half to grow a $150 billion IT-BPM industry, the impact could be devastating - especially where an estimated 10 million people are employed.
In many spheres - chess for example - they could utter these lines to us humans today and there's nothing we can do about it - for the computer is right. The puniest of computers in the tiniest of smartphones possesses enough computing horsepower and smart-enough algorithms (written by us humans - oh yes, the irony!) to defeat the best of us humans in chess, every single time, without breaking a sweat. Computers have been able to add, subtract, divide, square, multiply faster and more accurately than us for decades now, and there's nothing we can do about that either.

From the time of the Luddites - who rose up against the machines of the Industrial Revolution in the early years of the nineteenth century - to the present-day "Judgment Day" Sarah Connor avatars, inspired as much by an acute awareness of the march of technology as by James Cameron's "Terminator" movies, the refrain of the chorus has been more or less unchanging: the machines are coming for our jobs, our livelihoods, and will finally come for us (the Matrix was premised on a variant of one such dystopian future). The computing power of computers exploded in the second half of the twentieth century, obeying the inexorable pull of Moore's Law and made feasible by advances in semiconductors, fabrication techniques, and electrical engineering. So did fears that similar software advances could somehow endow machines with intelligence - Artificial Intelligence. Those fears did not quite come to pass; for several decades, false hopes were kindled and then extinguished. Till this decade. The congruence of seemingly infinite computing power - thanks to massive server farms running in the "cloud" (a mangled metaphor if ever there was one) - cheap and lightning-fast bandwidth available on tap, storage and memory that keep getting impossibly cheaper every year, and sophisticated software algorithms has made it clear that "machine intelligence" is no longer an oxymoron. We are well and truly living in the middle of the machine age. The "singularity" may well be witnessed in our lifetimes, within a decade or two even.

Martin Ford's book, "The Rise of the Robots: Technology and the Threat of a Jobless Future" lays out the case for a not-so-distant future where machines make possible the automation of almost every task imaginable, but at a great social and economic cost. The book is neatly organized, lucidly argued, and except for a lengthy and somewhat incongruous chapter on the medical system, the book stays on point. Ford makes it clear that neither is this technological progress reversible, nor wholly desirable. Its consequences therefore cannot be wished away - income inequality as an example, which economists for three decades have been explaining away as a temporary anomaly. The last section, which is more contemplative and prescriptive, as opposed to the earlier sections which are descriptive, discusses possible solutions, some of which will shock free market proponents. Whether there are more practical, workable answers is quite another thing though.

Part 2 of 3

Machines have been able to do mechanical jobs faster than humans, with greater precision, and for longer periods of time - the cotton gin invented in the eighteenth century, for example. The inevitable loss of jobs called for a re-skilling of the people affected, and the mantra went that you had to pull yourself up by your socks, learn a new skill, and get productive again. Martin Ford's book shatters that illusion. There is not a single profession left - whether unskilled or skilled, whether in technology or medicine or the liberal arts, whether one that can be performed remotely or one that requires direct human interaction - that is not under threat from the machines. Whichever way you slice and dice it, you are left facing one or another variation of a dystopian future, with stark income inequalities, a substantial population that will require doles on a permanent basis, and the concomitant social upheavals.

Some years back, when offshoring was in the news and concerns about its impact on US jobs were at their peak, with hundreds of thousands of jobs moved offshore to countries like India, there were stories coming out regularly, like the one about Southern California workers being made to train H1-B visa holders, many of whom took over their jobs. Pfizer made "hundreds of tech workers at its Connecticut R&D facilities" train their replacements - guest workers from India. If the economics of labor cost arbitrage precipitated the migration of skilled technology jobs away from the United States to countries like India (being "Bangalored" entered the urban lexicon only a decade ago), technology could plausibly bring those jobs back to the United States - call it "reshoring". The quantum of jobs reshored, however, is going to be a massive disappointment. Consider this: "In 2011, the Washington Post’s Michael Rosenwald reported that a colossal, billion-dollar data center built by Apple, Inc., in the town of Maiden, North Carolina, had created only fifty full-time positions." But it is precisely this elimination of the need for many people that makes the economics of reshoring work out. Ironical.

While the United States can at least look forward to the reshoring of some jobs lost to manufacturing in China or services in India, the loss of such jobs is certain, on the other hand, to cause greater upheaval in these offshore countries. India's socio-economic progress is predicated in large part on a re-skilling of its labour force to take advantage of an emerging "Digital India", both in the manufacturing and services sectors, but this is in mortal danger of being blindsided by the rise of the machines. The use of IT-based services as a catalyst for driving economic growth in smaller - Tier B and Tier C - cities in India is a recurrent theme for planners. But this could be short-circuited by the rise of the robots, who, once trained - by humans - can perform the jobs of humans better and faster. Indians were trained by their American counterparts to do their jobs. Unbeknownst to many, these people are actors in the same offshoring saga that played out a decade ago, but with the proverbial shoe on the other foot now. "The bottom line is that if you find yourself working with, or under the direction of, a smart software system, it’s probably a pretty good bet that—whether you’re aware of it or not—you are also training the software to ultimately replace you."

India has been a spectacular laggard when it has come to industrializing its economy - it is probably unique among all developing nations to be progressing (or at least with ambitions of progressing) from a primarily agrarian economy to a services-based economy, skipping substantially the intermediate phase of industrialization that every single industrialized nation went through last century. It was industrialization that provided the bedrock for the middle-class in nations, which then aspired towards a better quality of life, with the ability to pay for it - thus driving the move towards a services-based economy. For India, it could be argued by some that this skipping may prove to be a blessing, since an industrialized economy is more susceptible to efficiencies wrought by advancements in technology. Consider these examples from Ford's book:

1. "in the United States, chickens are grown to standardized sizes so as to make them compatible with automated slaughtering and processing."

2. Momentum Machines, a San Francisco based startup has developed a machine that "shapes burgers from freshly ground meat and then grills them to order - including even the ability to add just the right amount of char while retaining all the juices. The machine, which is capable of producing about 360 hamburgers per hour, also toasts the bun and then slices and adds fresh ingredients like tomatoes, onions, and pickles only after the order is placed." The company's co-founder is clear that these machines are not "meant to make employees more efficient... It's meant to completely obviate them."

3. "Vision Robotics, a company based in San Diego, California, is developing an octopus-like orange harvesting machine. The robot will use three-dimensional machine vision to make a computer model of an entire orange tree and then store the location of each fruit. That information will then be passed on to the machine’s eight robotic arms, which will rapidly harvest the oranges."

4. "Researchers at Facebook have likewise developed an experimental system—consisting of nine levels of artificial neurons—that can correctly determine whether two photographs are of the same person 97.25 percent of the time, even if lighting conditions and orientation of the faces vary. That compares with 97.53 percent accuracy for human observers."

5. "A Facebook executive noted in November 2013 that the Cyborg system routinely solves thousands of problems that would otherwise have to be addressed manually, and that the technology allows a single technician to manage as many as 20,000 computers."

6. If reading certain news articles makes you wonder whether a robot wrote them, things are going to get better - or worse. Computer algorithms are already at work churning out articles that are indistinguishable from those written by humans. A liberal-arts career just became even less viable - if that were possible.
"In 2010, the Northwestern University researchers who oversaw the team of computer science and journalism students who worked on StatsMonkey raised venture capital and founded a new company, Narrative Science, Inc., to commercialize the technology. The company hired a team of top computer scientists and engineers; then it tossed out the original StatsMonkey computer code and built a far more powerful and comprehensive artificial intelligence engine that it named “Quill.”
... One of Narrative Science’s earliest backers was In-Q-Tel, the venture capital arm of the Central Intelligence Agency"

"To keep instructional costs down, colleges are relying ever more heavily on part-time, or adjunct, faculty who are paid on a per-course basis—in some cases as little as $2,500 for a semester-long class—and receive no employee benefits. Especially in the liberal arts, these adjunct positions have become dead-end jobs for huge numbers of PhD graduates who once hoped for tenure-track academic careers."

7. "Radiologists, for example, are trained to interpret the images that result from various medical scans. Image processing and recognition technology is advancing rapidly and may soon be able to usurp the radiologist’s traditional role."

8. "In July 2012, the London Symphony Orchestra performed a composition entitled Transits—Into an Abyss. One reviewer called it “artistic and delightful.” The event marked the first time that an elite orchestra had played music composed entirely by a machine. The composition was created by Iamus, a cluster of computers running a musically inclined artificial intelligence algorithm."

9. "Perhaps the most remarkable elder-care innovation developed in Japan so far is the Hybrid Assistive Limb (HAL)—a powered exoskeleton suit straight out of science fiction. Developed by Professor Yoshiyuki Sankai of the University of Tsukuba, the HAL suit is the result of twenty years of research and development. Sensors in the suit are able to detect and interpret signals from the brain. When the person wearing the battery-powered suit thinks about standing up or walking, powerful motors instantly spring into action, providing mechanical assistance. A version is also available for the upper body and could assist caretakers in lifting the elderly. Wheelchair-bound seniors have been able to stand up and walk with the help of HAL."

As one goes over these examples, it becomes obvious that automation is a sword that cuts both ways. Is India equipped - and more importantly, are the planners aware - to handle the flood of automation that could wash away entire swathes of jobs being dreamed up by ambitions of a digitally-enabled nation?

Part 3 of 3

As 2014 drew to a close, the Indian IT industry was rocked by rumours that TCS (the largest Indian IT company by annual revenues) had completed an internal review and had initiated layoffs of thousands of employees - mostly in middle management. Some stories talked about a number as high as 30,000. The saga finally ended with a round of clarifications and denials by TCS and some well-deserved opprobrium over its inept handling of the needless controversy. What the fracas served to highlight, however, was a stark truth that has been staring the Indian IT industry in the face for some time now - the skills that the typical Indian IT worker possesses are mostly undifferentiated and prime candidates for automation.
What is worse, from at least one perspective, is the fact that (smart) humans have built technology that has become adept at "engineering the labor out of the product." One would need to be particularly myopic not to recognize that "the machines are coming for the higher-skill jobs as well." This much should have been clear in part two of this series, through the examples I cited from Martin Ford's book.
One recurring theme in Martin Ford's book, "Rise of the Robots", at least in the initial chapters, is the permanence and acceleration of offshoring to countries like India, which he believes, "has built a major, nationally strategic industry specifically geared toward the electronic capture of American and European jobs." (As an aside, most Indians would be somewhat puzzled by this assertion, given at times the outright hostility which politicians in India display towards the IT industry, like the time when a former prime minister indirectly accused the Bangalore IT industry of "immoral, unethical and illegal prosperity"!) Anyway, leaving that aside aside, in advancing his argument Ford posits that as "powerful AI-based tools make it easier for offshore workers to compete with their higher-paid counterparts in developed countries, advancing technology is also likely to upend many of our most basic assumptions about which types of jobs are potentially offshorable. Nearly everyone believes, for example, that occupations that require physical manipulation of the environment will always be safe."

Ford believes that the development of a digital infrastructure in India and the advancement of AI and related technologies will make things worse for US (and European) jobs. True to some extent though that may be, you have to consider the fact that increasing automation makes it cheaper and less labor-intensive to maintain, run, and patch-and-upgrade software applications. Furthermore, any offshoring of jobs adds its own overheads by way of administrative and managerial redundancies that cannot be done away with. Automation efficiencies reduce the need for labour, which is often the single biggest cost component of a software application over its entire life. Therefore, the very factors that Ford fears are threatening to make offshoring permanent and more widespread are also likely to make reshoring financially viable. It's a sword that cuts both ways.

To be fair, the digital economy in India has a lot of headroom to grow, especially as the Indian government's Smart City initiative brings e-governance and services to the common man through the Internet and related technologies. This could well provide a second wind to the Indian IT industry for a decade or more.

However, it is a smart strategy to keep one eye on what the winds of such a digital nirvana may blow in. An indicator of the direction in which the Indian IT job market is likely to evolve can therefore be found by looking at the US, where the "propensity for the economy to wipe out solid middle-skill, middle-class jobs, and then to replace them with a combination of low-wage service jobs and high-skill, professional jobs that are generally unattainable for most of the workforce, has been dubbed "job market polarization.""
This phrase - "job market polarization" - should give us a fair indication of what is in store for the hundreds of thousands, even millions, of graduates in India, many of whom emerge from college today with a stark degree of antipathy for the "9-5" grind that they saw their parents' generation go through. Digital "start-up" nirvana beckons for them. Each sees himself as a digital entrepreneur of the new economy. They are ready to chuck the "dependable income stream that anchors them into the middle-class" - they view it not as an "anchor" but more as a millstone. However, the vast majority is likely to find itself stuck in what "techno visionary" Jaron Lanier calls the "informal economy that is found in third-world nations." It is a tiny minority that will "live at the extreme left of the long tail" of such a digital economy. For every Flipkart or Snapdeal (the final word on that fairy-tale saga is yet to be written), you will find tens of thousands of resellers at the other end of the payoff tail, paying these e-tailers money every month for the privilege of selling on their platforms, at prices that barely cover operating costs.

The Indian middle-class, for all its flaws, has represented for decades an aspirational lodestar for the vast majority of the country's poor. So what happens when the digital economy hollows out the middle of the job market - "job polarization" as described above? Again, we can look westwards for possible answers.
"In an analysis published in February 2014, MIT economist James Poterba found that a remarkable 50 percent of American households aged sixty-five to sixty-nine have retirement account balances of $5,000 or less. According to Poterba’s paper, even a household with $100,000 in retirement savings would receive a guaranteed income of only about $5,400 per year (or $450 per month) with no cost-of-living increases, if the entire balance were used to purchase a fixed annuity."
In other words, in the absence of both a retirement corpus and a government guaranteed pension, there is a real risk of an emergent middle-class sliding right back into the working poor or even the underclass - a recipe for social unrest.

An inevitable counter-argument to all the unease generated by the "rise of the robots" is the "humans are underrated" palliative. Championing this is Tom Davenport (of "Competing on Analytics" fame), who now talks of "amplified intelligence" (which unfortunately has more the stench of an SEO-optimized buzzword than anything substantial at this point) - where "smart" humans work to "augment" "smart" machines. Then there is Geoff Colvin, who penned the insightful 2008 book "Talent Is Overrated" and has returned with "Humans Are Overrated". I have yet to read Colvin's latest book, so judgment is reserved; but to Davenport's argument, some of the evidence suggests an easy refutation - "In his 2007 book Super Crunchers, Yale University professor Ian Ayres cites study after study showing that algorithmic approaches routinely outperform human experts. When people, rather than computers, are given overall control of the process, the results almost invariably suffer." In many fields where algorithms rule the roost, to argue for human "augmentation" or "amplification" is no better than to argue for more cooks to brew the broth - we know how that aphorism goes, don't we?

In conclusion, and in many ways, the saga documented in "Rise of the Robots" calls to mind the ancient Indian tale of the four friends:
In ancient India there lived four friends. Three of them were very learned, while the fourth was a simpleton, even considered a fool. The four decided to go to the capital and seek their fortune from the king. Along the way, while passing through a jungle, they came across the bones of a lion long dead. The first friend used his knowledge to assemble the bones into a skeleton. The second friend used his skills to fashion a skin over the skeleton, while the third prepared to bring the lion back to life. At this the fourth friend - the simpleton - warned his other three friends of the perils of doing so, and was roundly rebuked by the three, wiser friends. The simpleton again warned them and upon being ignored, climbed a tree for safety. The third friend used his knowledge to breathe life into the lion. I don't need to tell you how this tale ended for the three wise men.

And I will end here.

Buying Info:
Hardcover: 352 pages
Publisher: Basic Books (May 5, 2015)
ISBN-10: 0465059996
ISBN-13: 978-0465059997

US: Rise of the Robots: Technology and the Threat of a Jobless Future
India: Rise of the Robots: Technology and the Threat of a Jobless Future

The first part was first published in PerformanceGurus on 8th August, 2015.
The second part was first published in PerformanceGurus on 13th August, 2015.
The concluding part was first published in PerformanceGurus on 15th August, 2015.

 
© 2015, Abhinav Agarwal (अभिनव अग्रवाल). All rights reserved.

Generate 11g password hash

Laurent Schneider - Thu, 2015-10-01 10:14

An easy way to generate the hash value from the shell is to use openssl.

Let’s take a random salt of ABCDEFGHIJ. The length of 10 is important.

The hexadecimal representation is -41-42-43-44-45-46-47-48-49-4A-


$ # "\c" suppresses the trailing newline in ksh-style echo; in bash, use: printf '%s' "SafePassw0rDABCDEFGHIJ"
$ echo "SafePassw0rDABCDEFGHIJ\c" | openssl dgst -sha1
(stdin)= 47cc4102144d6e479ef3d776ccd9e0d0158842bb

With this hash, I can construct my value


SQL> create user testuser identified by values 'S:47CC4102144D6E479EF3D776CCD9E0D0158842BB4142434445464748494A';

User created.

SQL> grant create session to testuser;

Grant succeeded.

SQL> conn testuser/SafePassw0rD
Connected.

If you prefer PL/SQL over shell, use DBMS_CRYPTO


SQL> exec dbms_output.put_line('S:'||dbms_crypto.hash(utl_raw.cast_to_raw('SafePassw0rDABCDEFGHIJ'),dbms_crypto.HASH_SH1)||utl_raw.cast_to_raw('ABCDEFGHIJ'))
S:47CC4102144D6E479EF3D776CCD9E0D0158842BB4142434445464748494A

PL/SQL procedure successfully completed.
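The same "S:" construction is also easy to reproduce outside the database - a minimal Python sketch (the helper name is mine; it uses only the standard library and the fixed ABCDEFGHIJ salt from above, whereas a real salt would be 10 random bytes):

```python
import hashlib

def oracle_11g_verifier(password: str, salt: bytes) -> str:
    # 11g "S:" verifier = SHA-1(password || salt), then the salt, both hex-encoded
    digest = hashlib.sha1(password.encode() + salt).hexdigest().upper()
    return "S:" + digest + salt.hex().upper()

print(oracle_11g_verifier("SafePassw0rD", b"ABCDEFGHIJ"))
# S:47CC4102144D6E479EF3D776CCD9E0D0158842BB4142434445464748494A
```

The output matches the value built with openssl and DBMS_CRYPTO above, so it can be pasted straight into CREATE USER ... IDENTIFIED BY VALUES.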

In 12c there is also a “T:” string. According to the documentation:
The cryptographic hash function used for generating the 12C verifier is based on a de-optimized algorithm involving PBKDF2 and SHA-512.

IBM Bluemix - Triggering backing service creation from "Deploy to Bluemix" button

Pas Apicella - Thu, 2015-10-01 08:01
I recently posted about the "Deploy to Bluemix" button, which will automatically deploy an application into IBM Bluemix with a single click.

http://theblasfrompas.blogspot.com.au/2015/09/adding-deploy-to-bluemix-button-to-my.html

If the application requires backing services, they can be created automatically using the "declared-services" tag as shown below. Declared services are a manifest extension which creates or looks for the required or optional services that are expected to be set up before the app is deployed, such as a data cache service. You can find a list of the eligible Bluemix services, labels, and plans by using the CF Command Line Interface and running cf marketplace.

manifest.yml

declared-services:
  redis-session:
    label: rediscloud
    plan: 30mb
applications:
 - name: pas-sbsessions
   memory: 512M
   instances: 2
   path: ./target/SpringBootHTTPSession-0.0.1-SNAPSHOT.jar
   host: pas-sbsessions
   domain: mybluemix.net
   buildpack: java_buildpack
   services:
    - redis-session


Note: Declared services is an IBM extension of the standard Cloud Foundry manifest format. This extension might be revised in a future release as the feature evolves and improves.
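As a quick check before deploying, the marketplace lookup mentioned above can confirm that the label (rediscloud) and plan (30mb) declared in the manifest exist in your region - a sketch assuming the CF CLI is installed and you are logged in to a Bluemix org and space:

```shell
# List every service in the current marketplace, with its plans and description
cf marketplace

# Show just the plans for the service label used in the manifest above
cf marketplace -s rediscloud
```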
Categories: Fusion Middleware
