Feed aggregator

Fundamentals of SQL Writeback in Dodeca

Tim Tow - Mon, 2015-10-05 22:00
One of the features of Dodeca is read-write functionality to SQL databases.  We often get questions as to how to write data back to a relational database, so I thought I would post a quick blog entry for our customers to reference.

This example will use a simple table structure in SQL Server though the concepts are the same when using Oracle, DB2, and most other relational databases.  The example will use a simple Dodeca connection to a JDBC database.  Here is the Dodeca SQL Connection object used for the connection.

The table I will use for this example was created with the following CREATE TABLE  statement.

CREATE TABLE [dbo].[Test](
[TestID] [int] IDENTITY(1,1) NOT NULL,
[TestCode] [nvarchar](50) NULL,
[TestName] [nvarchar](50) NULL,
  CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED 
  ([TestID] ASC)
)

First, I used the Dodeca SQL Excel View Wizard to create a simple view in Dodeca to retrieve the data into a spreadsheet.  The view, before setting up writeback capabilities, looks like this.

To make this view writeable, follow these steps.
  1. Add the appropriate SQL insert, update, and/or delete statements to the Dodeca SQL Passthrough DataSet object.  The values to be replaced in the SQL statement must be specified using the notation @ColumnName, where ColumnName is the column name, or column alias, of the column containing the data (see the sketch after this list).
  2. Add the column names of the primary key for the table to the PrimaryKey property of the SQL Passthrough DataSet object.
  3. Depending on the database used, define the column names and their respective JDBC datatypes in the Columns property of the SQL Passthrough DataSet.  This mapping is optional for SQL Server because Dodeca can obtain the required information from the Microsoft JDBC driver; however, the Oracle and DB2 JDBC drivers do not provide this information, so it must be entered by the developer.
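For illustration, here is a sketch of what those statements might look like for the Test table created above, assuming the view columns are named TestCode, TestName, and TestID (your column names or aliases may differ):

INSERT INTO Test (TestCode, TestName) VALUES (@TestCode, @TestName)
UPDATE Test SET TestCode = @TestCode, TestName = @TestName WHERE TestID = @TestID
DELETE FROM Test WHERE TestID = @TestID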
For insert, update, and delete operations, Dodeca parses the SQL statement to read the parameters that use the @ indicator and creates a JDBC prepared statement to execute the statements.  The prepared statement format is very efficient as it compiles the SQL statement once and then executes it multiple times.  Each inserted row is also passed to the server during the transaction.  The values from each row are then used in conjunction with the prepared statement to perform the operation.

Here is the completed Query definition.


Next, modify the DataSetRanges property of the Dodeca View object and, to enable insert operations, set the AllowAddRow property to True.  Note that if you added update and/or delete SQL to your SQL Passthrough Dataset object, be sure to enable those operations on the worksheet via the AllowDeleteRow and AllowModifyRow properties.

Once this step is complete, you can run the Dodeca View, add a row, and press the Save button to save the record to the relational database.



The insert, update, and delete functionality using plain SQL statements is limited to operations on a single table.  If you need to update multiple tables, you must use stored procedures to accomplish that functionality.  You can call a stored procedure in Dodeca using syntax similar to the following example:

{call sp_InsertTest(@TestCode, @TestName)}

Dodeca customers can contact support for further information at support@appliedolap.com.
Categories: BI & Warehousing

IBM Bluemix - Specify only Liberty buildpack features you require

Pas Apicella - Mon, 2015-10-05 21:22
I am more often than not using Spring Boot applications on IBM Bluemix, and most of what I need is packaged with the application, from JPA or JDBC drivers to REST support. Of course, with IBM Bluemix we can specify which buildpack we wish to use, but by default Liberty is used for Java applications.

When a stand-alone application is deployed, a default Liberty configuration is provided for the application. The default configuration enables the following Liberty features:
  • beanValidation-1.1
  • cdi-1.2
  • ejbLite-3.2
  • el-3.0
  • jaxrs-2.0
  • jdbc-4.1
  • jndi-1.0
  • jpa-2.1
  • jsf-2.2
  • jsonp-1.0
  • jsp-2.3
  • managedBeans-1.0
  • servlet-3.1
  • websocket-1.1
  • icap:managementConnector-1.0
  • appstate-1.0
Here is how I strip out what isn't required in my Liberty runtime container, down to the bare minimum of what I need.

manifest.yml

applications:
 - name: pas-speedtest
   memory: 512M
   instances: 1
   path: ./demo-0.0.1-SNAPSHOT.jar
   host: pas-speedtest
   domain: mybluemix.net
   env:
     JBP_CONFIG_LIBERTY: "app_archive: {features: [jsp-2.3, websocket-1.1, servlet-3.1]}"


 More Information

https://www.ng.bluemix.net/docs/starters/liberty/index.html#optionsforpushinglibertyapplications


Categories: Fusion Middleware

Uploading 26M StackOverflow Questions into Oracle 12c

Marcelo Ochoa - Mon, 2015-10-05 17:42
Just for fun or testing in-memory capabilities of Oracle 12c

Following the post Import 10M Stack Overflow Questions into Neo4j In Just 3 Minutes, I modified the Python script to include the foreign key columns that were not part of the graph database design but are required in a relational model.
The Python files to_csv.py and utils.py can be downloaded from my drive; basically, the change adds these two lines:
                el.get('parentid'),
                el.get('owneruserid'),
when generating the output file csvs/posts.csv. The idea is to convert the StackOverflow export files:
-rw-r--r-- 1 root root   37286997 ago 18 12:50 stackoverflow.com-PostLinks.7z
-rw-r--r-- 1 root root 7816218683 ago 18 13:52 stackoverflow.com-Posts.7z
-rw-r--r-- 1 root root     586861 ago 18 13:52 stackoverflow.com-Tags.7z
-rw-r--r-- 1 root root  160468734 ago 18 13:54 stackoverflow.com-Users.7z
-rw-r--r-- 1 root root  524354790 ago 18 13:58 stackoverflow.com-Votes.7z
-rw-r--r-- 1 root root 2379415989 sep  2 14:28 stackoverflow.com-Comments.7z
-rw-r--r-- 1 root root  112105812 sep  2 14:29 stackoverflow.com-Badges.7z
into a set of CSV files for quick import into the Oracle 12c RDBMS using external tables. Here is the list of converted files and their sizes:
3,8G         posts.csv
287M posts_rel.csv
524K tags.csv
517M tags_posts_rel.csv
355M users.csv
427M users_posts_rel.csv
With the above files and Oracle 12c running in a Docker container, as described in my previous post On docker, Ubuntu and Oracle RDBMS, I executed these steps:
- Logged in as SYS:
alter system set sga_max_size=4G scope=spfile;
alter system set sga_target=4G scope=spfile;
alter system set inmemory_size=2G scope=spfile;
create user sh identified by sh
   default tablespace ts_data
   temporary tablespace temp
   quota unlimited on ts_data;
grant connect,resource,luceneuser to sh;
create directory data_dir1 as '/mnt';
create directory tmp_dir as '/tmp';
grant all on directory data_dir1 to sh;
grant all on directory tmp_dir to sh;
These statements basically create a new user and the directories to be used by the external tables. Note that the CSV files are available inside the Docker container under the /mnt directory; I run my Docker image with:
docker run --privileged=true --ipc=host --volume=/var/lib/docker/dockerfiles/stackoverflow.com/csvs:/mnt --volume=/mnt/backup/db/ols:/u01/app/oracle/data --name ols --hostname ols --detach=true --publish=1521:1521 --publish=9099:9099 oracle-12102
Then logged as SH user:
- Importing users
create table users_external
( user_id            NUMBER(10),
  display_name VARCHAR2(4000),
  reputation       NUMBER(10),
  aboutme         VARCHAR2(4000),
  website_url    VARCHAR2(4000),
  location          VARCHAR2(4000),
  profileimage_url VARCHAR2(4000),
  views             NUMBER(10),
  upvotes          NUMBER(10),
  downvotes     NUMBER(10)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'users.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_users
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from users_external);
-- Elapsed: 00:00:22.76
ALTER TABLE so_users ADD PRIMARY KEY (user_id);
-- Elapsed: 00:00:13.08
create index so_users_display_name_idx on so_users(display_name);
-- Elapsed: 00:00:08.01
- Importing Posts
create table posts_external
( post_id      NUMBER(10),
  parent_id   NUMBER(10),
  user_id      NUMBER(10),
  title            VARCHAR2(4000),
  body          CLOB,
  score         NUMBER(10),
  views        NUMBER(10),
  comments NUMBER(10)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'posts.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_posts
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from posts_external);
-- Elapsed: 00:14:20.89
ALTER TABLE so_posts ADD PRIMARY KEY (post_id);
-- Elapsed: 00:02:35.86
-- purge posts associated to no imported users
delete from so_posts where user_id not in (select user_id from so_users);
-- Elapsed: 00:02:41.64
create index so_posts_user_id_idx on so_posts(user_id);
-- Elapsed: 00:01:34.87
ALTER TABLE so_posts ADD CONSTRAINT fk_so_user FOREIGN KEY (user_id) REFERENCES so_users(user_id);
-- Elapsed: 00:00:09.28
Note that 26 million posts were imported in 14 minutes, not so bad considering that the CSV source was on an external USB 2.0 drive and the Oracle 12c tablespaces were placed on a USB 3.0 drive. Here is a screenshot showing the IO bandwidth consumed on both drives.

Only 4.8 Mb/s for reading from sdb (the CSVs) and 9.7 Mb/s for writing to sdc1 (ts_data).
- Importing tags
create table tags_external
( tag_id      VARCHAR2(4000)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'tags.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_tags
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from tags_external);
-- Elapsed: 00:00:00.55
create table tags_posts_external
( post_id      NUMBER(10),
  tag_id      VARCHAR2(4000)
)
organization external
( type  oracle_loader
  default directory data_dir1
  access parameters
  ( records delimited BY newline
    badfile tmp_dir: 'sh%a_%p.bad'
    logfile tmp_dir: 'sh%a_%p.log'
    fields
            terminated BY ','
            optionally enclosed BY '"'
            lrtrim
            missing field VALUES are NULL
  )
  location (data_dir1:'tags_posts_rel.csv')
 )
 parallel
 reject limit unlimited;
CREATE TABLE so_tags_posts
   TABLESPACE ts_data
   STORAGE (INITIAL 8M NEXT 8M)
   PARALLEL
   NOLOGGING
   COMPRESS FOR ALL OPERATIONS
      as (select * from tags_posts_external);
-- Elapsed: 00:00:43.75
-- purge tags associated to no imported posts
delete from so_tags_posts where post_id not in (select post_id from so_posts);
-- Elapsed: 00:02:42.00
create index so_tags_posts_post_id_idx on so_tags_posts(post_id);
-- Elapsed: 00:00:43.29
ALTER TABLE so_tags_posts ADD CONSTRAINT fk_so_posts FOREIGN KEY (post_id) REFERENCES so_posts(post_id);
-- Elapsed: 00:01:16.65
Note that, as with the posts<->users one-to-many relation, tags<->posts is also one-to-many, and some posts referenced by a few tags were not imported due to character-encoding errors.
As a summary of the above steps, 26 million posts from 4.5 million registered users were imported; 41K distinct tags are used, with an average of 1.11 tags per post (29M tag/post rows).
Next blog post will be about using Oracle 12c in-memory features to query this corpus data.
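Since inmemory_size was already configured above, here is a minimal sketch of the kind of statements that will be involved, assuming the default INMEMORY options (a full scan triggers population, and v$im_segments shows its status):

ALTER TABLE so_posts INMEMORY;
-- a full scan triggers population; then check the status
SELECT COUNT(*) FROM so_posts;
SELECT segment_name, populate_status, bytes_not_populated FROM v$im_segments;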

OTN at Oracle OpenWorld Group - Join today!

OTN TechBlog - Mon, 2015-10-05 12:27

Join the OTN at Oracle OpenWorld group on the OTN Community Platform!  This group is designed to keep you in the know about all the GREAT activities and events that Team OTN is planning/organizing for Oracle OpenWorld in San Francisco this October (24th to 28th).

Some of the events/activities to look forward to -

Community Events - RAC Attack and Blogger Meetup.

Networking Opportunities - Sunday Kick off Party, Cloud Hour

NEW activities! Graffiti Wall and giant games plus Make Your Own T-Shirt is back with NEW art!


We hope to see you there!

TEAM OTN


What I Wanted to Tell Terry Bradshaw

Cary Millsap - Thu, 2015-10-01 17:23
I met Terry Bradshaw one time. It was about ten years ago, in front of a movie theater near where I live.

When I was little, Terry Bradshaw was my enemy because, unforgivably to a young boy, he and his Pittsburgh Steelers kept beating my beloved Dallas Cowboys in Super Bowls. As I grew up, though, his personality on TV talk shows won me over, and I enjoy watching him to this day on Fox NFL Sunday. After learning a little bit about his life, I’ve grown to really admire and respect him.

I had heard that he owned a ranch not too far from where I live, and so I had it in mind that inevitably I would meet him someday, and I would say thank you. One day I had that chance.

I completely blew it.

My wife and I saw him there at the theater one day, standing by himself not far from us. It seemed like if I were to walk over and say hi, maybe it wouldn’t bother him. So I walked over, a little bit nervous. I shook his hand, and I said, “Mr. Bradshaw, hi, my name is Cary.” I would then say this:

I was a big Roger Staubach fan growing up. I watched Cowboys vs. Steelers like I was watching Good vs. Evil.

But as I’ve grown up, I have gained the deepest admiration and respect for you. You were a tremendous competitor, and you’re one of my favorite people to see on TV. Every time I see you, you bring a smile to my face. You’ve brought joy to a lot of people.

I just wanted to say thank you.
Yep, that’s what I would say to Terry Bradshaw if I got the chance. But that’s not how it would turn out. How it actually went was like this, …my big chance:

Me: I was a big Roger Staubach fan growing up.
TB: Hey, so was I!
Me: (stunned)
TB: (turns away)
The End
I was heartbroken. It bothers me still today. If you know Terry Bradshaw or someone who does, I wish you would please let him know. It would mean a lot to me.

…I did learn something that day about the elevator pitch.

Oracle Priority Support Infogram for 01-OCT-2015

Oracle Infogram - Thu, 2015-10-01 14:42

RDBMS


PL/SQL

A Surprising Program, from Oracle Database PL/SQL and EBR.

Data Warehouse

DOP Downgrades, or Avoid The Ceiling, from The Data Warehouse Insider blog.

WebLogic


Java


Creating Games with JavaFX 8: Case Study, from The Java Tutorials Blog.

OAG

The 10 most recently created notes for OAG as of 24 Sept. 2015, from Proactive Support - Java Development using Oracle Tools.

Ops Center

Changing an Asset's Name, from the Oracle Ops Center blog.

Data Integration


SOA

Top tweets SOA Partner Community – September 2015, from the SOA & BPM Partner Community Blog.

Real User Monitoring

How to Configure User ID Identification, from Real User Monitoring.

Solaris

Solaris: Identifying EFI disks, from Giri Mandalika's Repository.

EBS

From the Oracle E-Business Suite Support blog:




Finally Eliminate Those Duplicate WIP Transactions!


Rise of the Robots - Review

Abhinav Agarwal - Thu, 2015-10-01 11:08
Rise of the Robots: Technology and the Threat of a Jobless Future
Martin Ford

Part 1 of 3

"I'm smart; you're dumb. I'm big; you're small. I'm right; you're wrong. And there's nothing you can do about it."

Thus spake Harry Wormwood in the movie "Matilda". This well could be the message that robots will have for us in the not-too-distant future. The dramatic improvements in the speed, the accuracy, and the areas in which computers have begun to comprehensively outperform humans lead one to believe that while a so-called singularity may well be some ways off, the more immediate effects of this automation are already being felt in permanent job losses. In a country like India, which has used digital technologies quite effectively in the last decade and a half to grow a $150 billion IT-BPM industry, the impact could be devastating - especially in an industry where an estimated 10 million people are employed.
In many spheres - chess for example - they could utter these lines to us humans today and there's nothing we can do about it - for the computer is right. The puniest of computers in the tiniest of smartphones possesses enough computing horsepower and smart-enough algorithms (written by us humans - oh yes, the irony!) to defeat the best of us humans in chess, every single time, without breaking a sweat. Computers have been able to add, subtract, divide, square, multiply faster and more accurately than us for decades now, and there's nothing we can do about that either.

From the time of the Luddites - who rose up against the machines of the Industrial Revolution in the early years of the nineteenth century - to the present-day "Judgment Day" Sarah Connor avatars, inspired as much by an acute awareness of the march of technology as by James Cameron's "Terminator" movies, the refrain of the chorus has been more or less unchanging: the machines are coming for our jobs, our livelihoods, and will finally come for us (the Matrix was premised on a variant of one such dystopian future). The computing power of computers exploded in the second half of the twentieth century, obeying the inexorable pull of Moore's Law, and made feasible by advances in semiconductors, fabrication techniques, and electrical engineering. As did fears that similar software advances could somehow endow machines with intelligence - Artificial Intelligence. These fears however did not quite come to pass. For several decades, false hopes and starts were kindled and then extinguished. Till this decade. The congruence of seemingly infinite computing power - thanks to massive server farms running in the "cloud" (a mangled metaphor if ever there was one), cheap and lightning-fast bandwidth available on tap, storage and memory that keep getting impossibly cheaper every year, and sophisticated software algorithms - has made it clear that "machine intelligence" is no longer an oxymoron. We are well and truly living in the middle of the machine age. The "singularity" may well be witnessed in our lifetimes, within a decade or two even.

Martin Ford's book, "The Rise of the Robots: Technology and the Threat of a Jobless Future", lays out the case for a not-so-distant future where machines make possible the automation of almost every task imaginable, but at a great social and economic cost. The book is neatly organized and lucidly argued, and except for a lengthy and somewhat incongruous chapter on the medical system, it stays on point. Ford makes it clear that neither is this technological progress reversible, nor wholly desirable. Its consequences therefore cannot be wished away - income inequality being one example, which economists for three decades have been explaining away as a temporary anomaly. The last section, which is more contemplative and prescriptive, as opposed to the earlier sections, which are descriptive, discusses possible solutions, some of which will shock free-market proponents. Whether there are more practical, workable answers is quite another thing though.

Part 2 of 3

Machines have been able to do mechanical jobs faster than humans, with greater precision, and for longer periods of time - the cotton gin invented in the eighteenth century, for example. The inevitable loss of jobs called for a re-skilling of the people affected, and the mantra went that you had to pull yourself up by your socks, learn a new skill, and get productive again. Martin Ford's book shatters that illusion. There is not a single profession left - whether unskilled or skilled, whether in technology or medicine or liberal arts, whether one that can be performed remotely or requires direct human interaction - that is not under threat from the machines. Whichever way you slice and dice it, you are left facing one or the other variation of a dystopian future, with stark income inequalities, a substantial population that will require doles on a permanent basis, and the concomitant social upheavals.

Some years back, when offshoring was in the news and concerns about its impact on US jobs was at its peak, with hundreds of thousands of jobs moved offshore to countries like India, there were stories coming out regularly, like the one about Southern California workers being made to train H1-B visa holders, many of whom took over their jobs. Pfizer made "hundreds of tech workers at its Connecticut R&D facilities" train their replacements - guest workers from India. If the economics of labor cost arbitrage precipitated the migration of skilled technology jobs away from the United States and to countries like India (being "Bangalored" entered the urban lexicon only a decade ago), technology could plausibly bring those jobs back to the United States - call it "reshoring". The quantum of jobs reshored, however, is going to be a massive disappointment. Consider this: "In 2011, the Washington Post’s Michael Rosenwald reported that a colossal, billion-dollar data center built by Apple, Inc., in the town of Maiden, North Carolina, had created only fifty full-time positions." But it is precisely this elimination of the need for many people that makes the economics of reshoring work out. Ironical.

While the United States can at least look forward to the reshoring of some jobs lost to manufacturing in China or services in India, the loss of such jobs is certain, on the other hand, to cause greater upheaval in these offshore countries. India's socio-economic progress is predicated in great deal on a re-skilling of its labour force to take advantage of an emerging "Digital India" both in the manufacturing and services sector, but which is in mortal danger of being blindsided by the rise of the machines. The use of IT-based services as a catalyst for driving economic growth in smaller - Tier B and Tier C - cities in India is a recurrent theme for planners. But this could be short-circuited by the rise of the robots, who, once trained - by humans - can perform the jobs of humans, better, and faster. Indians were trained by their American counterparts to do their jobs. Unbeknownst to many, these people are actors in the same offshoring saga that played out a decade ago, but with the proverbial shoe on the other foot now. "The bottom line is that if you find yourself working with, or under the direction of, a smart software system, it’s probably a pretty good bet that—whether you’re aware of it or not—you are also training the software to ultimately replace you."

India has been a spectacular laggard when it has come to industrializing its economy - it is probably unique among all developing nations to be progressing (or at least with ambitions of progressing) from a primarily agrarian economy to a services-based economy, skipping substantially the intermediate phase of industrialization that every single industrialized nation went through last century. It was industrialization that provided the bedrock for the middle-class in nations, which then aspired towards a better quality of life, with the ability to pay for it - thus driving the move towards a services-based economy. For India, it could be argued by some that this skipping may prove to be a blessing, since an industrialized economy is more susceptible to efficiencies wrought by advancements in technology. Consider these examples from Ford's book:

1. "in the United States, chickens are grown to standardized sizes so as to make them compatible with automated slaughtering and processing."

2. Momentum Machines, a San Francisco based startup has developed a machine that "shapes burgers from freshly ground meat and then grills them to order - including even the ability to add just the right amount of char while retaining all the juices. The machine, which is capable of producing about 360 hamburgers per hour, also toasts the bun and then slices and adds fresh ingredients like tomatoes, onions, and pickles only after the order is placed." The company's co-founder is clear that these machines are not "meant to make employees more efficient... It's meant to completely obviate them."

3. "Vision Robotics, a company based in San Diego, California, is developing an octopus-like orange harvesting machine. The robot will use three-dimensional machine vision to make a computer model of an entire orange tree and then store the location of each fruit. That information will then be passed on to the machine’s eight robotic arms, which will rapidly harvest the oranges."

4. "Researchers at Facebook have likewise developed an experimental system—consisting of nine levels of artificial neurons—that can correctly determine whether two photographs are of the same person 97.25 percent of the time, even if lighting conditions and orientation of the faces vary. That compares with 97.53 percent accuracy for human observers."

5. "A Facebook executive noted in November 2013 that the Cyborg system routinely solves thousands of problems that would otherwise have to be addressed manually, and that the technology allows a single technician to manage as many as 20,000 computers."

6. If reading certain news articles makes you wonder whether a robot wrote it, things are going to get better - or worse. Computer algorithms are at work to churn out articles that will be indistinguishable from those written by humans. Liberal arts became even more unviable - if ever that was possible.
"In 2010, the Northwestern University researchers who oversaw the team of computer science and journalism students who worked on StatsMonkey raised venture capital and founded a new company, Narrative Science, Inc., to commercialize the technology. The company hired a team of top computer scientists and engineers; then it tossed out the original StatsMonkey computer code and built a far more powerful and comprehensive artificial intelligence engine that it named “Quill.”
... One of Narrative Science’s earliest backers was In-Q-Tel, the venture capital arm of the Central Intelligence Agency"

"To keep instructional costs down, colleges are relying ever more heavily on part-time, or adjunct, faculty who are paid on a per-course basis—in some cases as little as $2,500 for a semester-long class—and receive no employee benefits. Especially in the liberal arts, these adjunct positions have become dead-end jobs for huge numbers of PhD graduates who once hoped for tenure-track academic careers."

7. "Radiologists, for example, are trained to interpret the images that result from various medical scans. Image processing and recognition technology is advancing rapidly and may soon be able to usurp the radiologist’s traditional role."

8. "In July 2012, the London Symphony Orchestra performed a composition entitled Transits—Into an Abyss. One reviewer called it “artistic and delightful.” The event marked the first time that an elite orchestra had played music composed entirely by a machine. The composition was created by Iamus, a cluster of computers running a musically inclined artificial intelligence algorithm."

9. "Perhaps the most remarkable elder-care innovation developed in Japan so far is the Hybrid Assistive Limb (HAL)—a powered exoskeleton suit straight out of science fiction. Developed by Professor Yoshiyuki Sankai of the University of Tsukuba, the HAL suit is the result of twenty years of research and development. Sensors in the suit are able to detect and interpret signals from the brain. When the person wearing the battery-powered suit thinks about standing up or walking, powerful motors instantly spring into action, providing mechanical assistance. A version is also available for the upper body and could assist caretakers in lifting the elderly. Wheelchair-bound seniors have been able to stand up and walk with the help of HAL."

As one goes over these examples, it becomes obvious that automation is a sword that cuts both ways. Is India equipped - and more importantly, are the planners aware - to handle the flood of automation that could wash away entire swathes of jobs being dreamed up by ambitions of a digitally-enabled nation?

Part 3 of 3

As 2014 drew to a close, the Indian IT industry was rocked by rumours that TCS (the largest Indian IT company by annual revenues) had completed an internal review and had initiated layoffs of thousands of employees - mostly in middle management. Some stories talked about a number as high as 30,000. The saga finally ended with a round of clarifications and denials by TCS and some well-deserved opprobrium over its inept handling of the needless controversy. What the fracas however served to highlight was a stark truth that has been staring at the Indian IT industry for some time now - the skills that the typical Indian IT worker possesses are mostly undifferentiated and prime candidates for automation.
What is worse, from at least one perspective, is the fact that (smart) humans have built technology that has become adept at "engineering the labor out of the product." One would need to be particularly myopic to not also recognize that "the machines are coming for the higher-skill jobs as well." This much should have been clear in part two of this series, through the examples I cited from Martin Ford's book.
One recurring theme in Martin Ford's book, "Rise of the Robots", at least in the initial chapters, is the permanence and acceleration of offshoring to countries like India, which he believes, "has built a major, nationally strategic industry specifically geared toward the electronic capture of American and European jobs." (As an aside, most Indians would be somewhat puzzled by this assertion, given at times the outright hostility which politicians in India display towards the IT industry, like the time when a former prime minister indirectly accused the Bangalore IT industry of "immoral, unethical and illegal prosperity"!) Anyway, leaving that aside aside, in advancing his argument Ford posits that as "powerful AI-based tools make it easier for offshore workers to compete with their higher-paid counterparts in developed countries, advancing technology is also likely to upend many of our most basic assumptions about which types of jobs are potentially offshorable. Nearly everyone believes, for example, that occupations that require physical manipulation of the environment will always be safe."

Ford believes that the development of a digital infrastructure in India and the advancement of AI and related technologies will make things worse for US (and European) jobs. True to some extent though that may be, you have to consider the fact that increasing automation makes it cheaper and less labor-intensive to maintain, run, and patch-and-upgrade software applications. Furthermore, any offshoring of jobs adds its own overheads by way of administrative and managerial redundancies that cannot be done away with. Automation efficiencies reduce the need for labour, which is often the single biggest cost component of any software application over its entire life. Therefore, the very factors that Ford fears are threatening to make offshoring permanent and more widespread are also likely to make reshoring financially viable. It's a sword that cuts both ways.

To be fair, the digital economy in India has a lot of headroom to grow; especially as the Indian government's Smart City initiative brings e-governance and services to the common man through the Internet and technologies. This could well provide a second wind to the Indian IT industry for a decade or more.

However, it is a smart strategy to keep one eye on what the winds of such a digital nirvana may blow in. An indicator of the direction in which the Indian IT job market is likely to evolve can therefore be found by looking at the US, where the "propensity for the economy to wipe out solid middle-skill, middle-class jobs, and then to replace them with a combination of low-wage service jobs and high-skill, professional jobs that are generally unattainable for most of the workforce, has been dubbed "job market polarization.""
This phrase - "job market polarization" - should give us a fair indication of what is in store for the hundreds of thousands, even millions, of graduates in India, many of whom emerge today out of college with a stark degree of antipathy for doing the "9-5" grind that they saw their parents' generation go through. Digital "start-up" nirvana beckons for them. Each sees himself as a digital entrepreneur of the new economy. They are ready to chuck the "dependable income stream that anchors them into the middle-class" - they view it not as an "anchor" but more a millstone. However, the vast majority is likely to find itself stuck in what "techno visionary" Jaron Lanier calls the "informal economy that is found in third-world nations." It's a tiny minority that will "live at the extreme left of the long tail" of such a digital economy. For every Flipkart or SnapDeal (the final word on that fairy-tale saga is yet to be written), you will find tens of thousands of resellers at the other end of the payoff tail, paying these e-tailers money every month for the privilege of selling on their platforms, at prices that barely cover operating costs.

The Indian middle-class, for all its flaws, has represented for decades an aspirational lodestar for the vast majority of the country's poor. So what happens when the digital economy hollows out the middle of the job market - "job polarization" as described above? Again, we can look westwards for possible answers.
"In an analysis published in February 2014, MIT economist James Poterba found that a remarkable 50 percent of American households aged sixty-five to sixty-nine have retirement account balances of $5,000 or less. According to Poterba’s paper, even a household with $100,000 in retirement savings would receive a guaranteed income of only about $5,400 per year (or $450 per month) with no cost-of-living increases, if the entire balance were used to purchase a fixed annuity."
In other words, in the absence of both a retirement corpus and a government guaranteed pension, there is a real risk of an emergent middle-class sliding right back into the working poor or even the underclass - a recipe for social unrest.

An inevitable counter-argument to all this unease generated by the "rise of the robots" is the "humans are underrated" palliative. Championing this is Tom Davenport (of "Competing on Analytics" fame), who now talks of "amplified intelligence" (which unfortunately has more the stench of an SEO-optimized buzzword than anything substantial at this point) - where "smart" humans work to "augment" "smart" machines. Then there is also Geoff Colvin, who penned the insightful 2008 book, "Talent Is Overrated", and who has returned with "Humans Are Overrated". I have yet to read Colvin's latest book, so judgment day is reserved on the book, but to Davenport's argument, some of the evidence suggests an easy refutation - "In his 2007 book Super Crunchers, Yale University professor Ian Ayres cites study after study showing that algorithmic approaches routinely outperform human experts. When people, rather than computers, are given overall control of the process, the results almost invariably suffer." In many fields where algorithms rule the roost, to argue for human "augmentation" or "amplification" is no better than to argue for more cooks to brew the broth - we know that aphorism, don't we?

In conclusion, and in many ways, the saga documented in "Rise of the Robots" calls to mind the ancient Indian tale of the four friends:
In ancient India there lived four friends. Three of them were very learned, while the fourth was a simpleton, even considered a fool. The four decided to go to the capital and seek their fortune from the king. Along the way, while passing through a jungle, they came across the bones of a lion long dead. The first friend used his knowledge to assemble the bones into a skeleton. The second friend used his skills to fashion a skin over the skeleton, while the third prepared to bring the lion back to life. At this the fourth friend - the simpleton - warned his other three friends of the perils of doing so, and was roundly rebuked by the three, wiser friends. The simpleton again warned them and upon being ignored, climbed a tree for safety. The third friend used his knowledge to breathe life into the lion. I don't need to tell you how this tale ended for the three wise men.

And I will end here.

Buying Info:
Hardcover: 352 pages
Publisher: Basic Books (May 5, 2015)
ISBN-10: 0465059996
ISBN-13: 978-0465059997

US: Rise of the Robots: Technology and the Threat of a Jobless Future
India: Rise of the Robots: Technology and the Threat of a Jobless Future

The first part was first published in PerformanceGurus on 8th August, 2015.
The second part was first published in PerformanceGurus on 13th August, 2015.
The concluding part was first published in PerformanceGurus on 15th August, 2015.

 
© 2015, Abhinav Agarwal (अभिनव अग्रवाल). All rights reserved.

Generate 11g password hash

Laurent Schneider - Thu, 2015-10-01 10:14

An easy way to generate the hashed value string from the shell is to use openssl.

Let’s take a random salt of ABCDEFGHIJ. The length of 10 is important.

The hexadecimal representation is -41-42-43-44-45-46-47-48-49-4A-


$ echo "SafePassw0rDABCDEFGHIJ\c" | openssl dgst -sha1
(stdin)= 47cc4102144d6e479ef3d776ccd9e0d0158842bb

With this hash, I can construct my value


SQL> create user testuser identified by values 'S:47CC4102144D6E479EF3D776CCD9E0D0158842BB4142434445464748494A';

User created.

SQL> grant create session to testuser;

Grant succeeded.

SQL> conn testuser/SafePassw0rD
Connected.

If you prefer PL/SQL over shell, use DBMS_CRYPTO


SQL> exec dbms_output.put_line('S:'||dbms_crypto.hash(utl_raw.cast_to_raw('SafePassw0rDABCDEFGHIJ'),dbms_crypto.HASH_SH1)||utl_raw.cast_to_raw('ABCDEFGHIJ'))
S:47CC4102144D6E479EF3D776CCD9E0D0158842BB4142434445464748494A

PL/SQL procedure successfully completed.

In 12c there is also a "T" string. According to the doc:
The cryptographic hash function used for generating the 12C verifier is based on a de-optimized algorithm involving PBKDF2 and SHA-512.

IBM Bluemix - Triggering backing service creation from "Deploy to Bluemix" button

Pas Apicella - Thu, 2015-10-01 08:01
I recently posted about the "Deploy to Bluemix" button, which will automatically deploy an application into IBM Bluemix with a single click.

http://theblasfrompas.blogspot.com.au/2015/09/adding-deploy-to-bluemix-button-to-my.html

If the application requires backing services, they can automatically be created using the "declared-services" tag as shown below. Declared services are a manifest extension that creates or looks for the required or optional services that are expected to be set up before the app is deployed, such as a data cache service. You can find the list of eligible Bluemix services, labels, and plans by using the CF Command Line Interface and running cf marketplace.

manifest.yml

declared-services:
  redis-session:
    label: rediscloud
    plan: 30mb
applications:
 - name: pas-sbsessions
   memory: 512M
   instances: 2
   path: ./target/SpringBootHTTPSession-0.0.1-SNAPSHOT.jar
   host: pas-sbsessions
   domain: mybluemix.net
   buildpack: java_buildpack
   services:
    - redis-session


Note: Declared services is an IBM extension of the standard Cloud Foundry manifest format. This extension might be revised in a future release as the feature evolves and improves.
Categories: Fusion Middleware

JavaScript on the App Server: Scripting PeopleCode

Jim Marion - Wed, 2015-09-30 21:25

It has been nearly a decade since I started playing with JavaScript on the PeopleSoft application server. Back then I had to deploy a couple of JAR files to the app server. At that time, maintaining and deploying unmanaged files seemed more headache than benefit. Today Java provides full scripting support through the ScriptEngineManager and embedded Mozilla Rhino JavaScript script engine. Why would I want to script PeopleCode? Here are a few of my favorite reasons:

  • Low-level socket communication
  • Avoid reflection: JavaScript executes all methods regardless of variable type whereas PeopleCode only recognizes the returned type, not the real type
  • Process simple JSON structures that can't be modeled with the Documents module

Here is the PeopleCode required to invoke JavaScript

Local JavaObject &manager =  CreateJavaObject("javax.script.ScriptEngineManager");
Local JavaObject &engine = &manager.getEngineByName("JavaScript");

REM ** Evaluate a simple JavaScript;
&engine.eval("var result = Math.random();");

REM ** Access the value of the JavaScript variable named result;
Local string &result_text = &engine.get("result").toString();

Here is some JavaScript that converts the variable &json_string into a JSON Array and then iterates over each entry, inserting values into a table. Notice that I'm invoking the PeopleCode SQLExec function from JavaScript.

var result = (function() {
var SQLExec = Packages.PeopleSoft.PeopleCode.Func.SQLExec;
var json = JSON.parse(json_string);
var count = 0;
json.forEach(function(item, idx) {
SQLExec("INSERT INTO ... SYSTIMESTAMP", [idx, item]
);
count++;
});
return count + " rows inserted";
}());

Where did that &json_string variable come from? Here:

&engine.put("json_string", "[""item1"", ""item2"", ""item3""]");

OpenWorld 2015 Conference Schedule

Jim Marion - Wed, 2015-09-30 14:52

Just a couple more weeks and we will be enjoying the great weather and hospitality of San Francisco. I am anxiously anticipating another great OpenWorld conference. As always, I look forward to meeting with you between sessions or in the demo grounds. I will be presenting "PeopleSoft Developer: Tips and Techniques [CON8596]" on Monday, Oct 26 at 12:15 p.m. in Moscone West—3007.

I find the OpenWorld/JavaOne content catalog a little intimidating. If you are presenting a PeopleTools topic, please post your session details in the comments below to help the rest of us find PeopleTools-related sessions.

DOAG Conference Presentation Summary Finished

Dietmar Aust - Wed, 2015-09-30 14:48
Hi guys,

today was the deadline for uploading the four-page abstract / summary of my presentation at the German Oracle conference in Nürnberg:



If you still have to upload yours, hurry up ... only a few hours left ;).

You can have a look here (it is German though ;). 

See you in Nürnberg in November.

Cheers,
~Dietmar.


DAM tools, IBM Guardium, Oracle E-Business Suite, PeopleSoft and SAP

A question we have answered a few times in the last few months is whether, and if so how easily, Database Activity Monitoring (DAM) tools such as IBM Guardium can support ERP platforms such as the Oracle E-Business Suite, PeopleSoft, and SAP. The answer is yes; DAM tools can support ERP systems. For example, IBM Guardium has out-of-the-box policies for both the E-Business Suite and SAP – see figures one and two below.

There are many advantages to deploying a DAM solution to protect your ERP platform, the first being additional defense-in-depth for one of your most critical assets. You can read more here (Integrigy Guide to Auditing and Logging in Oracle E-Business Suite) about Integrigy's recommendations for database security programs. DAM solutions allow for complex reporting as well as 24x7 monitoring and easy relaying of alerts to your SIEM (e.g. Splunk or ArcSight).

Deploying DAM solutions to protect your SAP, PeopleSoft, or E-Business Suite is not a plug-and-play exercise. IBM Guardium's out-of-the-box policies for the E-Business Suite require configuration to be of any value – see figure three below. The out-of-the-box DAM policies are a good starting point, and Integrigy rarely sees them implemented as is. Integrigy also highly recommends, if at all possible, completing a sensitive data discovery project prior to designing your initial DAM policies. Such projects greatly help to define requirements as well as offer opportunities for data cleanup.

Overall, to design and implement an initial set of Guardium policies for the E-Business Suite (or any other ERP package) is usually a few weeks of effort depending on your size and complexity.

If you have any questions, please contact us at info@integrigy.com.

Figure 1- Seeded Guardium Policies for EBS and SAP

Figure 2- Guardium E-Business Suite PCI Policy

Figure 3- Example of Blank Configuration

Auditing, Oracle E-Business Suite, IBM Guardium
Categories: APPS Blogs, Security Blogs

PeopleTools Mobile Book Now Shipping

Jim Marion - Wed, 2015-09-30 10:02

I received notice yesterday that our latest book, PeopleSoft PeopleTools: Mobile Applications Development (Oracle Press) 1st Edition, is now shipping. Probably the most exciting news is that Amazon has the book listed as the #1 New Release in the Oracle Databases category.

delete all data

Laurent Schneider - Tue, 2015-09-29 09:53

How do you delete all data? The simplistic approach would be to truncate all tables


SQL> select table_name from user_tables;
TABLE_NAME
----------
T1
SQL> truncate table t1;
Table truncated.

You cannot truncate if you have referential integrity constraints.


SQL> truncate table t2;
ORA-02266: unique/primary keys in table 
  referenced by enabled foreign keys

Ok, let’s disable the RIC


SQL> select table_name, constraint_name
  from user_constraints
  where constraint_type='R';
TAB CONSTRAINT
--- ----------
T3  SYS_C00107
SQL> alter table t3 disable constraint SYS_C00107;
Table altered.
SQL> truncate table t2;
Table truncated.
SQL> truncate table t3;
Table truncated.

You cannot truncate cluster tables


SQL> truncate table t4;
ORA-03292: Table to be truncated is part of a cluster

Cluster tables can be truncated with TRUNCATE CLUSTER.


SQL> select cluster_name from user_clusters;
CLUSTER_NAME                  
------------
C                             
SQL> truncate cluster c;
Cluster truncated.

The code above doesn't work with partitioned clusters (12.1.0.2) because they were not properly implemented at the time of writing.
Check Bug 20284579 : CAN NOT QUERY DYNAMIC CLUSTER PARTITIONS

For reference partitioning, it is not possible to disable the foreign key


SQL> alter table t6 disable constraint fk;
ORA-14650: operation not supported for 
  reference-partitioned tables

In 12c, if the foreign key is defined with ON DELETE CASCADE, you can truncate cascade the parent.


SQL> select table_name, REF_PTN_CONSTRAINT_NAME
  from user_part_tables 
  where partitioning_type='REFERENCE';
TAB REF
--- ---
T6  FK 
SQL> select r_constraint_name, delete_rule 
  from user_constraints 
  where constraint_name='FK';
R_CON DELETE_RULE
----- -----------
PK    CASCADE    
SQL> select table_name
  from user_constraints 
  where constraint_name='PK';
TAB
---
T5 
SQL> truncate table t5 cascade;
Table truncated.

But if one of the child or grandchild tables uses reference partitioning without ON DELETE CASCADE, then the parent or grandparent cannot be truncated. And TRUNCATE CASCADE for reference partitioning is not documented (yet).

But there is a very nice alternative to TRUNCATE called DELETE 🙂


SQL> select table_name, REF_PTN_CONSTRAINT_NAME
  from user_part_tables 
  where partitioning_type='REFERENCE';
TAB REF
--- ---
T8  FK 
SQL> select r_constraint_name, delete_rule 
  from user_constraints 
  where constraint_name='FK';
R_CON DELETE_RULE
----- -----------
PK    NO ACTION  
SQL> select table_name
from user_constraints 
where constraint_name='PK';
TAB
---
T7 
SQL> truncate table t7 cascade;
ORA-14705: unique or primary keys referenced by enabled foreign keys in table "SCOTT"."T8"
SQL> truncate table t8;
Table truncated.
SQL> delete from t7;
2 rows deleted

To get the tables in the right order, parent tables after children, you can do a hierarchical query and then order by rownum desc, a construct I'm using for the first time, I confess. Note that the leaf tables are truncatable.


select c_owner owner, child table_name   
FROM 
  (
    SELECT 
      p_OWNER, parent, nvl(c_owner, a.owner) c_owner,
      nvl(child, a.table_name ) child
    FROM 
    (
      SELECT 
        PT.OWNER P_owner, pt.table_name parent, 
        pt2.owner c_owner, pt2.table_name child
      FROM all_part_tables pt
      JOIN all_constraints c
      ON pt.OWNER = c.owner
        AND PT.TABLE_NAME = c.table_name
        AND c.constraint_type = 'P'
        AND c.status = 'ENABLED'
      JOIN all_constraints r
      ON r.r_owner = c.owner
        AND r.r_constraint_name = c.constraint_name
        AND r.constraint_type = 'R'
        AND r.status = 'ENABLED'
      JOIN all_part_tables pt2
      ON r.owner = pt2.owner
        AND r.constraint_name = pt2.REF_PTN_CONSTRAINT_NAME
        AND pt2.partitioning_type = 'REFERENCE'
    ) t
    RIGHT JOIN all_tables a 
    ON child = table_name and a.owner = T.c_OWNER
  )
where connect_by_isleaf=0  
CONNECT BY parent = PRIOR child and p_owner=PRIOR c_owner
start with parent is null 
order by rownum desc;

OWNER TAB
----- ---
SCOTT T10 
SCOTT T9

Note that the query above is very slow. If dictionary performance is an issue, maybe we could instead delete from all tables, catch the exceptions, and loop until all tables are empty.


SQL> delete from t9;
ORA-02292: integrity constraint (SCOTT.F10) violated - child record found
SQL> delete from t10;
ORA-02292: integrity constraint (SCOTT.F11) violated - child record found
SQL> delete from t11;
1 row deleted.
SQL> delete from t9;
ORA-02292: integrity constraint (SCOTT.F10) violated - child record found
SQL> delete from t10;
1 row deleted.
SQL> delete from t11;
0 row deleted.
SQL> delete from t9;
1 row deleted.
SQL> delete from t10;
0 row deleted.
SQL> delete from t11;
0 row deleted.
SQL> delete from t9;
0 row deleted.
SQL> delete from t10;
0 row deleted.
SQL> delete from t11;
0 row deleted.

If you have close to zero reference-partitioned tables, this approach will be more efficient.
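As a rough sketch, that delete-and-loop approach could look like this in PL/SQL, assuming all tables belong to the current schema; each pass deletes what it can and swallows constraint violations, and the loop stops once a full pass deletes nothing:

declare
  work_done boolean;
begin
  loop
    work_done := false;
    for t in (select table_name from user_tables) loop
      begin
        execute immediate 'delete from "' || t.table_name || '"';
        if sql%rowcount > 0 then
          work_done := true;
        end if;
      exception
        when others then
          null; -- constraint still violated, retry on the next pass
      end;
    end loop;
    exit when not work_done;
  end loop;
end;
/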

Column Group Headings in an Interactive Report (APEX 5.0)

Dimitri Gielis - Tue, 2015-09-29 06:12
In 2008 I blogged about how to add Group Headings into an Interactive Report in Oracle Application Express (APEX 3.x!).

Here's an example what I mean with Group Headings, you want to logically group certain columns (e.g. Empno, Ename) and have a group name above them (e.g. Group 1):


The approach I used at that time worked, but it required a lot of manual work. In 2009 Martin D'Souza improved it and wrote a blog post about that. In 2010, when plugins were introduced in APEX 4, Martin wrapped it all in a nice APEX plugin which made adding Column Group Headings almost declarative. The plugin was further improved and put on GitHub.

Today, with APEX 5 and the Interactive Report internals having changed a lot behind the scenes, the plugin was broken. As we're in the middle of restructuring how we share code, plugins, etc., I forked Martin's git repo and made the plugin compatible with APEX 5 and Universal Theme (see version 1.0.4), so when you need it, you can grab it from the forked repo. At a later time the plugin will be moved to OraOpenSource, so depending on when you read this post, check out OraOpenSource, as Martin, I, and many other developers will start putting our open source code on OraOpenSource.

Here's a quick how to use the plugin.

Step 1: Download the file dynamic_action_plugin_com_clarifit_apexplugin_ir_column_grouping.sql
Step 2: Import the plugin in your APEX application
Step 3: Define the Groups in the Interactive Report (you can right-click on Column Groups in the Attributes of your Interactive Report and add the Groups you need):


Step 4: Assign the group to the columns of your report (you can multi select the columns and in the right pane you can select the group):


Important: define for every column that has a group defined, a static id equal to the name of the column (e.g. ENAME).

Step 5: Add the dynamic action plugin to the page
When:
- Event: After Refresh
- Selection Type: Region
- Region:
Action:
- IR Column Grouping [Plug-In]



That's it... note that when you scroll down and you have sticky headers, the group will stay with it.


Categories: Development

Lot's Wife

Greg Pavlik - Mon, 2015-09-28 11:57
Lot's Wife
by Anna Akhmatova
an interpretive translation by Greg Pavlik
The righteous Lot heard the voice of God
As if coming from the distant and black mountains.
But his wife,
She saw what was until yesterday her blessing.
Under the beautiful spires of Sodom,
Where she sang spinning cloth -
The empty window of the room,
In which her children were born.
She looked – and her pain died with her,
For she could look no more:
Her body translucent salt,
Her feet joined the earth.
Who will mourn for Marah?
An insignificant role in a grand saga -
Yet my conscience cannot forget
The one who gave her life for a fleeting glance.
-----
Original poem
И праведник шел за посланником Бога,
Огромный и светлый, по черной горе.
Но громко жене говорила тревога:
Не поздно, ты можешь еще посмотреть
На красные башни родного Содома,
На площадь, где пела, на двор, где пряла,
На окна пустые высокого дома,
Где милому мужу детей родила.
Взглянула – и, скованы смертною болью,
Глаза ее больше смотреть не могли;
И сделалось тело прозрачною солью,
И быстрые ноги к земле приросли.
Кто женщину эту оплакивать будет?
Не меньшей ли мнится она из утрат?
Лишь сердце мое никогда не забудет
Отдавшую жизнь за единственный взгляд.

Custom image on your APEX app login page

Dimitri Gielis - Sun, 2015-09-27 16:15
In a comment on my post APEX 5.0: pimping the Login page, I got a question about how to put your own logo or a custom image on the login page, instead of an icon.

You only need a bit of CSS to do the trick:
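As a minimal sketch - assuming the Universal Theme login page renders its icon in an element with the t-Login-logo class, and that your image is uploaded to the application images as logo.png (both assumptions, so adjust to your theme and file name):

.t-Login-logo {
  display: inline-block;
  width: 200px;
  height: 100px;
  /* swap the icon for your own image */
  background: url(#APP_IMAGES#logo.png) center / contain no-repeat;
}
.t-Login-logo:before {
  content: none; /* hide the default font icon */
}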

You can add the above CSS to your login page - Inline CSS in page attributes or you can add it to your page template or custom CSS in Universal Theme.

The result is this:


To differentiate the login page you can do a lot more; here are some things we show during our APEX UI training:

  • add transparency to the login box
  • add a background image to your entire page (blurred or not)
Again, you can do that with just some CSS.
Categories: Development

My sessions recommendations for JavaOne '15 and OOW '15

Kuassi Mensah - Sun, 2015-09-27 14:52
JavaOne SF 2015  Session recommendations

High Availability with Java EE Containers, JDBC, and Java Connection Pools [BOF7732]
Monday, Oct 26, 8:00 p.m. | Parc 55—Mission

Implement Cloud Data Services with Java 8 Nashorn [CON4405]

Java Connection Pool Performance and Scalability with Wait-Free Programming [CON2158]
Wednesday, Oct 28, 4:30 p.m. | Hilton—Continental Ballroom 1/2/3


OOW SF 2015 - Session recommendations

Java Virtual Machine Cookbook [UGF2720]
Sunday, Oct 25, 9:00 a.m. | Moscone West—3011


Next-Generation Database: Implement Cloud Data Services with Java 8 Nashorn [CON8461]
Monday, Oct 26, 5:15 p.m. | Moscone South—308

Next-Generation Database: Java Connection Pool for Multitenant and Sharded Databases [CON8460]
Monday, Oct 26, 2:45 p.m. | Moscone South—308


Integrate Master Data with Big Data on Hadoop and Spark [CON8459]
Wednesday, Oct 28, 3:00 p.m. | Moscone South—308

Market Basket Analysis Using Oracle In-Database Container for Hadoop [CON8462]
Thursday, Oct 29, 12:00 p.m. | Moscone South—307

Next-Gen Database Enhancements for Java Application Performance and Scalability [CON10310]
Thursday, Oct 29, 2:30 p.m. | Moscone South—307


Dialog with the Oracle Database Java Developers and Architects [MTE9501]
Tuesday, Oct 27, 7:15 p.m. | Moscone South—305



Creepy Dolls - A Technology and Privacy Nightmare!

Abhinav Agarwal - Sat, 2015-09-26 11:28
This post was first published on LinkedIn on 20th May, 2015.

"Hi, I'm Chucky. Wanna play?"[1]  Fans of the horror film genre will surely recall these lines - innocent-sounding on their own, yet bone-chilling in the context of the scene in the movie - that Chucky, the possessed demonic doll, utters in the cult classic, "Child's Play". Called a "cheerfully energetic horror film" by Roger Ebert [2], the movie was released to more than a thousand screens on its debut in November 1988 [3]. It went on to spawn at least five sequels and developed a cult following of sorts over the next two decades [4].

Chucky the doll (image credit: http://www.shocktillyoudrop.com/)

In "Child's Play", Chucky the killer doll stays quiet around the adults - at least initially - but carries on secret conversations with Andy, and is persuasive enough to convince him to skip school and travel to downtown Chicago. Chucky understands how children think, and can evidently manipulate - or convince, depending on how you frame it - Andy into doing little favours for him. A doll that could speak, hear, see, understand, and hold a conversation with a human was, in the eighties, the stuff of science fiction - or, in the case of "Child's Play", of a horror movie.

Edison Talking Doll (image credit: www.davescooltoys.com)

A realistic doll that could talk and converse was long the "holy grail" of dollmakers [5]. It will come as a huge surprise to many - it certainly did to me - that within a few years of Thomas Edison's invention of the phonograph in 1877, a doll with a pre-recorded voice had been developed and marketed in 1890! It didn't have a very happy debut, however. After "several years of experimentation and development", the Edison Talking Doll "was a dismal failure that was only marketed for a few short weeks." [6] Talking dolls seem to have made their entry into mainstream retail only with the advent of "Chatty Cathy" - released by Mattel in the 1960s - which worked on a simple pull-string mechanism. The quest to make these dolls more interactive and more "intelligent" continued; "Amazing Amanda" was another milestone in this development, incorporating "voice-recognition and memory chips, sensory technology and facial animatronics" [7]. It was touted by some analysts as "an evolutionary leap from earlier talking dolls like Chatty Cathy of the 1960's" [8]. In some ways that assessment was not off the mark. After all, "Amazing Amanda" utilized RFID technology - among the hottest technology buzzwords a decade back. "Radio-frequency tags in Amanda's accessories - including toy food, potty and clothing - wirelessly inform the doll of what it is interacting with." This is what enabled "Amazing Amanda" to differentiate between "food" (pizza, or "cookies, pancakes and spaghetti") and "juice" [9]. However, even with all these developments and capabilities, the universe of what these toys could do was severely limited. At most they could recognize the voice of the child as its "mommy".
Amazing Amanda doll (image credit: amazing-amanda.fuzzup.net)

They were constrained by both the high price of storage (Flash storage is much sturdier than spinning hard drives, but an order of magnitude costlier, which limits the amount of storage possible) and limited computational capability (putting a high-end microprocessor inside every doll would make them prohibitively expensive). The flip side was that whatever the toys said to the children at home stayed at home. These toys had a limited set of pre-programmed sentences and emotions they could convey, and if you wanted something different, you went out and bought a new toy or, in some cases, a different cartridge.

That's where things stood. Till now.

Screenshot of ToyFair website

Between February 14-17, 2015, the Jacob K. Javits Convention Center in New York hosted "the Western Hemisphere's largest and most important toy show" [10] - the 2015 Toy Fair. This was a trade show, which meant that "Toy Fair is not open to the public. NO ONE under the age of 18, including infants, will be admitted." [11] It featured a "record-breaking 422,000+ net square feet of exhibit space" [12] and hundreds of thousands of toys. Yet no children were allowed. Be that as it may, there was no dearth of, let's say, "innovative" toys. Apart from an "ultra creepy mechanical doll, complete with dead eyes", a fake fish pet taken to a "whole new level of weird", and a "Doo Doo Head" doll shaped like you-guessed-it [13], of particular interest was a "Hello Barbie" doll launched by the Fortune 500 behemoth, Mattel. This doll had several USPs to its credit. It featured voice-recognition software, voice-recording capabilities, and the ability to upload recorded conversations over Wi-Fi - as a representative at the exhibition took pains to emphasize, repeatedly - to a server (presumably Mattel's or ToyTalk's) in the cloud, and to give "chatty responses." [14] This voice data would be processed and analyzed by the company's servers. The doll would learn the child's interests, and be able to carry on a conversation on those topics - made possible by the fact that the entire computational and learning capability of a server farm in the cloud could be accessed by every such toy. That the Barbie franchise is vital to Mattel cannot be overstated. The Barbie brand netted Mattel $1.2 billion in FY 2013 [15], but this represented a six per cent year-on-year decline, which Mattel attributed in part to "product innovation not being strong enough to drive growth." The message was clear: something very "innovative" was needed to jump-start sales. To make that technological leap forward, Mattel decided to team up with ToyTalk.

ToyTalk is a San Francisco-based start-up whose platform powered the voice-recognition software used by "Hello Barbie". It is headed by "CEO Oren Jacob, Pixar's former CTO, who worked at the groundbreaking animation company for 20 years" [16], and claims "$31M in funding from Greylock Partners, Charles River Ventures, Khosla Ventures, True Ventures and First Round Capital as well as a number of angel investors." [17]

Cover of Misery, by Stephen King. Published by Viking Press.

The voice-recognition software would allow Mattel and ToyTalk to learn the preferences of the child and, over time, refine the responses that Barbie would communicate back. As the Mattel representative put it, "She's going to get to know all my likes and all my dislikes..." [18] - a statement that at one level reminds one of Annie Wilkes when she says, "I'm your number one fan." [19] We certainly don't want to be in Paul Sheldon's shoes.

Hello Barbie's learning would start from the time the doll was switched on and connected to a Wi-Fi network. ToyTalk CEO Oren Jacob said, "we'll see week one what kids want to talk about or not" [20]. These recordings, once uploaded to the company's servers, would be used by "ToyTalk's speech recognition platform, currently powering the company's own interactive iPad apps including The Winston Show, SpeakaLegend, and SpeakaZoo", which "allows writers to create branching dialogue based on what children will potentially actually say, and collects kids' replies in the cloud for the writers to study and use in an evolving environment of topics and responses." [20] Some unknown set of people, sitting in some unknown location, would potentially get to hear entire conversations of a child before his parents did.

If Mattel or ToyTalk did not anticipate the reaction this doll would generate, one can only put it down to the blissful disconnect from the real world that Silicon Valley entrepreneurs often develop, surrounded as they are by like-minded digerati. In any case, the responses were swift, and in most cases brutal. The German magazine "Stern" headlined an article on the doll "Mattel entwickelt die Stasi-Barbie" [21]. Even without the benefit of translation, the word "Stasi" stood out like a red flag. In any case, if you wondered, the headline translates to "Mattel develops the Stasi Barbie" [22]. Stern "curtly re-baptised" it "Barbie IM". The initials stand for "Inoffizieller Mitarbeiter", informants who worked for East Germany's infamous secret police, the Stasi, during the Cold War [23] [24]. Newsweek carried a story, "Privacy Advocates Call Talking Barbie 'Surveillance Barbie'" [25]. France 24 wrote, "Germans balk at new 'Soviet snitch' Barbie" [26]. The ever-acerbic The Register dug into ToyTalk's privacy policy on the company's web site, and found these gems [27]:
Screenshot of ToyTalk's Privacy page

- "When users interact with ToyTalk, we may capture photographs or audio or video recordings (the "Recordings") of such interactions, depending upon the particular application being used."
- "We may use, transcribe and store such Recordings to provide and maintain the Service, to develop, test or improve speech recognition technology and artificial intelligence algorithms, and for other research and development or internal purposes."

Further reading revealed that what your child spoke to the doll in the confines of his home in, say, suburban Troy, Michigan, could end up travelling halfway across the world, to be stored on a server in a foreign country - "We may store and process personal information in the United States and other countries." [28]

What information ToyTalk would share with "Third Parties" was equally disturbing, both for the amount of information that could potentially be shared and for the vagueness in defining who these third parties could possibly be - "Personal information"; "in an aggregated or anonymized form that does not directly identify you or others;"; "in connection with, or during negotiations of, any merger, sale of company assets, financing or acquisition, or in any other situation where personal information may be disclosed or transferred as one of the business assets of ToyTalk"; "We may also share feature extracted data and transcripts that are created from such Recordings, but from which any personal information has been removed, with Service Providers or other third parties for their use in developing, testing and improving speech recognition technology and artificial intelligence algorithms and for research and development or other purposes." [28] A child's speech, words, conversation, voice - as recorded by the doll - was a "business asset" of the company.

And lest the reader have any concerns about the safety and security of the data on the company's servers, the following disclaimer put paid to any reassurance on that front as well: "no security measures are perfect or impenetrable and no method of data transmission can be guaranteed against any interception or other type of misuse." [28] If the sound of hands being washed could be put down on paper, it would look a lot like the sentence above.

Apart from the firestorm of criticism described above, the advocacy group Campaign for a Commercial Free Childhood started a campaign petitioning Mattel CEO Christopher Sinclair to "stop 'Hello Barbie' immediately." [29]

The brouhaha over "Hello Barbie" is, however, only symptomatic of several larger issues that have emerged and that intersect each other in varying degrees, raising important questions about technology - the cloud, big data, the Internet of Things, data mining, analytics; privacy in an increasingly digital world; advertising and the ethics of marketing to children; law and its ability, or inability, to cope with an increasingly digitized society; and the sociological as well as psychological impact on children and teens. Technology and Moore's Law [30] have combined with the convenience of broadband to make possible what would have been in the realm of science fiction even two decades ago.

The Internet, while opening up untold avenues of betterment for society at large, has also revealed a dark side - a dilemma common to almost every transformative change in society. From the possibly alienating effects of excessive Internet addiction to the physiological changes that the very nature of the hyperlinked web engenders in humans, these are issues that are only recently beginning to attract the attention of academics and researchers. The most fundamental notions of what people commonly understand as "privacy" are being challenged in today's digital world, in most cases without even a modicum of understanding on the part of the affected party - you. Between those who believe technology is the only solution capable of delivering a digital nirvana for every imaginable problem in society, and the Luddites who see every bit of technology as a rabid byte (that's a bad pun) against humanity, there hopefully still exists a saner middle ground that seeks to understand and adapt technology for the betterment of humanity, society, and the world at large.

So what happened to Chucky? Well, as we know, it spawned a successful and profitable franchise of sequels and assorted merchandise. Which direction "Hello Barbie" takes is of less interest to me than the broader questions raised in the previous paragraph.

References:
[1] http://www.imdb.com/title/tt0094862/quotes?item=qt0289926 
[2] "Child's Play" review, http://www.rogerebert.com/reviews/childs-play-1988
[3] http://www.the-numbers.com/movie/Childs-Play#tab=box-office
[4] https://en.wikipedia.org/wiki/Child%27s_Play_%28franchise%29
[5] "A Brief History of Talking Dolls--From Bebe Phonographe to Amazing Amanda", http://collectdolls.about.com/od/dollsbymaterial/a/talkingdolls.htm
[6] "Edison Talking Doll", http://www.edisontinfoil.com/doll.htm
[7] http://www.canada.com/story.html?id=f4370a3c-903d-4728-a9a4-3d3f941055a6
[8] http://www.nytimes.com/2005/08/25/technology/circuits/25doll.html?pagewanted=all&_r=0
[9] http://www.canada.com/story.html?id=f4370a3c-903d-4728-a9a4-3d3f941055a6
[10] http://www.toyfairny.com/toyfair/Toy_Fair/Show_Info/A_Look_Back.aspx
[11] http://www.toyfairny.com/ToyFair/ShowInfo/About_the_Show/Toy_Fair/Show_Info/About_the_Show.aspx
[12] http://www.toyfairny.com/ToyFair/ShowInfo/About_the_Show/Toy_Fair/Show_Info/About_the_Show.aspx
[13] http://mashable.com/2015/02/15/weird-toys-2015-toy-fair/
[14] https://www.youtube.com/watch?feature=player_embedded&v=RJMvmVCwoNM
[15] http://corporate.mattel.com/PDFs/2013_AR_Report_Mattel%20Inc.pdf
[16] http://www.fastcompany.com/3042430/most-creative-people/using-toytalk-technology-new-hello-barbie-will-have-real-conversations-
[17] https://www.toytalk.com/about/
[18] https://www.youtube.com/watch?feature=player_embedded&v=RJMvmVCwoNM
[19] http://www.imdb.com/title/tt0100157/quotes?item=qt0269492
[20] http://www.fastcompany.com/3042430/most-creative-people/using-toytalk-technology-new-hello-barbie-will-have-real-conversations-
[21] http://www.stern.de/digital/ueberwachung/barbie-wird-zum-spion-im-kinderzimmer-2173997.html
[22] https://translate.google.co.in/?ie=UTF-8&hl=en&client=tw-ob#auto/en/Mattel%20entwickelt%20die%20Stasi-Barbie
[23] http://www.france24.com/en/20150224-hello-barbie-germany-stasi-data-collection/
[24] http://www.stern.de/digital/ueberwachung/barbie-wird-zum-spion-im-kinderzimmer-2173997.html
[25] http://www.newsweek.com/privacy-advocates-want-take-wifi-connected-hello-barbie-offline-313432
[26] http://www.france24.com/en/20150224-hello-barbie-germany-stasi-data-collection/
[27] http://www.theregister.co.uk/2015/02/19/hello_barbie/
[28] https://www.toytalk.com/legal/privacy/
[29] http://org.salsalabs.com/o/621/p/dia/action3/common/public/?action_KEY=17347
[30] http://en.wikipedia.org/wiki/Moore's_law


Disclaimer: Views expressed are personal.


© 2015, Abhinav Agarwal. All rights reserved.
