Oracle AppsLab

Driving Innovation

Look What We Made

Thu, 2014-11-20 17:53

As a team-building activity for our newly merged team of research, design and development, someone, who probably wishes to remain nameless, organized a glass mosaic and welding extravaganza at The Crucible in Oakland.

We split into two teams, one MIG welding, the other glass breaking, and here’s the result.

Original image, glass before firing.

Finished product, including frame.

All in all, an interesting and entertaining activity. Good times were had by all, and no one was cut or burned, so bonus points for safety.

From Concept to Code

Thu, 2014-11-20 17:42

Editor’s note: Here’s a repost of a wonderful write-up of an event we did a couple weeks ago, courtesy of Friend of the ‘Lab Karen Scipi (@KarenScipi).

What Karen doesn’t mention is that she organized, managed and ran the event herself. Additional props to Ultan (@ultan) on the idea side, including the naming, Sandra Lee (@SandraLee0415) on the execution side, and to Misha (@mishavaughan) for seeing the value. Without the hard work of all these people, I’d still just be talking about a great idea in my head that I’m too lazy to execute. You guys all rock.

Enjoy the read.

Concept to Code: Shaping and Shipping Innovative User Experience Solutions for the Enterprise

By Karen Scipi

It was an exciting event here at Oracle Headquarters as our User Experience AppsLab (@theappslab) Director Jake Kuramoto (@jkuramot) recently hosted an internal design jam called Shape and ShipIt. Fifteen top-notch members of the newly expanded team got together for two days with a packed schedule to research and innovate cutting-edge enterprise solutions, write use cases, create wireframes, and build and code solutions. They didn’t let us down.

The goal: Collaborate and rapidly design practical, contextual, mobile Oracle Applications Cloud solutions that address real-world user needs and deliver streamlined, natural, and intuitive user experiences.

The result: Success! Four new stellar user experience solutions were delivered to take forward to product development teams working on future Oracle Applications Cloud simplified user interface releases.

Shape and ShipIt event banner

While I cannot share the concepts or solutions with you as they are under strict lock and key, I can share our markers of the event’s success with you.

The event was split into two days:

Day 1: A “shape” day during which participants received invaluable guidance from Bill Kraus on the role of context and user experience, then researched and shaped their ideas through use cases and wireframes.

Day 2: A “ship” day during which participants coded, reviewed, tested, and presented their solutions to a panel of judges that included Jeremy Ashley (@jrwashley), Vice President of the Oracle Applications User Experience team.

It was a packed two days full of ideas, teamwork, and impressive presentations.

Participants Anthony Lai, Bill Kraus, and Luis Galeana [photo: Sandra Lee (@SandraLee0415)]

The participants formed four small teams that comprised managers, architects, researchers, developers, and interaction designers whose specific perspectives proved to be invaluable to the tasks at hand. Their blend of complementary skills enabled the much needed collaboration and innovation.

Diversity drives more innovation at Oracle. Participants Mark Vilrokx, Osvaldo Villagrana, Raymond Xie, Julia Blyumen, and Joyce Ohgi hard at work. [photo: Karen Scipi (@KarenScipi)]

Although participants were given a short timeframe for such an assignment, they were quick to adapt, refine their concepts, and produce solutions that could be delivered and presented in two days. Individual team agility was imperative.

Participants were encouraged to brainstorm and design in ways that suited them. Whether it was sitting at tables with crayons, paper, notebooks and laptops, or hosting walking meetings outside, the participants were able to discuss concepts and ideate in their own, flexible ways.

Brainstorming with notebooks and pens: Cindy Fong and Tony Orciuoli [photo: Sandra Lee]

Brainstorming with laptops: Noel Portugal and Ben Bendig
[photo: Karen Scipi]

As with all of our simplified user interface design efforts, participants kept a “context produces magic” perspective front and center throughout their activities. In the end, team results yielded responsive, streamlined, context-driven user experience solutions that were simple yet powerful.

Healthy “brain food” and activity breaks were encouraged, and both kept participants engaged and focused on the important tasks at hand. Salads, veggies, dips, pastas, wraps, and sometimes a chocolate chip cookie (for the much needed sugar high) were on the menu. The activity break of choice was an occasional competitive game of table tennis at the Oracle Fitness Center, just a stone’s throw from the event location. The balance of think-mode and break-mode worked out just right for participants.

Healthful sustenance: Lunch salads [photo: Karen Scipi]

Our biggest marker of success, though, was how wrong we were. Yes. Wrong. While we expected one team’s enterprise solution to clearly stand out from among all of the others, we were pleasantly surprised as all four were equally impressive, viable, and well-received by the design jam judges. Four submissions, four winners. Nice job!

Participants (standing) Cindy Fong, Sarahi Mireles, and Tony Orciuoli present their enterprise solution to the panel of judges (seated): Jake Kuramoto, Jatin Thaker, Tim Dubois, Jeremy Ashley, and Bill Kraus [photo: Karen Scipi]

Stay tuned to the Usable Apps Blog to learn more about such events and what happens to the innovative user experiences that emerge!

Musings on Samsung Developer Conference 2014

Mon, 2014-11-17 11:18

This year some of us at the AppsLab attended the Samsung Developer Conference, aka #SDC2014. Last year was Samsung’s first attempt, and we were there too. The quality and caliber of the presentations increased tenfold from last year. Frankly, Samsung is making it really hard to resist joining their ecosystem.

Here are some of the trends I observed:

Wearables and Health:

There was a huge emphasis on Samsung’s commitment to wearable technology. They released a new Tizen-based smartwatch (the Samsung Gear S) as well as a biometric reference design, hardware and software, called SIMBAND. Along with their wearable strategy, they also released S.A.M.I, a cloud repository to store all this data. All of this ties together with their vision of “Voice of the Body.”

Voice of the Body from Samsung on Vimeo.

During the second-day keynote we got to hear from Mounir Zok, Senior Sports Technologist of the United States Olympic Committee. He told us how wearable technology is changing the way Olympic athletes train. Only a couple of years ago, athletes still had to go to a lab and “fake” actual activities to get feedback. Now they can get real data on the field, thanks to wearable technology.

Virtual Reality:

Samsung released the Gear VR in partnership with Oculus. The goggles only work with a Galaxy Note 4 mounted in front. The gaming experiences with this VR device are amazing, but they are also exploring other use cases like virtual tourism and virtual movie experiences. They also released a 3D, 360°, spherical-view camera called “Project Beyond.”

IoT – Home Automation:

Samsung is betting big on IoT and home automation, and they are putting their money where their mouth is by acquiring SmartThings. The SmartThings platform is open source and can integrate with a myriad of other home automation products. They showcased a smart home powered by the SmartThings platform.

Mobile Innovation: 

I actually really like their new Galaxy Note Edge phablet. Samsung is showing true innovation here with the “edge” part of the device. It has its own SDK, and it feels great in the hand!

Overall I’m pretty impressed with what Samsung is doing. It seems like their spaghetti-on-the-wall approach (throwing a bunch of spaghetti and seeing what sticks) is starting to pay off. Their whole UX across devices looks seamless. And in my humble opinion, they are getting ready to take off on their own without having to use Android for their mobile devices. Tizen keeps maturing, but I shall leave that for another post!

Please feel free to share your experience with Samsung devices as well!

Asteroid Hackathon – The Winning Team

Mon, 2014-11-17 09:57

Editorial Note: This is a guest post by friend of the ‘Lab and colleague DJ Ursal. Also be sure to check out our Hackathon entry here:

EchoUser (@EchoUser), in partnership with SpaceGAMBIT, Maui Makers, the Minor Planet Center, NASA, the SETI Institute, and Further by Design, hosted an Asteroid Hackathon. The event was in response to the NASA Grand Challenge, “focused on finding all asteroid threats to human populations and knowing what to do about them.”

I had a wonderful opportunity to participate in the Asteroid Hackathon last week. My team’s name was NOVA. Our team comprised four members: DJ Ursal, Kris Robison, Daniel Schwartz, and Raj Krishnamurthy.

We were given live data from NASA and the Minor Planet Center site and literally had just 5 hours to put together a working prototype and solution to the asteroid big data problem. We created a web application (which works not only on your Mac or PC but also on your iPad and the latest Nexus 7 Android devices) to help scientists, astronomers, and anyone who is interested in asteroids discover, learn, and share information in a fun and interactive way.

Our main theme was Finding Asteroids Before They Find Us. The goal was to help people discover, learn, and share asteroid information to increase awareness within the community. We created an interactive web app that let users apply chart filters to explore the risk of a future impact with Earth, along with each asteroid’s distance from Earth, absolute brightness, and rotation. Users could click and drag on any chart to filter, combining filters in a multidimensional way to explore, discover interesting facts, and share asteroid data with friends and the community. We also made use of Major Tom, the astronaut referenced in David Bowie’s song “Space Oddity,” which depicts an astronaut who casually slips the bonds of the world to journey beyond the stars. Users could post questions to Major Tom and could also play his song.

Strategically, the single most important element in winning this hackathon was team composition: having a team that works together effectively. Collaboration and communication were the two most critical personal skills demanded of all members, as time was limited and coordination was of the utmost importance.

Winning Team NOVA: DJ Ursal, Kris Robison, Daniel Schwartz, and Raj Krishnamurthy

Asteroid Hackathon

Mon, 2014-11-17 09:49

A couple weeks ago Jeremy Ashley (@jrwashley), Bill Kraus, Raymond Xie and I participated in the Asteroid Hackathon hosted by @EchoUser. The main focus was “to engage astronomers, other space nerds, and the general public, with information, not just data.”

As you might already know, we here at the AppsLab are big fans of hackathons, as well as ShipIt days or FedEx days. The ability to get together, put our collective minds to work, and create something in a short amount of time is truly amazing. It also helps to keep us on our toes, technically and creatively.

Our team built what we called “The Daily Asteroid.” The idea behind our project was to use near-Earth object (NEO) data to highlight the asteroid making the current date’s closest approach to Earth; in other words, to show which asteroid is closest to Earth today. A user could “favorite” today’s asteroid and start a conversation with other users about it, using a social network like Twitter.
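
For the curious, here’s roughly how that lookup could work. This is a minimal sketch, not our hackathon code: it assumes NASA’s public NeoWs feed endpoint and the response field names as currently documented, and the DEMO_KEY placeholder should be swapped for a real API key.

```python
# Minimal sketch, assuming NASA's NeoWs feed API and its documented
# response shape; "DEMO_KEY" is a rate-limited placeholder key.
from datetime import date
import requests

NEO_FEED = "https://api.nasa.gov/neo/rest/v1/feed"

def todays_closest_asteroid(api_key="DEMO_KEY"):
    today = date.today().isoformat()
    resp = requests.get(NEO_FEED, params={
        "start_date": today, "end_date": today, "api_key": api_key,
    })
    resp.raise_for_status()
    neos = resp.json()["near_earth_objects"].get(today, [])
    if not neos:
        return None
    # Today's asteroid = the one with the smallest miss distance.
    return min(neos, key=lambda neo: float(
        neo["close_approach_data"][0]["miss_distance"]["kilometers"]))

neo = todays_closest_asteroid()
if neo:
    km = float(neo["close_approach_data"][0]["miss_distance"]["kilometers"])
    print(f"Today's asteroid: {neo['name']}, missing us by {km:,.0f} km")
```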

We also added the ability to change the asteroid’s properties (size, type, velocity, angle) and play out a scenario to see what damage it could cause if it hit the Earth. And to finish up, we created an Asteroid Hotline using Twilio (@twilio), which you can call to get the latest NEO info on your phone!

We were lucky to be awarded 3rd place, or “Best Engagement,” and we had a blast doing it. Considering the small amount of time we had, we came away really proud of our results.

The Cloud UX Lab

Mon, 2014-11-10 09:57

There’s a post over on VoX about a new OAUX lab at Oracle HQ, the Cloud UX Lab.

Jeremy Ashley, VP, in the new lab, image used with permission.

Finished just before OOW in September, this lab is a showcase for OAUX projects, including a few of ours.

The lab reminds me of a spacecraft from the distant future, the medical bay or the flight deck. It’s a very cool place, directly inspired and executed by our fearless leader, Jeremy Ashley (@jrwashley), an industrial designer by trade.

I actually got to observe the metamorphosis of this space from something that felt like a doctor’s office waiting room into the new hotness. Looking back on those first meetings, I never expected it would turn out so very awesome.

Anyway, the reason why I got to tag along on this project is because our team will be filling the control room for this lab with our demos. Noel (@noelportugal) and Jeremy have a shared vision for that space, which will be a great companion piece to the lab and equally awesome.

So, if you’re at Oracle HQ, book a tour and stop by the new Cloud UX Lab, experience the new hotness and speculate on what Noel is cooking up behind the glass.Possibly Related Posts:

Pseudo-Philosophical Observations on Wearables, Part 1

Wed, 2014-11-05 11:53

Jawbone announced the Up3 today, reportedly its most advanced fitness tracker to date.

As with all fitness trackers, the Up3 has an accelerometer, but it also has sensors for measuring skin and ambient temperature, as well as something called bioimpedance. The data collected by the Up3 feed a new feature called Smart Coach.

You can imagine what the Smart Coach does. It sounds like a cool, possibly creepy, feature.

This post is not about the Up3.

This post is about my journey into the dark heart of the quantified self. The Up3 has just reminded me to coalesce my thoughts.

Earlier this year, I started wearing my first fitness tracker, the Misfit Shine. I happily wore it for about two months before the battery died, and then I realized it had control of me.

Misfit calculates activity based on points, and my personal goal of 1,000 points was relatively easy to reach every day, even for someone who works from home. What I realized quickly was that the Shine pushed me to chase points, not activity.

My high score.

 

The Shine uses its accelerometer to measure activity, so depending on where I wore it on my person, a run could be worth more points. This isn’t unique to the Shine. I’ve seen people spinning at the gym wearing their fitness trackers on their ankles.

As the weeks passed, I found myself avoiding activities that didn’t register a lot of points, definitely not good behavior, and even though my goal was 1,000 points, I avoided raising it for fear of missing my daily goal-achievement dopamine high.

Then, mid-Summer, Misfit dropped an update that added some new game mechanics, and one day, my Shine app happily informed me that I’d hit my goal 22 days in a row.

This streak was the beginning of the end for me.

On the 29th day of my streak, the battery died. I replaced it, crisis averted, streak intact. Then, later that day, the Shine inexplicably died. I tried several new batteries and finally had to contact support.

All the while, I worried about my streak. I went to the gym, but it felt hollow and meaningless without the tangible representation, the coaching, as it were, from my Shine.

This is not a good look.

Misfit replaced my Shine, but in the days that elapsed, during my detox, I decided to let it go. Turns out the quantified self isn’t for obsessive, overly-competitive personality types like me.

And I’m not the only one in this group.

In September, I read an article called Stepping Out: Living the Fitbit Life, in which the author, David Sedaris, describes a similar obsession with his Fitbit. As I read it, I commiserated, but I also felt a little jealous of the level of his commitment. This dude makes me look like a rank amateur.

Definitely worth a read.

Anyway, this is not in any way meant to be an indictment of the Shine, Fitbit, Jawbone or any fitness tracker. Overall, these devices offer people a positive and effective way to reinforce healthy behavior and habits.

But for people like me, they lead to unanticipated side effects. As I read about the Up3, its sensors and Smart Coach, all of which sound very cool, I had to remind myself of the bad places I went with the Shine.

And the colloquial, functionally-incorrect but very memorable, definition of insanity.

In Part 2, when I get around to it, I’ll discuss the flaws in the game mechanics these companies use.

Find the comments.

Google Glass, Android Wear, and Apple Watch

Tue, 2014-10-28 15:43

I have both Google Glass and Android Wear (Samsung Gear Live, Moto 360), and oftentimes I wear them together. People always come up with the question: “How do you compare Google Glass and Android watches?” Let me address a couple of the viewpoints here. I would like to talk about the Apple Watch, but since it has not been officially released yet, let’s just say that shape-wise it is square and looks like a Gear Live, and its features seem pretty similar to Android Wear, with the exception of an attempt to add more playful colors and features. Let’s discuss it more once it is out.


I was in the first batch of Google Glass Explorers and got my Glass in mid-2013. In the middle of this year, I first got the Gear Live, then later the Moto 360. I always find it peculiar that Glass is the older technology while Wear is the newer one. Should it not be easier to design a smartwatch before glassware?

I do find a lot of similarities between Glass and Wear. The fundamental similarity is that both are Android devices. They are voice-input enabled and show you notifications. You can install additional Android applications to personalize your experience and maximize your usage. I see these as the true values of wearables.

Differences? Glass shows a lot of capabilities that Android Wear lacks at the moment. The ones that probably matter most to people are sound, phone calls, video recording, picture taking, a hands-free heads-up display, GPS, and wifi. Unlike Android Wear, Glass can be used standalone; Android Wear is only a companion gadget and has to be paired with a phone.

Is Glass superior, then? Android Wear does provide better touch-based interaction, compared to swiping at the side of the Glass frame. You can also play simple games like Flopsy Droid on your watch. Pedometers and heart-rate sensors are also commonly included. Glass tends to overheat easily. Water resistance plays a role here too: you would almost never want to get your Glass wet at all, while Android Wear is water-resistant to a certain degree. And when you are charging your watch at night, it doubles as a bedside clock.

For me personally, although I have owned Glass longer than Wear, I have to say I prefer Android Wear over Glass, for a couple of reasons. First, there is the significant price gap ($1,500 vs. $200). Second, especially when you add prescription lenses to Glass, it gets heavy and hurts the ear when worn for an extended period of time. Third, I do not personally find the additional features offered by Glass useful in my daily activities; I do not normally take pictures other than at specific moments or while I am traveling.

I also find that even though Glass is now publicly available within the US, it is still perceived as an anti-social gadget. The term is defined in the Urban Dictionary as well. Most of the people I know who own Glass do not wear it themselves, for various reasons. I believe improving the marketing and advertising strategy for Glass may help.

Gadget preference is personal. What’s yours?

Glorious Data Visualizations for Your Friday

Fri, 2014-10-24 09:00

If you’ve read here for more than a hot minute, you’ll know that I love me some data visualization.

This love affair dates back to when Paul (@ppedrazzi) pointed me to Hans Rosling’s (@hansrosling) first TED talk. I’m sure Hans has inspired an enormous city’s worth of people by now, judging by the 8 million plus views his TED talk has garnered. Sure, those aren’t unique views, but even so.

There’s an interesting meta-project: visualize the people influenced by various visualization experts, like a coaching tree or something.

Classic comic from xkcd, used under CC 2.5

Back on track, if you haven’t yet, watch the BBC documentary on him, “The Joy of Stats,” fantastic stuff, or if you have seen it, watch it again.

As luck would have it, one area of specialization of our newest team members is, wait for it, data visualization.

Last week, I got to see them in action in a full-day workshop on data visualization, which was eye-opening and very informative.

I’m hoping to get a few blog posts out of them on the subject, and while we wait, I wanted to share some interesting examples we’ve been throwing around in email.

I started the conversation with xkcd because, of course I did. Randall Munroe’s epic comic isn’t usually mentioned as a source for data visualizations, but if you read it, you’ll know that he has a knack for exactly that. Checking out the Google Image search for “xkcd data visualization” reminded me of just how many graphs, charts, maps, etc. Randall has produced over the years.

I also discovered that someone has created a D3 chart library as an homage to the xkcd style.

Anyway, two of my favorite xkcd visualizations are recent, possibly a function of my failing memory and not coincidence, Pixels and Click and Drag.

I probably spent 10 minutes zooming into Pixels, trying to find the bottom; being small-minded, I gave up pretty early on Click and Drag, assuming it was small. It’s not.

How much time did you spend, cough, waste, on these?

During our conversation, a couple interesting examples have come back to me, both worth sharing.

First is Art of the Title, dedicated to the opening credits of various films. In a very specific way, opening credits are data visualizations; they set the mood for the film and name the people responsible for it.

Second is Scale of the Universe, which is self-explanatory and addictive.

So, there you go. Enjoy investigating those two and watch this space for more visualization content.

And find the comments.

Mind Control?

Mon, 2014-10-13 16:37

Editor’s note: Hey look, a new author. Here’s the first post from Raymond Xie, who joined us nearly a year ago. You may remember him from such concept demos as geo-fencing or Pebble watchface. Raymond has been busy at work and wants to share the work he did with telekinesis. Or something, you decide. Enjoy.

You put on a headband, stare at a ball, tilt your head back-forth and left-right . . . the ball navigates through a simple maze, rushing, wavering, changing colors, and finally hitting the target.

That is the latest creation out of the AppsLab: Muse Sphero Driver. When it was first shown at the OAUX Exchange during OOW, it amused many people, who called it a “mind control” game.

The setup consists of Muse, a brain-sensing headband; Sphero, a robotic ball; and a tablet to bridge the two.

Technically, it is your brainwave data (electroencephalography, or EEG) driving the Sphero, adjusting its speed and changing its color along a spectrum from RED to BLUE (RED: fast, active; BLUE: slow, calm), while your head gestures (3-axis accelerometer, or ACC) control the direction of the Sphero’s movement. Whether or not you call that “mind control” is up to your own interpretation.
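
To make the mapping concrete, here’s a minimal sketch of the idea in Python. It is not the actual demo code: the Muse SDK and Sphero API calls are omitted, and the calm_score input is a hypothetical, already-normalized value (0 = active, 1 = calm) you would derive from the EEG band powers.

```python
import math

# Hypothetical inputs: calm_score comes from EEG band power (0 = active,
# 1 = calm); pitch/roll come from the headband's accelerometer.
def eeg_to_sphero(calm_score: float) -> dict:
    """Map a 0..1 calm score to Sphero speed and a RED-to-BLUE color."""
    calm = max(0.0, min(1.0, calm_score))  # clamp to [0, 1]
    speed = int((1.0 - calm) * 255)        # active mind -> fast ball
    red = int((1.0 - calm) * 255)          # active -> RED
    blue = int(calm * 255)                 # calm   -> BLUE
    return {"speed": speed, "rgb": (red, 0, blue)}

def tilt_to_heading(pitch: float, roll: float) -> int:
    """Rough compass heading from head tilt: lean the way you want to go."""
    return int(math.degrees(math.atan2(roll, pitch))) % 360

print(eeg_to_sphero(0.8))          # calm: slow and mostly BLUE
print(tilt_to_heading(1.0, 0.0))   # head tilted forward -> heading 0
```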

You kind of drive the ball with your mind, but mostly with brainwave noise rather than conscious thought. It is still too early to derive accurate “mind control” from the EEG data of any regular person, for two reasons:

1. At the scalp level, the EEG signal-to-noise ratio is very poor;
2. The correlation between EEG and mind activity still needs to be established.

But it does open up a dialog in HCI, such as voice control vs. mind control (silence); or in robotics, where instead of asking the machine to “see” and “understand,” we can “see” and “understand” and impersonate it with our mind and soul.

While it is difficult to read the “mind” (any mind activity) transparently, we think it is quite doable to map your mind onto certain states, and use those “states” as commands indirectly.

We may do something around this area. So stay tuned.

Meanwhile, you can start practicing yoga or Zen, to get a better signal-to-noise ratio and to set your mind into a certain state with ease.

Here We Grow Again

Mon, 2014-10-13 12:18

Cheesy title aside, the AppsLab (@theappslab) is growing again, and this time, we’re branching out into new territory.

As part of the Oracle Applications User Experience (@usableapps) team, we regularly work with interaction designers, information architects and researchers, all of whom are pivotal to ensuring that what we build is what users want.

Makes sense, right?

So, we’re joining forces with the Emerging Interactions team within OAUX to formalize a collaboration that has been ongoing for a while now. In fact, if you read here, you’ll already recognize some of the voices, specifically John Cartan and Joyce Ohgi, who have authored posts for us.

For privacy reasons (read, because Jake is lazy), I won’t name the entire team, but I’m encouraging them to add their thoughts to this space, which could use a little variety. Semi-related, Noel (@noelportugal) was on a mission earlier this week to add content here and even rebrand this old blog. That seems to have run its course quickly.

One final note, another author has also joined the fold, Mark Vilrokx (@mvilrokx); Mark brings a long and decorated history of development experience with him.

So, welcome everyone to the AppsLab team.

Did You See Our Work in Steve Miranda’s Keynote?

Fri, 2014-10-10 09:28

Last week at OpenWorld, a few of our projects were featured in Steve Miranda’s (@stevenrmiranda) keynote session.

Jeremy (@jrwashley) tweeted the evidence.

Debra (@debralilley) noticed too. I wasn’t able to attend the keynote, so I found out thanks to the Usable Apps (@usableapps) Storify, which chronicled “Our OpenWorld 2014 Journey.”

And today, I finally got to see the video, produced by Friend of the ‘Lab, Martin Taylor, who you might remember from other awesome videos like “A Smart Holster for Law Enforcement.”

Noel (@noelportugal) and Anthony (@anthonyslai) both play developers in the short film. Noteworthy: the expression on Noel’s face as he drives the Sphero ball with the Muse brain-sensing headband.

Thanks to Martin for making this video, thanks to Steve for including it in his keynote, and thanks to you for watching it.

ESP8266 – Cheap WiFi for your IoT

Thu, 2014-10-09 21:14

About a month ago, hackaday.com broke the news of a new wifi chip called the ESP8266 that costs about $5. This wireless system on a chip (SoC) took all the IoT heads (including me) by surprise. Until now, if you wanted to integrate wifi into any DIY project, you had to use more expensive solutions. To put this into perspective, my first wifi Arduino shield was about $99!

So I ordered a few of them (I think I’m up to 10 now!) and went to test the possibilities. I came up with a simple Instructable to show how you can log a room’s temperature to the cloud. I used an Arduino to do this, but one of the most amazing things about this chip is that you can use it standalone! Right now documentation is sparse, but I was able to compile the source code using a gcc compiler toolchain created by the new esp8266 community.
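
If you want a feel for how simple the chip is to drive, here’s a rough sketch of the temperature-logging idea using the stock AT-command firmware. Note the post used an Arduino; this version talks to the ESP8266 from a laptop over a USB-serial adapter with pyserial, and the serial port, wifi credentials, and cloud host below are all placeholders.

```python
# Sketch only: drives the ESP8266's stock AT firmware over a serial
# port. Port name, SSID, password, and host below are placeholders.
import time
import serial  # pyserial

def at(port, cmd, wait=2.0):
    """Send one AT command and return whatever the chip replies."""
    port.write((cmd + "\r\n").encode())
    time.sleep(wait)
    return port.read(port.in_waiting or 1).decode(errors="replace")

esp = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
at(esp, "AT")                                    # sanity check -> "OK"
at(esp, "AT+CWMODE=1")                           # station mode
at(esp, 'AT+CWJAP="my-ssid","my-password"', 8)   # join the wifi network

request = "GET /log?temp=22.5 HTTP/1.0\r\nHost: cloud.example.com\r\n\r\n"
at(esp, 'AT+CIPSTART="TCP","cloud.example.com",80', 4)
at(esp, "AT+CIPSEND=%d" % len(request))          # chip answers with ">"
esp.write(request.encode())                      # then takes the raw bytes
time.sleep(2)
at(esp, "AT+CIPCLOSE")
```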

But why is this important to you, even if you haven’t dabbled in DIY electronics? Well, this chip comes from China, and even though it doesn’t have an FCC stamp of approval (yet), it signals things to come. This is what I call the Internet of Things r(evolution). Prices of these chips are at a historical low, and soon we will see more and more products connecting to the Internet/Cloud. From light switches and light bulbs to washing machines and dishwashers, anything that needs to be turned on or off could potentially have one of these. Anything that collects data, like thermostats and smoke detectors, could also potentially have one.

So, are you scared, or will you welcome our new internet overlords?

iBeacons or The Physical Web?

Tue, 2014-10-07 06:55

For the past year at the AppsLab we have been exploring the possibilities of advanced user interactions using BLE beacons. A couple of days ago, Google (unofficially) announced that one of their Chrome teams is working on what I’m calling the gBeacon. They are calling it the Physical Web.
This is how they describe it:

“The Physical Web is an approach to unleash the core superpower of the web: interaction on demand. People should be able to walk up to any smart device – a vending machine, a poster, a toy, a bus stop, a rental car – and not have to download an app first. Everything should be just a tap away.

“The Physical Web is not shipping yet nor is it a Google product. This is an early-stage experimental project and we’re developing it out in the open as we do all things related to the web. This should only be of interest to developers looking to test out this feature and provide us feedback.”

Here is a short rundown of how iBeacon works vs. the Physical Web beacons:

iBeacon

The iBeacon profile advertises a 30-byte packet containing three values that, combined, make a unique identifier: UUID, Major, and Minor. The mobile device actively listens for these packets. When it gets close to one, it queries a database (cloud) or uses hard-coded values to determine what it needs to do or show for that beacon. Generally, the UUID identifies a common organization, the Major value an asset within that organization, and the Minor a subset of assets belonging to the Major.

For example, if I’m close to the Oracle campus, and I have an Oracle application that is actively listening for beacons, then as I get within reach of any beacon my app can trigger certain interactions related to the whole organization (“Hello Noel, welcome to Oracle.”). The application had to query a database to know what that UUID represents. As I reach building 200, my application picks up another beacon that contains a Major value of, let’s say, 200. My app will again query to see what it represents (“You are in building 200.”). Finally, when I get close to our new Cloud UX Lab, a beacon inside the lab will broadcast a Minor ID that represents the lab (“This is the Cloud UX Lab, want to learn more?”).

iBeacons are designed to work as a fully closed ecosystem where only the deployed devices (app+beacons+db) know what a beacon represents. Today I can walk into the Apple Store and use a Bluetooth app to “sniff” BLE devices, but unless I know what their UUID/Major/Minor values represent, I cannot do anything with that information. Only the official Apple Store app knows what to do when it is near the beacons around the store (“Looks like you are looking for a new iPhone case.”).
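
To make the packet structure concrete, here’s a small sketch that parses the iBeacon fields out of an advertisement’s manufacturer-specific data, assuming the well-known Apple layout (company ID 0x004C, type 0x02, length 0x15, then UUID, Major, Minor, and a calibrated TX power byte). Capturing the raw bytes depends on your BLE stack, so that part is omitted.

```python
# Sketch: parse iBeacon fields from manufacturer-specific data. The
# 4-byte prefix (Apple company ID + iBeacon type/length) is well known;
# capturing the advertisement itself is left to your BLE stack.
import struct
import uuid

IBEACON_PREFIX = b"\x4c\x00\x02\x15"  # Apple 0x004C, type 0x02, len 0x15

def parse_ibeacon(mfg_data: bytes):
    """Return (uuid, major, minor, tx_power), or None if not an iBeacon."""
    if len(mfg_data) < 25 or not mfg_data.startswith(IBEACON_PREFIX):
        return None
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor, tx_power = struct.unpack(">HHb", mfg_data[20:25])
    return proximity_uuid, major, minor, tx_power

# Fabricated example: random UUID, Major 200 (building), Minor 42 (lab).
sample = IBEACON_PREFIX + uuid.uuid4().bytes + struct.pack(">HHb", 200, 42, -59)
print(parse_ibeacon(sample))
```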

As you can see, the iBeacon approach is a “push” method where the device proactively pushes actions to you. In contrast, the Physical Web beacon proposes to act as a “pull,” or on-demand, method.

Physical Web

The Physical Web gBeacon advertises a 28-byte packet containing an encoded URL. Google wants to use the familiar and established method of URLs to tell an application, or an OS, where to find information about physical objects. They plan to use context (physical and virtual) to rank what might be most important to you at the current moment and display it.

Image from https://github.com/google/physical-web/blob/master/documentation/introduction.md
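
And here’s a sketch of how those 28 bytes stretch back into a full URL, following the scheme-prefix and expansion tables from the early UriBeacon draft (the same tables Eddystone-URL later reused). Treat the tables as illustrative; check the project’s GitHub repo for the authoritative encoding.

```python
# Sketch: expand a UriBeacon-style compressed URL. Tables follow the
# early draft spec; verify against the Physical Web repo before relying
# on them.
SCHEMES = {0x00: "http://www.", 0x01: "https://www.",
           0x02: "http://", 0x03: "https://"}
EXPANSIONS = {0x00: ".com/", 0x01: ".org/", 0x02: ".edu/", 0x03: ".net/",
              0x04: ".info/", 0x05: ".biz/", 0x06: ".gov/",
              0x07: ".com", 0x08: ".org", 0x09: ".edu", 0x0A: ".net",
              0x0B: ".info", 0x0C: ".biz", 0x0D: ".gov"}

def decode_uri(encoded: bytes) -> str:
    """Turn the beacon's compressed URL bytes back into a readable URL."""
    url = SCHEMES.get(encoded[0], "")
    for b in encoded[1:]:
        url += EXPANSIONS.get(b, chr(b))  # expansion code or plain ASCII
    return url

print(decode_uri(b"\x02" + b"goo.gl/beacon"))  # -> http://goo.gl/beacon
```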

The Physical Web approach is designed to be a “pull” discovery service where the user will most likely initiate the interaction. For example, when I arrive at the Oracle campus, I can start an application that scans for nearby gBeacons, or I can open my Chrome browser and do a search. The application or browser will use context to rank nearby objects alongside search results. It can also use calendar data, email or Google Now to narrow down interests. A background process with “push” capabilities could also be implemented. This process could have filters that alert the user to nearby objects of interest. These interest rules could be predefined or inferred by using Google’s intelligence-gathering systems like Google Now.

The main difference between the two approaches is that iBeacon is a closed ecosystem (app+beacons+db) while the Physical Web is intended to be a publicly discoverable (app/os+beacons+www) physical extension of the web. That said, the Physical Web could also be restricted by using protected websites and encrypted URLs.

Both approaches account for the common misconception about these technologies: “Am I going to be spammed as soon as I walk inside a mall?” The answer is NO. iBeacon is an opt-in service within an app, and Physical Web beacons will mostly work on demand or through filtered subscriptions.

So there you have it. Which method do you prefer?

Oracle OpenWorld and JavaOne 2014 Cometh

Mon, 2014-09-22 11:28

This time next week, we’ll be in the thick of the Oracle super-conference, the combination of Oracle OpenWorld and JavaOne.

This year, our team and our larger organization, Oracle Applications User Experience, will have precisely a metric ton of activities during the week.

For the first time, our team will be doing stuff at JavaOne too. On Monday, Anthony (@anthonyslai) will be talking about the IFTTPi workshop we built for the Java team for Maker Faire back in May, and Tony will be showing those workshop demos in the JavaOne OTN Lounge at the Hilton all week.

If you’re attending either show or both, stop by, say hello and ask about our custom wearable.

Speaking of wearables, Ultan (@ultan) will be hosting a Wearables Meetup a.k.a. Dress Code 2.0 in the OTN Lounge at OpenWorld on Tuesday, September 30 from 4-6 PM. We’ll be there, and here’s what to expect:

  • Live demos of wearables proof-of-concepts integrated with the Oracle Java Cloud.
  • A wide selection of wearable gadgets available to try on for size.
  • OAUX team chatting about use cases, APIs, integrations, UX design, fashion and how you can use OTN resources to build your own solutions.

Update: Here are Bob (@OTNArchBeat) and Ultan talking about the meetup.

Here’s the list of all the OAUX sessions:

Oracle Applications Cloud User Experiences: Trends, Tailoring, and Strategy

Presenters: Jeremy Ashley, Vice President, Applications User Experience; Jatin Thaker, Senior Director, User Experience; and Jake Kuramoto, Director, User Experience

The Oracle Applications Cloud user experience design strategy is about simplicity, mobility, and extensibility. See what we mean by simplicity as we demo our latest cloud user experiences and show you only the essential information you need for your work. Learn how we are addressing mobility, by delivering the best user experience for each device as you access your enterprise data in the cloud. We’ll also talk about the future of enterprise experiences and the latest trends we see emerging in the consumer market. And finally, understand what we mean by extensibility after hearing a high-level overview of the tools designed for tailoring the cloud user experience. With this team, you will always get a glimpse into the future, so we know you will be inspired about the future of the cloud.

Session ID: CON7198
Date: Monday, Sept. 29, 2014
Time: 2:45 p.m. – 3:30 p.m.
Location: Moscone West – 3007

Learn How to Create Your Own Java and Internet of Things Workshop

Presenter: Anthony Lai, User Experience Architect, Oracle

This session shows how the Applications User Experience team created an interactive workshop for the Oracle Java Zone at Maker Faire 2014. Come learn how the combination of the Raspberry Pi and Embedded Java creates a perfect platform for the Internet of Things. Then see how Java SE, Raspi, and a sprinkling of user experience expertise engaged Maker Faire visitors of all ages, enabling them to interact with the physical world by using Java SE and the Internet of Things. Expect to play with robots, lights, and other Internet-connected devices, and come prepared to have some fun.

Session ID: JavaOne 2014, CON7056
Date: Monday, Sept. 29, 2014
Time: 4 p.m. – 5 p.m.
Location: Parc 55 – Powell I/II

Oracle HCM Cloud User Experiences: Trends, Tailoring, and Strategy

Presenters: Jeremy Ashley, Vice President, Applications User Experience, Oracle; Aylin Uysal, Director, Human Capital Management User Experience, Oracle

The Oracle Applications Cloud user experience design strategy is about simplicity, mobility, and extensibility. See what we mean by simplicity as we demo our latest cloud user experiences and show you only the essential information you need for your work. Learn how we are addressing mobility, by delivering the best user experience for each device as you access your enterprise data in the cloud. We’ll also talk about the future of enterprise experiences and the latest trends we see emerging in the consumer market. And finally, understand how you can extend with the Oracle tools designed for tailoring the cloud user experience. With this team, you will always get a glimpse into the future. Come and get inspired about the future of the Oracle HCM Cloud.

Session ID: CON8156
Date: Tuesday, Sept. 30, 2014
Time: 12:00 p.m. – 12:45 p.m.
Location: Palace – Presidio

Oracle Sales Cloud: How to Tailor a Simple and Efficient Mobile User Experience

Presenters: Jeremy Ashley, Vice President, Applications User Experience, Oracle; Killian Evers, Senior Director, Applications User Experience, Oracle

The Oracle Applications Cloud user experience design strategy is about simplicity, mobility, and extensibility. In this session, learn how Oracle is addressing mobility by delivering the best user experience for each device as you access your enterprise data in the cloud. Hear about the future of enterprise experiences and the latest trends Oracle sees emerging in the consumer market. You’ll understand what Oracle means by extensibility after getting a high-level overview of the tools designed for tailoring the cloud user experience, and you’ll also get a glimpse into the future of Oracle Sales Cloud.

Session ID: CON7172
Date: Wednesday, Oct. 1, 2014
Time: 4:30 p.m. – 5:15 p.m.
Location: Moscone West – 2003

Oracle Applications Cloud: First-Time User Experience

Presenters: Laurie Pattison, Senior Director, User Experience; and Mindi Cummins, Principal Product Manager, both of Oracle

So you’ve bought and implemented Oracle Applications Cloud software. Now you want to get your users excited about using it. Studies show that one of the biggest obstacles to meeting ROI objectives is user acceptance. Based on working directly with thousands of real users, this presentation discusses how Oracle Applications Cloud is designed to get your users excited to try out new software and be productive on a new release ASAP. Users say they want to be productive on a new application without spending hours and hours of training, experiencing death by PowerPoint, or reading lengthy manuals. The session demos the onboarding experience and even shows you how a business user, not a developer, can customize it.

Session ID: CON7972
Date: Thursday, Oct. 2, 2014
Time: 12 p.m. – 12:45 p.m.
Location: Moscone West – 3002

Using Apple iBeacons to Deliver Context-Aware Social Data

Presenters: Anthony Lai, User Experience Architect, Oracle; and Chris Bales, Director, Oracle Social Network Client Development

Apple’s iBeacon technology enables companies to deliver tailored content to customers, based on their location, via mobile applications. It will enable social applications such as Oracle Social Network to provide more relevant information, no matter where you are. Attend this session to see a demonstration of how the Oracle Social Network team has augmented the mobile application with iBeacons to deliver more-context-aware data. You’ll get firsthand insights into the design and development process in this iBeacon demonstration, as well as information about how developers can extend the Oracle Social Network mobile applications.

Session ID: Oracle OpenWorld 2014, CON8918
Date: Thursday, Oct. 2, 2014
Time: 3:15 p.m. – 4 p.m.
Location: Moscone West – 2005

Hope to see you next week.

Our Very Own Wearable

Wed, 2014-09-17 16:33

Noel (@noelportugal) and Raymond have been hard at work building a custom wearable, a.k.a. the secret OpenWorld project. The finished product is ready for a closeup.

The components are:

The Bean is an amazing little board: Arduino-compatible, with a Bluetooth Low Energy module, an RGB LED, and a 3-axis accelerometer.

I can’t tell you what we’re doing with this custom wearable, yet, but it will happen during OpenWorld. If you’ll be at the big show, OpenWorld or JavaOne, you’ll have a chance to see it in action and chat with the guys who built it.

Oh, and Noel will be writing up the details of the build, the story behind it and the journey, as well as all the nerdy bits. Stay tuned for that.

Autonomous Quadcopters Playing Some Catch

Wed, 2014-09-17 16:04

Tony recently went to a talk by Salim Ismail (@salimismail), the Founding Executive Director of Singularity University. He may or may not post his thoughts on the talk, which sounds fascinating, but this video is worth sharing either way, and not just because we have quadcopter fever.

Yeah, that’s autonomous flight. So refer to the list of horrifying things that should not be allowed.

Filler or Curated Content?

Wed, 2014-09-17 15:30

I consider these types of posts to be filler, but I suppose you could look at it as curated content or something highbrow like that. Take your pick.

10 Horrifying Technologies That Should Never Be Allowed

I scanned this post first, thought it would be interesting and left it to read later. Then I read it, and now, I’m terrified. Here’s the list, make sure to hit the link and read all about the sci-fi horrors that aren’t really sci-fi anymore.

  • Weaponized Nanotechnology
  • Conscious Machines
  • Artificial Superintelligence
  • Time Travel
  • Mind Reading Devices
  • Brain Hacking Devices
  • Autonomous Robots Designed to Kill Humans
  • Weaponized Pathogens
  • Virtual Prisons and Punishment
  • Hell Engineering

xkcd on watches

This is exactly how I feel about watches.

This is Phil Fish

I only know who Phil Fish is because I watched Indie Game: The Movie. This short documentary by Ian Danskin is quite good and is newsworthy this week thanks to Markus Persson’s reference to it in his post about why he’s leaving Mojang (h/t Laurie for sharing), the makers of Minecraft, after Microsoft completes its acquisition of the company.

I have often wondered why so many people hate Nickelback, and now I have a much better understanding of why, thanks to Ian. Embedded here for your viewing pleasure.

https://www.youtube.com/watch?v=PmTUW-owa2w

Wearables Should be Stylish

Tue, 2014-09-09 13:18

To no one’s surprise, Apple announced the Apple Watch today.

Very apropos because I just read Sandra Lee’s (@SandraLee0415) post over on Usable Apps about fashionable tech, one of Ultan’s (@ultan) main talking points about wearables.

Ultan, our wearables whisperer, has style and flair; if you’ve ever met him, you know this. His (and Sandra’s) point about wearable tech needing to be stylish is one that Apple has made, again, to precisely no one’s surprise. Appearance matters to people, and smartwatches and other wearables are accessories that should be stylish and functional.

The market has spoken on this. To the point: the Android Wear smartwatch people want is the round Moto 360, which sold out in less than a day earlier this week.

The Apple Watch looks very sleek, and if nothing else, the array of custom bands alone differentiates it from smartwatches like the Samsung Gear Live and the LG G, both of which are also glass rectangles, but with boring rubber wristbands.

I failed to act quickly enough to get a Moto 360 and settled instead on a Gear Live, which is just as well, given I really don’t like wearing watches. We’ve been building for the Pebble for a while now, and since the announcement of Android Wear earlier this year, we’ve been building for it as well, comparing the two watches and their SDKs.

Like Google Glass, the Gear Live will be a demo device, not a piece of personal tech. However, for Anthony, his Android Wear watch has replaced Glass as his smartphone accessory of choice. Stay tuned for the skinny on that one.

I haven’t read much about the Apple Watch yet, but I’m sure there will be coverage aplenty as people get excited for its release early in 2015. Now that Apple’s in the game, wearables are surely even more of a thing than they were yesterday.

And they’re much more stylish.

Find the comments.

On Disney Parks, Data Science, Drones and Wearables

Fri, 2014-09-05 09:16

As the parent of a toddler, I have no choice but to pay attention to Disney and its myriad of products and services.

Case in point, this Summer we took our daughter to Disneyland for the first time, which was a whole thing. Pause to h/t Disneyland expert, Friend of the ‘Lab and colleague Kathy for all her park and travel protips.

Being who I am, I found myself wandering around Disneyland and California Adventure thinking about how many hardcore analytics geeks they must employ to come up with systems like FASTPASS.

For the unfamiliar, FASTPASS is a system that allows you to skip some, if not all, of the line-standing for the most popular attractions in the parks. Although it’s difficult to explain in words, the system is rather simple once you get your first pass.

Being in the park, you can feel all the thought and craft that has gone into the experience. Disney is a $45 billion company, and it’s no surprise their R&D is cutting edge. But what makes it so successful?

Attendees of Disney parks are in a position very similar to that of employees of an enterprise, in that they will gladly opt in to new technologies because the value they receive in return is clear and quantifiable.

Put into examples, if Google Glass helps me do my job more effectively, I’ll wear them. If I receive discounted benefits for wearing a fitness tracker, I’ll do it.

If a MagicBand allows me to leave my wallet in my room, not worry about losing the room keycards, and use FastPass+, I’ll wear it, even though it will allow Disney World to track my location at a very fine-grained level. Who cares? FastPass+ is worth it, right?

Odd branding note, the official ways to write these two terms are indeed FASTPASS and FastPass+, according to Disney’s web site.

If you’re interested in reading more about the MagicBand, what’s inside it and how Disney uses it at Disney World, check out Welcome to Dataland. Imagine all the data science that goes into creating and iterating on these enormous data sets; this is embiggened Big Data when you consider that Disney parks occupied the top eight spots in the 2012 Theme Park Index, comprising well over 100 million visits.

It boggles my mind, although for someone like Bill, it would be Christmas every day.

The post also recounts Walt Disney’s futurist vision, which seems to drive their R&D today. It also encapsulates my point nicely:

Rather, because Disney’s theme parks don’t have the same relationship to reality that Google and Costco and the NSA do. They are hybrids of fantasy and reality.

I read Welcome to Dataland only because I’d just been to Disneyland myself. Then came news that Disney had filed several patents concerning the use of drones for its park shows: one for floating pixels, one for flying projection screens, and one for transporting characters, h/t Business Insider.

We’ve been experimenting (ahem, playing) with quadcopters, and it struck me that Ultan (@ultan) had sent me a Disney video about customized wearables. This one:

That was posted in August 2012.

So beyond casual interest as the father of a daughter who loves Disney Princesses, suddenly it’s obvious that I need to watch Disney much more carefully to see how they’re adopting emerging technologies.

Oh and become a willing data point in their data set.

Find the comments.