
Oracle AppsLab

Driving Innovation

Asteroid Hackathon – The Winning Team

Mon, 2014-11-17 09:57

Editorial Note: This is a guest post by friend of the ‘Lab and colleague DJ Ursal. Also be sure to check out our Hackathon entry here:


EchoUser (@EchoUser), in partnership with SpaceGAMBIT, Maui Makers, the Minor Planet Center, NASA, the SETI Institute, and Further by Design, hosted an Asteroid Hackathon. The event was in response to the NASA Grand Challenge, “focused on finding all asteroid threats to human populations and knowing what to do about them.”

I had a wonderful opportunity to participate in the Asteroid Hackathon last week. My team name was NOVA. Our team comprised four members: DJ Ursal, Kris Robison, Daniel Schwartz and Raj Krishnamurthy.

We were given live data from NASA and the Minor Planet Center site and had just five hours to put together a working prototype and solution to the asteroid big data problem. We created a web application (which works not only on your Mac or PC but also on your iPad and your latest Nexus 7 Android device) to help scientists, astronomers and anyone who is interested in asteroids discover, learn and share information in a fun and interactive way.


Our main theme was Finding Asteroids Before They Find Us. The goal was to help discover, learn and share asteroid information to increase awareness within the community. We created an interactive web app that let users apply chart filters to explore the risk of future impact with Earth, along with each asteroid's distance from Earth, absolute brightness and rotation. Users could click and drag on any chart to filter, combining filters across multiple dimensions to explore, discover interesting facts and share asteroid data with friends and the community. We also made use of Major Tom, the astronaut from David Bowie's song “Space Oddity,” which depicts an astronaut who casually slips the bonds of the world to journey beyond the stars. Users could post questions to Major Tom and could also play his song.
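The cross-filtering idea behind the app can be sketched in a few lines: each chart contributes a range filter, and a record survives only if it passes every active filter. This is a hypothetical sketch with made-up field names and sample values, not the actual Minor Planet Center data.

```python
# Sketch of the multidimensional chart-filtering idea behind the NOVA
# app. The record fields and sample values are hypothetical.

def cross_filter(asteroids, **ranges):
    """Keep records whose value falls inside every (lo, hi) range."""
    def keep(rec):
        return all(lo <= rec[field] <= hi for field, (lo, hi) in ranges.items())
    return [rec for rec in asteroids if keep(rec)]

asteroids = [
    {"name": "2014 AB", "distance_au": 0.05, "abs_magnitude": 22.1},
    {"name": "2014 CD", "distance_au": 1.30, "abs_magnitude": 18.4},
    {"name": "2014 EF", "distance_au": 0.02, "abs_magnitude": 25.0},
]

# Combine two "chart" filters: near Earth AND within a brightness band.
near_and_bright = cross_filter(
    asteroids,
    distance_au=(0.0, 0.1),
    abs_magnitude=(20.0, 24.0),
)
print([rec["name"] for rec in near_and_bright])  # → ['2014 AB']
```

Dragging on a chart in the real app would simply tighten one of these `(lo, hi)` ranges and re-run the filter against all the others.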

Strategically, the single most important element in winning this hackathon was team composition: having a team that works well together. Collaboration and communication were the two most critical skills demanded of all members, as time was limited and coordination was of the utmost importance.

Winning Team NOVA: DJ Ursal, Kris Robison, Daniel Schwartz, Raj Krishnamurthy

Asteroid Hackathon

Mon, 2014-11-17 09:49

A couple weeks ago Jeremy Ashley (@jrwashley), Bill Kraus, Raymond Xie and I participated in the Asteroid Hackathon hosted by @EchoUser. The main focus was “to engage astronomers, other space nerds, and the general public, with information, not just data.”


As you might already know, we here at the AppsLab are big fans of hackathons, as well as ShipIt days or FedEx days. The ability to get together, pool our collective minds and create something in a short amount of time is truly amazing. It also helps to keep us on our toes, technically and creatively.

Our team built what we called “The Daily Asteroid.” The idea behind our project was to highlight the profile of the asteroid making the closest approach to Earth on the current date, using near-Earth object (NEO) data; in other words, to show which asteroid is closest to Earth today. A user could “favorite” today's asteroid and start a conversation with other users about it on a social network like Twitter.
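The selection step itself is simple: given the day's NEO records, pick the one with the smallest miss distance. A minimal sketch, where the record layout loosely follows NASA's NeoWs feed but is simplified and the sample values are hypothetical:

```python
# Sketch of the "Daily Asteroid" selection: given today's near-Earth
# object (NEO) records, pick the one with the smallest miss distance.
# Field names and values are hypothetical simplifications.

def daily_asteroid(neos):
    return min(neos, key=lambda neo: neo["miss_distance_km"])

todays_neos = [
    {"name": "433 Eros",   "miss_distance_km": 26_000_000},
    {"name": "2014 UF56",  "miss_distance_km": 160_000},
    {"name": "2014 SC324", "miss_distance_km": 570_000},
]

print(daily_asteroid(todays_neos)["name"])  # → 2014 UF56
```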


We also added the ability to change the asteroid's properties (size, type, velocity, angle) and play out a scenario to see what damage it could cause if it hit the Earth. And to finish up, we created an Asteroid Hotline using Twilio (@twilio) that you can call to get the latest NEO info from your phone!
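For the hotline, Twilio answers an incoming call by fetching an XML “TwiML” document from your server and reading its Say verbs aloud. A minimal sketch of building that response with the standard library; the voice name and message wording are assumptions, not the demo's actual script:

```python
# Sketch of the TwiML an Asteroid Hotline might return to Twilio.
# The message text and voice are hypothetical.
import xml.etree.ElementTree as ET

def hotline_twiml(neo_name, miss_distance_km):
    response = ET.Element("Response")
    say = ET.SubElement(response, "Say", voice="alice")
    say.text = (
        f"Today's closest near Earth object is {neo_name}, "
        f"passing at {miss_distance_km} kilometers."
    )
    return ET.tostring(response, encoding="unicode")

print(hotline_twiml("2014 UF56", 160000))
```

In production this string would be served from the webhook URL configured on the Twilio phone number.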

We were lucky to be awarded 3rd place, or “Best Engagement,” and we had a blast doing it. Considering the small amount of time we had, we came out really proud of our results.

The Cloud UX Lab

Mon, 2014-11-10 09:57

There’s a post over on VoX about a new OAUX lab at Oracle HQ, the Cloud UX Lab.


Jeremy Ashley, VP, in the new lab, image used with permission.

Finished just before OOW in September, this lab is a showcase for OAUX projects, including a few of ours.

The lab reminds me of a spacecraft from the distant future, the medical bay or the flight deck. It’s a very cool place, directly inspired and executed by our fearless leader, Jeremy Ashley (@jrwashley), an industrial designer by trade.

I actually got to observe the metamorphosis of this space from something that felt like a doctor’s office waiting room into the new hotness. Looking back on those first meetings, I never expected it would turn out so very awesome.

Anyway, the reason why I got to tag along on this project is because our team will be filling the control room for this lab with our demos. Noel (@noelportugal) and Jeremy have a shared vision for that space, which will be a great companion piece to the lab and equally awesome.

So, if you’re at Oracle HQ, book a tour and stop by the new Cloud UX Lab, experience the new hotness and speculate on what Noel is cooking up behind the glass.

Pseudo-Philosophical Observations on Wearables, Part 1

Wed, 2014-11-05 11:53

Jawbone announced the Up3 today, reportedly its most advanced fitness tracker to date.

As with all fitness trackers, the Up3 has an accelerometer, but it also has sensors for measuring skin and ambient temperature, as well as something called bioimpedance. The data collected by the Up3 feed a new feature called Smart Coach.

You can imagine what the Smart Coach does. It sounds like a cool, possibly creepy, feature.

This post is not about the Up3.

This post is about my journey into the dark heart of the quantified self. The Up3 has just reminded me to coalesce my thoughts.

Earlier this year, I started wearing my first fitness tracker, the Misfit Shine. I happily wore it for about two months before the battery died, and then I realized it had control of me.

Misfit calculates activity based on points, and my personal goal of 1,000 points was relatively easy to reach every day, even for someone who works from home. What I realized quickly was that the Shine pushed me to chase points, not activity.


My high score.


The Shine uses its accelerometer to measure activity, so depending on where I wore it on my person, a run could be worth more points. This isn’t unique to the Shine. I’ve seen people spinning at the gym wearing their fitness trackers on their ankles.

As the weeks passed, I found myself avoiding activities that didn’t register a lot of points, definitely not good behavior, and even though my goal was 1,000 points, I avoided raising it for fear of missing my daily goal-achievement dopamine high.

Then, mid-summer, Misfit dropped an update that added some new game mechanics, and one day, my Shine app happily informed me that I’d hit my goal 22 days in a row.

This streak was the beginning of the end for me.

On the 29th day of my streak, the battery died. I replaced it, crisis averted, streak intact. Then, later that day, the Shine inexplicably died. I tried several new batteries and finally had to contact support.

All the while, I worried about my streak. I went to the gym, but it felt hollow and meaningless without the tangible representation, the coaching, as it were, from my Shine.

This is not a good look.

Misfit replaced my Shine, but in the days that elapsed, during my detox, I decided to let it go. Turns out the quantified self isn’t for obsessive, overly-competitive personality types like me.

And I’m not the only one in this group.

In September, I read an article called Stepping Out: Living the Fitbit Life, in which the author, David Sedaris, describes a similar obsession with his Fitbit. As I read it, I commiserated, but I also felt a little jealous of the level of his commitment. This dude makes me look like a rank amateur.

Definitely worth a read.

Anyway, this is not in any way meant to be an indictment of the Shine, Fitbit, Jawbone or any fitness tracker. Overall, these devices offer people a positive and effective way to reinforce healthy behavior and habits.

But for people like me, they lead to unanticipated side effects. As I read about the Up3, its sensors and Smart Coach, all of which sound very cool, I had to remind myself of the bad places I went with the Shine.

And the colloquial, functionally-incorrect but very memorable, definition of insanity.

In Part 2, when I get around to it, I’ll discuss the flaws in the game mechanics these companies use.

Find the comments.

Google Glass, Android Wear, and Apple Watch

Tue, 2014-10-28 15:43

I have both Google Glass and Android Wear (Samsung Gear Live, Moto 360), and oftentimes I wear them together. People always ask: “How do you compare Google Glass and Android watches?” Let me address a couple of viewpoints here. I would like to talk about the Apple Watch, but since it has not been officially released yet, let's just say that shape-wise it is square and looks like a Gear Live, and its features seem pretty similar to Android Wear, with the exception of an attempt at more playful colors and features. Let's discuss it more once it is out.



I was in the first batch of Google Glass Explorers and got my Glass in mid-2013. In the middle of this year, I first got the Gear Live, then later the Moto 360. I always find it peculiar that Glass is the older technology while Wear is the newer one. Should it not be easier to design a smart watch before glassware?

I do find a lot of similarities between Glass and Wear. The fundamental similarity is that both are Android devices. They are voice-input enabled and show you notifications. You may install additional Android applications to personalize your experience and maximize your usage. I see these as the true values of wearables.

Differences? Glass shows a lot of capabilities that Android Wear lacks at the moment. The things that probably matter most to people would be sound, phone calls, video recording, picture taking, a hands-free heads-up display, GPS and WiFi. Unlike Android Wear, Glass can be used standalone; Android Wear is only a companion gadget and has to be paired with a phone.

Is Glass superior, then? Android Wear does provide better touch-based interaction, compared to swiping on the side of the Glass frame. You can also play simple games like Flopsy Droid on your watch. Pedometers and heart-rate sensors are also commonly included. Glass, meanwhile, tends to overheat easily. Water resistance also plays a role here: you would almost never want to get your Glass wet at all, while Android Wear is water-resistant to a certain degree. And when you are charging your watch at night, it also serves as a bedside clock.


For me personally, although I have owned Glass longer than Wear, I have to say I prefer Android Wear over Glass for a couple of reasons. First, there is the significant price gap ($1,500 vs. $200). Second, especially when you add prescription lenses to Glass, it gets heavy and hurts the ear when worn for an extended period of time. Third, I do not personally find the additional features offered by Glass useful in my daily activities; I do not normally take pictures other than at specific moments or while I am traveling.

I also find that even though Glass is now publicly available within the US, it is still perceived as an anti-social gadget. The term is defined in Urban Dictionary as well. Most of the people I know who own Glass do not wear it themselves, for various reasons. I believe improving the marketing and advertising strategy for Glass may help.

Gadget preference is personal.  What’s yours?

Glorious Data Visualizations for Your Friday

Fri, 2014-10-24 09:00

If you’ve read here for more than a hot minute, you’ll know that I love me some data visualization.

This love affair dates back to when Paul (@ppedrazzi) pointed me to Hans Rosling’s (@hansrosling) first TED talk. I’m sure Hans has inspired an enormous number of people by now, judging by the 8 million plus views his TED talk has garnered. Sure, those aren’t unique views, but even so.

There’s an interesting meta-project: visualize the people influenced by various visualization experts, like a coaching tree or something.


Classic comic from xkcd, used under CC 2.5

Back on track, if you haven’t yet, watch the BBC documentary on him, “The Joy of Stats,” fantastic stuff, or if you have seen it, watch it again.

As luck would have it, one area of specialization of our newest team members is, wait for it, data visualization.

Last week, I got to see them in action in a full-day workshop on data visualization, which was eye-opening and very informative.

I’m hoping to get a few blog posts out of them on the subject, and while we wait, I wanted to share some interesting examples we’ve been throwing around in email.

I started the conversation with xkcd because, of course I did. Randall Munroe’s epic comic isn’t usually mentioned as a source for data visualizations, but if you read it, you’ll know that he has a knack for exactly that. Checking out the Google Image search for “xkcd data visualization” reminded me of just how many graphs, charts, maps, etc. Randall has produced over the years.

I also discovered that someone has created a D3 chart library as an homage to the xkcd style.

Anyway, two of my favorite xkcd visualizations are recent, possibly a function of my failing memory and not coincidence, Pixels and Click and Drag.

I probably spent 10 minutes zooming into Pixels, trying to find the bottom; being small-minded, I gave up pretty early on Click and Drag, assuming it was small. It’s not.

How much time did you spend, cough, waste, on these?

During our conversation, a couple interesting examples have come back to me, both worth sharing.

First is Art of the Title, dedicated to the opening credits of various films. In a very specific way, opening credits are data visualizations; they set the mood for the film and name the people responsible for it.

Second is Scale of the Universe, which is self-explanatory and addictive.

So, there you go. Enjoy investigating those two and watch this space for more visualization content.

And find the comments.

Mind Control?

Mon, 2014-10-13 16:37

Editor’s note: Hey look, a new author. Here’s the first post from Raymond Xie, who joined us nearly a year ago. You may remember him from such concept demos as geo-fencing or Pebble watchface. Raymond has been busy at work and wants to share the work he did with telekinesis. Or something, you decide. Enjoy.

You put on a headband, stare at a ball, tilt your head back-forth and left-right . . . the ball navigates through a simple maze, rushing, wavering, changing colors, and finally hitting the target.

That is the latest creation out of the AppsLab: the Muse Sphero Driver. When it was first shown at the OAUX Exchange during OOW, it amused many people, who would call it a “mind control” game.

The setup consists of  Muse – a brain-sensing headband, Sphero – a robotic ball, and a tablet to bridge the two.

Technically, it is your brainwave data (electroencephalography, or EEG) driving the Sphero, adjusting its speed and changing its color on a spectrum from RED to BLUE (RED: fast, active; BLUE: slow, calm), while your head gestures (3-D accelerometer, or ACC) control the direction of the Sphero's movement. Whether or not you call that “mind control” is up to your own interpretation.
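The speed and color mapping can be sketched as a single function from a normalized activity score to Sphero commands. This is a hypothetical sketch; the scaling constants and the way the EEG signal is reduced to one number are assumptions, not the demo's actual tuning.

```python
# Sketch of the EEG-to-Sphero mapping described above: a normalized
# "activity" score (0.0 = calm, 1.0 = active) becomes a speed and a
# color blended between blue (calm) and red (active). Constants are
# hypothetical.

def eeg_to_sphero(activity):
    activity = max(0.0, min(1.0, activity))  # clamp noisy input
    speed = int(activity * 255)              # more active -> faster
    red = int(activity * 255)
    blue = int((1.0 - activity) * 255)
    return speed, (red, 0, blue)

print(eeg_to_sphero(0.0))  # → (0, (0, 0, 255))   calm: slow, blue
print(eeg_to_sphero(1.0))  # → (255, (255, 0, 0)) active: fast, red
```

The ACC head-gesture data would feed a separate heading command in the same loop.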

You kind of drive the ball with your mind, but mostly with brainwave noise rather than conscious thought. It is still too early to derive accurate “mind control” from the EEG data of any regular person, for two reasons:

1. At the scalp level, EEG has a very poor signal-to-noise ratio;
2. The correlation between EEG signals and mental activity still needs to be established.

But it does open up a dialog in HCI, such as voice control vs. mind control (silence); or in robotics, where instead of asking the machine to “see” and “understand,” we can “see” and “understand” ourselves and impersonate the machine with our mind and soul.

While it is difficult to read out the “mind” (any mental activity) transparently, we think it is quite doable to map your mind onto certain states, and use those “states” as commands indirectly.

We may do something around this area. So stay tuned.

Meanwhile, you can start to practice yoga or Zen, to get a better signal-to-noise ratio and to set your mind into a certain state with ease.

Here We Grow Again

Mon, 2014-10-13 12:18

Cheesy title aside, the AppsLab (@theappslab) is growing again, and this time, we’re branching out into new territory.

As part of the Oracle Applications User Experience (@usableapps) team, we regularly work with interaction designers, information architects and researchers, all of whom are pivotal to ensuring that what we build is what users want.

Makes sense, right?

So, we’re joining forces with the Emerging Interactions team within OAUX to formalize a collaboration that has been ongoing for a while now. In fact, if you read here, you’ll already recognize some of the voices, specifically John Cartan and Joyce Ohgi, who have authored posts for us.

For privacy reasons (read, because Jake is lazy), I won’t name the entire team, but I’m encouraging them to add their thoughts to this space, which could use a little variety. Semi-related, Noel (@noelportugal) was on a mission earlier this week to add content here and even rebrand this old blog. That seems to have run its course quickly.

One final note, another author has also joined the fold, Mark Vilrokx (@mvilrokx); Mark brings a long and decorated history of development experience with him.

So, welcome everyone to the AppsLab team.

Did You See Our Work in Steve Miranda’s Keynote?

Fri, 2014-10-10 09:28

Last week at OpenWorld, a few of our projects were featured in Steve Miranda’s (@stevenrmiranda) keynote session.

Jeremy (@jrwashley) tweeted the evidence.


Debra (@debralilley) noticed too. I wasn’t able to attend the keynote, so I found out thanks to the Usable Apps (@usableapps) Storify, which chronicled “Our OpenWorld 2014 Journey.”

And today, I finally got to see the video, produced by Friend of the ‘Lab, Martin Taylor, who you might remember from other awesome videos like “A Smart Holster for Law Enforcement.”

Noel (@noelportugal) and Anthony (@anthonyslai) both play developers in the short film. Noteworthy: the expression on Noel’s face as he drives the Sphero ball with the Muse brain-sensing headband.

Thanks to Martin for making this video, thanks to Steve for including it in his keynote, and thanks to you for watching it.

ESP8266 – Cheap WiFi for your IoT

Thu, 2014-10-09 21:14

About a month ago, news broke of a new WiFi chip called the ESP8266 that costs about $5. This wireless system on a chip (SoC) took all the IoT heads (including me) by surprise. Until now, if you wanted to integrate WiFi into any DIY project, you had to use more expensive solutions. To put this into perspective, my first WiFi Arduino shield was about $99!


So I ordered a few of them (I think I’m up to 10 now!) and went to test the possibilities. I came up with a simple Instructable to show how you can log a room’s temperature to the cloud. I used an Arduino to do this, but one of the most amazing things about this chip is that you can use it standalone! Right now documentation is sparse, but I was able to compile the source code using a gcc compiler toolchain created by the new esp8266 community.
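When driven from an Arduino, the ESP8266's stock firmware speaks an AT-command protocol over serial: join a network, open a TCP connection, announce a payload length, send the payload, close. A sketch of that conversation for a temperature-logging request; the host, path, and form field are hypothetical:

```python
# Sketch of the AT-command conversation a microcontroller has with
# the ESP8266 (stock AT firmware) to post a temperature reading.
# The endpoint and form field are hypothetical.

def esp8266_log_commands(ssid, password, host, temp_c):
    body = f"temp={temp_c}"
    request = (
        "POST /update HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Content-Type: application/x-www-form-urlencoded\r\n"
        f"Content-Length: {len(body)}\r\n\r\n{body}"
    )
    return [
        f'AT+CWJAP="{ssid}","{password}"',   # join the WiFi network
        f'AT+CIPSTART="TCP","{host}",80',    # open a TCP connection
        f"AT+CIPSEND={len(request)}",        # announce payload size
        request,                             # the HTTP request itself
        "AT+CIPCLOSE",                       # close the connection
    ]

for line in esp8266_log_commands("myssid", "secret", "example.com", 21.5):
    print(line)
```

On the real board each command waits for an `OK` (or the `>` prompt before the payload) on the serial line before the next one is sent.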

But why is this important to you, even if you haven’t dabbled in DIY electronics? Well, this chip comes from China, and even though it doesn’t have an FCC stamp of approval (yet), it signals things to come. This is what I call the Internet of Things r(evolution). Prices of these chips are at a historical low, and soon we will see more and more products connecting to the Internet/cloud. From light switches and light bulbs to washing machines and dishwashers, anything that needs to be turned on or off could potentially have one of these. Anything that collects data, like thermostats or smoke detectors, could also potentially have one.

So, are you scared, or will you welcome our new internet overlords?