
Oracle AppsLab

Driving Innovation

IFTTT Easy Button

Sat, 2015-08-15 10:00


The Amazon Dash button is all the buzz lately. Regardless of whether you think it is the greatest invention or just a passing fad, it is a nice little IoT device. There is already work underway to make it run custom code.

There are a couple of crowdfunding projects (flic and btn) attempting to create custom IoT buttons as well, but these often come with a high price tag (around $100).

This is where the up-and-coming ESP8266 MCU can shine. For under $3 you get a Wi-Fi chip plus a programmable microcontroller. You just need to add a cheap button (like the Staples Easy Button, around $7). Add the good ol’ IFTTT Maker Channel and you will be set to go with your custom IoT button for about $10.
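
For the curious, the Maker Channel is just a webhook: any HTTP request to your trigger URL fires the recipe. Here is a minimal sketch of that request in Node.js, with a hypothetical event name and a placeholder key; on the actual ESP8266, the same GET would be issued from the board’s firmware.

```js
// Minimal sketch of an IFTTT Maker Channel trigger (event name and key
// are placeholders). Any HTTP GET to this URL fires the recipe.
var https = require('https');

var event = 'button_pressed';        // hypothetical event name
var key = 'YOUR_IFTTT_MAKER_KEY';    // from your Maker Channel settings

https.get('https://maker.ifttt.com/trigger/' + event + '/with/key/' + key,
  function (res) {
    console.log('IFTTT responded with HTTP ' + res.statusCode);
  }
).on('error', function (err) {
  console.error('Trigger failed: ' + err.message);
});
```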


Check out my project to learn how to make your own.


What Kids Tell Us about Touch and Voice

Fri, 2015-08-14 12:28

Recently, my four-year-old daughter and her little bestie were fiddling with someone’s iPhone. I’m not sure which parent had sacrificed the device for our collective sanity.

Anyway, they were talking to Siri. Her bestie was putting Siri through its paces, and my daughter asked for a joke, because that’s her main question for Alexa, a.k.a. the Amazon Echo.


Siri failed at that, and my daughter remarked something like “Our Siri knows the weather too.”

Thus began an interesting comparison of what Siri and “our Siri,” i.e. the Echo, can do, a pretty typical four-year-old topping contest. You know, mine’s better, no mine is, and so forth.

After resolving that argument, I thought about how natural it was for them to talk to devices, something that I’ve never really liked to do, although I do find talking to Alexa more natural than talking to Google Now or Siri.

I’m reminded of a post, which I cannot find, Paul (@ppedrazzi) wrote many years ago about how easily a young child, possibly one of his daughters, picked up and used an iPhone. This was in 2008 or 2009, early days for the iPhone, and the child was probably two, maybe three, years old. Wish I could find that post.

From what I recall, Paul mused on how natural touch was as an input mechanism for humans, as displayed by how a child could easily pick up and use an iPhone. I’ve seen the same with my daughter, who has been using iOS on one device or another since she was much younger.

I’m observing that speech is equally natural to her.

Kids provide great anecdotal research for me because they’re not biased by what they already know about technology.

When I use something like gesture or voice control, I can’t help but compare it to what I know already, i.e. keyboard, mouse, which colors my impressions.

Watching kids use touch and voice input, the interactions seem very natural.

This is obvious stuff that’s been known forever, but it took how long for someone, Apple, to get touch right? Voice is in an earlier phase, advancing, but not completely natural.

One point Noel (@noelportugal) makes about voice input is that having a wake word is awkward, i.e. “Alexa” or “OK Google,” but given privacy concerns, this is the best solution for the moment. Noel wants to customize that wake word, but that’s only incrementally better.

When commanding the Amazon Echo, it’s not very natural to say “Alexa” and pause to ensure she’s listening. My daughter tends to blurt out a full sentence without the pause, “Alexa tell us a joke” which sometimes works.

That pause creates awkward usability, at least I think it does.

Since its release, Noel has led the charge for Amazon Echo research, testing and hacking (lots of hacking) on our team, and we’ve got some pretty cool projects brewing to test our theories. I’ve been using it around my home for a while, and I’m liking it a lot, especially the regular updates Amazon pushes to enhance it, e.g. IFTTT integration, smart home control, Google Calendar integration, reordering items from Amazon and a lot more.

Amazon is expanding its voice investment too, providing Alexa as a service, VaaS or AVS as they call it.

I fully believe the not-so-distant future will feature touch and speech, and maybe gestures, at the glance and scan layers of interaction, with the old school keyboard and mouse for heavy duty commit interactions.

Quick review: glance, scan, commit is our strategic design philosophy. Check out Ultan (@ultan) explaining it if you need a refresher.

So, what do you think? Thank you Captain Obvious, or pump the brakes Jake?

Find the comments.

Biohacking, Here Come the Cyborgs

Tue, 2015-08-11 11:21

For me, 2015 has been the year of the quantified self.

I’ve been tracking my activity using various wearables: Nike+ Fuelband, Basis Peak, Jawbone UP24, Fitbit Surge, and currently, Garmin Vivosmart. I just set up Automatic to track my driving; check out Ben’s review for details. I couldn’t attend QS15, but luckily, Thao (@thaobnguyen) and Ben went and provided a complete download.

And, naturally, I’m fascinated by biohacking because, at its core, it’s the same idea, i.e. how to improve/modify the body to do more, better, faster.


Professor Kevin Warwick of the University of Reading

Ever since I read about RFID chip implanting in the early 00s, I’ve been curiously observing from the fringe. This post on the Verge today included a short video about biohacking that was well worth its 13 and a half minutes.

If you like that, check out the long-form piece, Cyborg America: inside the strange new world of basement body hackers.

This stuff is fascinating to me. People like Kevin Warwick and Steve Mann have modified themselves for the better, but I’m guessing the future of biohacking lies in healthcare and military applications, places where there’s big money to be made.

My job is to look ahead, and I love doing that. At some point during this year, Tony asked me what the future held; what were my thoughts on the next big things in technology.

I think the human body is the next frontier for technology. It’s an electrical source that could solve the modern battery woes we all have; it’s an enormous source for data collection, and you can’t forget it in a cab or on a plane. At some point, because we’ll be so dependent on it, technology will become parasitic.

And I, for one, welcome the cyborg overlords.

Find the comments.

Jeremy and Noel Talk IoT at Kscope15

Mon, 2015-08-10 10:42

By now, you know all about the Scavenger Hunt we ran at Kscope15 in partnership with our good friends at ODTUG and YCC.

Noel (@noelportugal) talked about the technical bits in a post last week, and today, ODTUG posted an interview featuring our fearless leader, Jeremy Ashley (@jrwashley), and Noel from the conference wherein they talk about Internet of Things (IoT) and the IoT bits included in the Hunt.

If you read here, you’ll know that IoT has been a long-time passion of Noel’s, dating back to well before Internet-connected devices were commonplace and way before they had an acronym.

Thanks to ODTUG for giving us the opportunity to do something cool and fun using our nerdy passion, IoT.

Guerrilla Testing at OHUG

Mon, 2015-08-10 02:52

The Apple Watch came out, and we had a lot of questions: What do people want to do on it? What do they expect to be able to do on it? What are they worried about? And more importantly, what are they excited about?

But we had a problem—we wanted to ask a lot of people about the Apple Watch, but nobody had it, so how could we do any research?

Our solution was to do some guerrilla testing at the OHUG conference in June, which took place in Las Vegas. We had a few Apple Watches at that time, so we figured we could let people play around with the watch, and then ask them some targeted questions. This was our first time running a study like this, so we weren’t sure how hard it would be to get people to participate by just asking them while they were at the conference.

It turned out the answer was “not very.” We should have known—people both excited and skeptical were curious about what the watch was really like.


Friend of the ‘Lab and Oracle ACE Director Gustavo Gonzalez and Ben enjoy some Apple humor.

Eventually we had to tell the people at our recruiting desk to stop asking attendees if they wanted to participate! Some sessions went on for over 45 minutes, with conference attendees chatting about different possibilities and concerns, brainstorming use cases that would work for them or their customers.


The activity was a great success, generating some valuable insights not only about how people would like to use a smartwatch (Apple or not), but how they want notifications to work in general. Which, of course, is an important part of how people get their work done using Oracle applications.


Our method was pretty simple: We had them answer some quick survey questions, then we put the watch on them and let them explore and ask questions. While they were exploring, we sent them some mock notifications to see what they thought, and then finished up by asking them in more depth about what they want to be able to accomplish with the watch.

At the end, they checked off items from a list of notifications that they’d like to receive on the watch. We recorded everything so we didn’t have to have someone taking notes during the interviews. It took some time to transcribe everything, but it was extremely valuable to have actual quotes bringing to life the users’ needs and concerns with notifications and how they want things to work on a smartwatch.

Most usability activities we run at conferences involve 5–10 people, whether it’s a usability test or a focus group, and usually they all have similar roles. It was valuable here to get a cross-section of people from different roles and levels of experience, talking about their needs for not only a new technology, but also some core functionality of their systems.

In retrospect, we were a little lucky. It would probably be a lot more difficult to talk to the same number of people for an appreciable amount of time just about notifications, and though we did learn a good deal about wants and needs for developing for the watch, it was also a lot broader than that.

So one takeaway is to find a way to take advantage of something people will be excited to try out—not just in learning about that specific new technology, but other areas that technology can impact.

Game Mechanics of a Scavenger Hunt

Fri, 2015-08-07 13:17

This year we organized a scavenger hunt for Kscope15 in collaboration with the ODTUG board and YCC.

As we found out, scavenger hunts are a great way to get people to see your content, create buzz, and have fun along the way. We also used the scavenger hunt as a platform to try some of the latest technologies. The purpose was to have conference attendees complete tasks using Internet of Things (IoT) devices, Twitter hashtags, and pictures, and compete for a prize.

Here is a short technical overview of the technologies we used.


Registration

We opted to use a Node.js back-end and a React front-end to do a clever Twitter name autocomplete. As you typed your Twitter handle, the first and last name fields were completed for you. Once you filled in all your information, the form was submitted to a REST endpoint built on Oracle APEX. This piece was built by Mark Vilrokx (@mvilrokx), and we were all very happy with the results.
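
To illustrate the idea (a sketch, not Mark’s actual code), an Express endpoint could proxy Twitter’s users/search API to resolve a typed handle into a real name for pre-filling the form; the route and the use of the twit library are my assumptions:

```js
// Hypothetical autocomplete endpoint: look up a partial Twitter handle
// and return matching handles plus real names for the name fields.
var express = require('express');
var Twit = require('twit');

var T = new Twit({
  consumer_key: '...', consumer_secret: '...',
  access_token: '...', access_token_secret: '...'
});

var app = express();

app.get('/autocomplete/:handle', function (req, res) {
  T.get('users/search', { q: req.params.handle, count: 5 },
    function (err, users) {
      if (err) { return res.status(500).json({ error: err.message }); }
      res.json(users.map(function (u) {
        return { handle: u.screen_name, name: u.name };
      }));
    });
});

app.listen(3000);
```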


Smart Badge

We researched two possible technologies: Bluetooth Low Energy (BLE) beacons and Near Field Communication (NFC) stickers, and settled on NFC. The reason behind choosing NFC was the natural tendency we have to touch something (the NFC scanner) and get something in return (notifications + points). When we tested with BLE beacons, the “check-in” experience was more transparent but not as obvious when trying to complete a task.

We added an NFC sticker to all scavenger hunt participants’ badges so they could get points by scanning their badge on our Smart Scanners. To provision each NFC badge, we built an Android app that took the tag ID and associated it with the user profile.


Smart Scanner

The Smart Scanner was a great way to showcase IoT. We used the beloved Raspberry Pi to host an NFC reader, and the awesome blink(1) USB LED light to indicate whether the scan was successful or not. We also added a mini USB Wi-Fi dongle and a high-capacity battery to ensure complete freedom from wires.

Raymond Xie (@YuhuaXie) did a great job using Java 8 to read the NFC stickers and send the information to our REST server. The key part for these scanners was creating a failover system in case of an internet disconnection. In that case, the scanner would still read and register the NFC tag, then post it to our server as soon as connectivity was restored.
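
The store-and-forward idea is simple enough to sketch. The real scanners ran Java 8; this hypothetical Node.js version (host and path are placeholders) queues every scan locally and drains the queue whenever the server is reachable:

```js
// Failover sketch: always register the scan locally, then deliver
// queued scans in order whenever connectivity allows.
var http = require('http');
var queue = [];

function onTagScanned(tagId) {
  queue.push({ tagId: tagId, scannedAt: Date.now() });
}

function postToServer(scan, cb) {
  var req = http.request({
    host: 'hunt.example.com', path: '/api/scans', method: 'POST',
    headers: { 'Content-Type': 'application/json' }
  }, function (res) {
    cb(res.statusCode === 200 ? null : new Error('HTTP ' + res.statusCode));
  });
  req.on('error', cb);
  req.end(JSON.stringify(scan));
}

function drainQueue() {
  if (queue.length === 0) { return; }
  postToServer(queue[0], function (err) {
    if (err) { return; }   // still offline; retry on the next interval
    queue.shift();         // delivered; try the next queued scan
    drainQueue();
  });
}

setInterval(drainQueue, 5000); // retry every few seconds
```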


Twitter and SMS Bots

Another key component was creating Twitter and SMS bots. Once again, Mark used Node.js to consume the Twitter stream. We looked for tweets mentioning #kscope15 and #taskhashtag, then posted to our REST server, which made sure that points were given to the right person for the right task. Again we were pleasantly surprised by the flexibility and power of Node.js. Similarly, we deployed a Twilio SMS server that listened for SMS subscriptions and sent SMS notifications.
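
A rough sketch of the Twitter half using the twit library; the task-hashtag check and the awardPoints helper are hypothetical stand-ins for the actual scoring logic:

```js
// Watch the public stream for #kscope15 and award points when a tweet
// also carries a task hashtag.
var Twit = require('twit');
var T = new Twit({
  consumer_key: '...', consumer_secret: '...',
  access_token: '...', access_token_secret: '...'
});

function awardPoints(handle, task) {
  // Hypothetical helper: POST { handle, task } to the REST server.
  console.log('award points to @' + handle + ' for ' + task);
}

var stream = T.stream('statuses/filter', { track: ['#kscope15'] });

stream.on('tweet', function (tweet) {
  var tags = tweet.entities.hashtags.map(function (h) {
    return h.text.toLowerCase();
  });
  if (tags.indexOf('taskhashtag') !== -1) {  // placeholder task hashtag
    awardPoints(tweet.user.screen_name, 'taskhashtag');
  }
});
```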


Clients

We didn’t settle for just a web client to keep track of points. We created a mobile web client (React), an iOS app, and an Android app. This was part of our research to see how people used each platform. As a bonus, we created Apple Watch and Android Wear companion apps. One of the challenges we had was creating a similar experience across platforms.



Admin

We needed a way to manage all task and player administration. Since we used APEX and PL/SQL to create our REST interface, it was a no-brainer to use APEX for our admin front-end. The added bonus was that APEX has user authentication and session management, so all we had to do was create admin users with different roles.



Creating a scavenger hunt for a tech conference is no easy task. You have to take many factors into consideration, from picking the right tasks for the conference attendees to having an optimal Wi-Fi connection. An easy registration and provisioning process is also paramount for uptake.

We really had fun using the latest technologies, and we feel we successfully showcased what good UX can do for you across different devices and platforms. Stay tuned to see if we end up doing another similar activity. You won’t want to miss it!

Royal High School Students Visit Oracle

Thu, 2015-08-06 20:29

Last week a group of high school students from Royal High School visited Oracle Headquarters in Redwood Shores, California.

Royal High School, a public school in Simi Valley, California, is launching an International Business Pathway program. This program is part of California’s Career Pathways Trust (CCPT), which was established in 2013 by the California State Legislature to better prepare students for the 21st Century workplace.

The goal of the visit was to introduce students to real life examples of what they will be studying in the year ahead, which include Business Organization and Environment, Marketing, Human Resources, Operations, and Finance.

I was honored to be invited to be on a career panel with three other Oracle colleagues and share our different careers and career paths.

L-R Chris Kite, VP Finance A&C/NSG; Jessica Moore, Sr Director Corporate Communications; Thao Nguyen, Director Research & Design; Kym Flaigg, College Recruiting Manager

While Oracle is known as a technology company, it comprises many different functional areas beyond engineering. The panel shared our diverse backgrounds and education, our different roles within the organization, the different cultures within Oracle, and more.

Since these are students in an international business program, we also discussed Oracle as a global business. The panelists shared our individual involvement and impact on Oracle’s international business – from working with Oracle colleagues located throughout the world to engaging with global customers, partners, and journalists.

By the end, the students had heard stories of our professional and personal journeys to where we are now. The common themes: be authentic and true to yourself, change is inevitable, and learning is lifelong. All of the panelists started on one path but ultimately found new interests and directions.

The students learned there are many different opportunities in companies and many different paths to achieve career and life goals. Bring your passion to work and you’ll succeed.

On a personal note, I grew up in the same area as these students, the San Fernando Valley in Southern California. I moved from the San Fernando Valley to the Silicon Valley years ago, but thanks to Oracle Giving, I am able to give back to my roots and am proud to participate in Oracle’s community outreach.

Usability of Text Analytics

Wed, 2015-08-05 15:44

User experience design as a career grew up largely in the era of the GUI. Thus most people in my profession are visual thinkers, if not by birth then by experience. When it comes to presenting information, we think visualization. Times are changing, and with that we are challenged to present information verbally. This is where text analytics meets UX. I have only worked on a handful of projects about text, and only with a handful of text technologies, but the experience has been worth mentioning.

Text analytics, more or less meaning the same as text mining, is “devising of patterns and trends from text through means such as statistics…” (Oh, Wikipedia!)

There are many areas of text analytics – text summarization, information retrieval, sentiment analysis, named entity recognition, and so on. The tools and techniques are constantly getting better; it is exciting. I get the impression that text mining companies are intoxicated with the coolness of the technologies they build, so they think of the technology first and of possible industry applications later. As I am conditioned to think in the opposite direction, it was interesting for me to see how the same technique can be so useful in one case and completely irrelevant in another.

Here is my use case inventory. Take a brand manager versus a sales representative. A brand manager might like daily sentiment analysis of her brands and those of her competitor. On the other hand, the sales representatives we have interviewed are not at all into sentiment analysis. What they look for is highly tuned searches that would brief them daily on what’s happening with their top clients. They also search for industry news that they can retweet with a hope to influence the clients. A money manager might need to use text analytics to contextualize the jump in a stock price, while a marketer would rather have a predictive text mining tool to target customers for a purchasing recommendation. I often research different design topics and am interested in text analytics that would make me see at a glance what a collection of papers or articles is about. I also like to see daily summaries of trending topics in design and technology.

So the first lesson I’ve learned is how all text analytics use cases are different.

The second lesson is how the devil is in the details.

For one of my projects, I wanted to have a condensed representation of press coverage for the new release of HCM applications, specifically its user experience. For my purposes, I wanted to have it as a cloud of words. I collected a number of press releases and reviews and fed them through four text analytics tools I could get my hands on, namely Semantria, Open Calais, TagCrowd, and Oracle’s own Social Relationship Management (SRM) Listen and Analyze.

Here are the results.


For the fairness of the comparison, I stripped the lists of their original formatting (the products have drastically different interfaces) and limited the results to 20 items. Moreover, some packages categorize the results into “themes,” “entities,” etc., so I kind of had to either pick or merge. SRM doesn’t allow me to feed it a corpus of text to analyze, so I had to create a search query about OAUX instead.

You can see that the differences are dramatic. I believe some differences are the result of subtle choices made by the product designers – frequency thresholds, parts of speech included, the choice of one-, two- or three-word phrases, etc. Other differences are the result of the actual algorithms beneath – bag of words, word vectors, neural nets, skip-grams, chaining, deep learning, and so on. At first, I was determined to figure them all out. I quickly realized that there is no way I could get through the math of it all. So I decided to approach it the way one tastes chocolate: if I like the taste, I’ll make an effort to read the ingredients.

Semantria I liked the most. I liked the combination of themes and entities; I thought the length of the phrases was well balanced. I read the ingredients. Instead of plain word frequencies, Semantria uses something called “lexical chaining” to score themes. “The algorithm takes context and noun-phrase placement into account when scoring themes.” I put “lexical chaining” high on my list of likes.

OpenCalais looked totally solid, though heavy on terms and nouns, and light on themes and adjectives. This is no surprise, as Named Entity Recognition is OpenCalais’ core competency, and there it is unsurpassed. The new “generic relations” feature, in the shape of a “subject-predicate-object,” is amazing.

TagCrowd’s result was definitely too plain to represent what the collection is about. This is a very simple, well-meaning word frequency tool, with stop-word removal (“the” and “a” and such) being its only “lexical analysis” feature. From TagCrowd I learned that word frequencies can take you only so far.
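
To make that point concrete, here is roughly all a frequency-only tool does, sketched in Node.js (the stop list is abbreviated). No context, no phrases, no themes; just counting:

```js
// Bare-bones word-frequency "analysis": lowercase, tokenize, drop stop
// words, count, and keep the top N.
var STOP_WORDS = ['the', 'a', 'an', 'and', 'of', 'to', 'in', 'is', 'it'];

function topWords(text, n) {
  var counts = {};
  text.toLowerCase()
    .split(/[^a-z]+/)
    .filter(function (w) { return w && STOP_WORDS.indexOf(w) === -1; })
    .forEach(function (w) { counts[w] = (counts[w] || 0) + 1; });
  return Object.keys(counts)
    .sort(function (a, b) { return counts[b] - counts[a]; })
    .slice(0, n);
}

console.log(topWords('The new release of the HCM user experience ...', 20));
```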

Finally, there is SRM. SRM uses latent semantic analysis, which is a vector-based technique.

And what’s your favorite?


Tue, 2015-08-04 21:56

In my previous post I argued that the hunt is on for a better way to code, a way more suited for a designer’s need to test new interactions. I said I wanted a process less like solving a Rubik’s cube and more like throwing a pot. What does this actually mean?

“I want to grab a clump of clay and just continuously shape it with my hands until I am satisfied.”

There are two key concepts here: “continuously shape” and “with my hands.”

Code that is continuously shaped is called reactive programming. A familiar example is the spreadsheet: change a single cell and the rest of the sheet automatically updates. There is no need to write a series of instructions and then “run” them to see what happens; instead every change you make instantly affects the outcome.
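
Here is the spreadsheet idea in miniature, sketched in plain JavaScript rather than any particular framework: a derived cell recomputes the instant any of its inputs changes, so there is no separate “run” step.

```js
// A tiny reactive "cell": setting a value notifies every dependent.
function cell(value) {
  var subscribers = [];
  return {
    get: function () { return value; },
    set: function (v) {
      value = v;
      subscribers.forEach(function (notify) { notify(); });
    },
    onChange: function (notify) { subscribers.push(notify); }
  };
}

// A derived cell recomputes whenever any input cell changes.
function derived(inputs, compute) {
  var out = cell(compute());
  inputs.forEach(function (c) {
    c.onChange(function () { out.set(compute()); });
  });
  return out;
}

var a = cell(2), b = cell(3);
var sum = derived([a, b], function () { return a.get() + b.get(); });

a.set(10);              // edit one "cell"...
console.log(sum.get()); // ...and the dependent value is already 13
```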

“With my hands” refers to a kinesthetic or visuospatial style of thinking which leverages our ability to perceive and manipulate spatial relationships. Traditional programming languages are frustrating for visual thinkers; they rely on a phonological style which uses hands only to type and eyes only to read.

In theory, any written language can instead be represented as a collection of elements arranged and connected in space; this is the idea behind visual programming languages. Instead of typing instructions, you drag objects around and connect them together to express ideas.


Clockwise from upper left: Origami (Quartz Composer), Coral, Scratch, Form

The image above includes some typical examples. Block style IDEs (e.g. Scratch) let you snap together commands like Lego bricks. The others let you drag boxes around and string wires between them.

I think it’s easy to see at a glance the problem with this approach: it doesn’t scale. Stringing wires or snapping bricks gets really messy really fast. Reaching elbow deep into a rat’s nest of wires is not anything like shaping clay.

But it doesn’t have to be this bad. The problem these examples have is that, although visual, they slavishly adhere to an imperative style of coding where instructions are listed in order and even the words within each instruction must follow a specific syntax. This forces connections into arbitrary knots and loops, creating more tangles and going against the overall flow. A visual style demands a simpler, more fluid kind of logic.

Enter an old idea in computer science which has seen a recent resurgence: functional programming. In place of a sequence of instructions which focus on how to do things, functional programming languages use chains of transformations that focus on the desired result at each point. Loops are banished and each node can have only one output so everything naturally flows in the same direction. A classic example is Lisp; a more modern functional language now gaining traction is Clojure.  Don’t be scared.
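
A small JavaScript example of the shift in style: the imperative version is a sequence of instructions mutating a running total, while the functional version is a one-way chain of transformations, each step with a single output.

```js
var prices = [5, 12, 8, 30, 17];

// Imperative: how to do it, step by step, mutating state.
var total1 = 0;
for (var i = 0; i < prices.length; i++) {
  if (prices[i] > 10) {
    total1 += prices[i] * 1.2;
  }
}

// Functional: what the result is at each point, flowing one way.
var total2 = prices
  .filter(function (p) { return p > 10; })
  .map(function (p) { return p * 1.2; })
  .reduce(function (sum, p) { return sum + p; }, 0);

console.log(total1, total2); // both ~70.8, computed identically
```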

So what we need is a functional reactive programming language with a responsive, fun-to-use visual IDE, designed specifically for artists. Extra bonus points if it includes natural scrubbing interactions for setting values à la Bret Victor.

Meet NodeBox. NodeBox is an open source, cross-platform GUI originally developed for generative artists. I first encountered it at the OpenVis conference in 2013. The video of that presentation is a great introduction; you can skip to 22:00 to see a demo of NodeBox in action which shows how quickly and easily you can shape a visualization. This is what I mean by shaping clay.


A simple NodeBox network: Recursive Pentagons

This NodeBox “network” draws a set of nested pentagons. The structure is so simple you can see how it works just by looking at it. Make a pentagon node, color it, hook it to a “nextChild” subnetwork that makes a smaller copy, repeat three more times, then combine all five pentagons into a single display.

You can double-click on any node to render it on the main screen; a white triangle in the lower right corner indicates the currently rendered node. You can then single-click any other node to adjust its parameters – in this case the original pentagon node. By scrubbing (dragging the mouse across) the radius field I can increase or decrease its size; making the top pentagon bigger will automatically make all its children bigger. In this way I can quickly scrub values to get the result I want.


A NodeBox network which can draw itself

Another (somewhat mind-bending) example: a NodeBox network which can draw itself. On the right is a set of nodes that opens a JSON file, analyzes the contents, and plots it as a series of rectangles and connecting lines. On the left is what happens when that JSON file happens to contain this network’s own structure (taken directly from its .ndbx file).

I’ve been playing with NodeBox for about six months now and have created over forty networks which let me play with and try out various visualizations and data-driven animations. I find that some things which are easy to do in other languages are hard to do in NodeBox (or just hard for me to figure out how to do). But the reverse is also true: some things that are difficult or time-consuming to do in any other language are spectacularly easy in NodeBox.

Debugging, in particular, is much less time-consuming and almost fun. I catch most bugs instantly since every change I make is instantly rendered. When something unexpected does happen I can just click on each node in turn to follow the steps of a process. When something is too big or too small or in the wrong place I can simply scrub a parameter or even just grab the offending object and drag it where it needs to go.

Scaling up to large projects is manageable, but remains problematic. If you think clearly enough you can encapsulate everything into a handful of subnetworks and sub-subnetworks. But this can only go so far. NodeBox’s functional approach eliminates “side effects”: a change made to one function cannot affect distant functions unless those two functions are physically linked. This prevents the nasty hard-to-trace bugs which plague procedural languages, but it also means there are no global variables, which in turn means that if you want a variable to affect twenty different functions, you will need to create at least twenty separate links.

You can alleviate this somewhat by using Null nodes as cable ties. If two clumps of nodes have many interlinkages, you can physically separate them, lay one cable across the void to a Null node, and then distribute its output from there. After I get something working in NodeBox I usually spend some more time “tidying up,” rearranging nodes into related clumps and positioning nodes to reduce the number of crossing lines. I regard this not as a nuisance, but as a pleasant, almost meditative ritual that helps me optimize my code.

NodeBox does have one major limitation: it doesn’t do input. It was designed to produce intricate still images and animations, not to facilitate end user interactions. So there are no input fields, no buttons, no sliders, no checkboxes – no way to create a standalone interactive prototype. These things could all be done in theory, it’s just that NodeBox does not currently provide any *nodes* to do them.

This is ironic because the NodeBox IDE itself is richly interactive. Its vector-based ZUI (zoomable user interface) is a joy to use. So as a designer I can experience wonderful interactions by scrubbing node parameters and zooming in and out, but I can’t create a similar experience for my end users.

My use of NodeBox, therefore, is limited to creating sketches and animations. This is no small thing – it allows me to play and try and then convey the essence of ideas which are inherently hard to test and demonstrate. But for now I will still have to move to other languages if I need to create stand-alone interactives.

I think the deeper value of NodeBox is that it shows what is possible. There are better ways of imagining, better ways of coding. If we hope to create ever better experiences for our users, we need to keep searching for these better ways.

Better Ways to Play and Try

Tue, 2015-07-28 20:50
  • Fact 1: Dazzling animated displays (sprites, shaders, parallax, 3D) are more plentiful and easier to make than ever before.
  • Fact 2: More natural and expressive forms of input (swiping, pinching, gesturing, talking) are being implemented and enhanced every day.
  • Fact 3: Put these two together and the possible new forms of human computer interaction are endless. The only limit is our imagination.

That’s the problem: our imagination. We can’t build new interactions until A) someone imagines them, and B) the idea is conveyed to other people. As a designer in the Emerging Interactions subgroup of the AppsLab, this is my job – and I’m finding that both parts of it are getting harder to do.

If designers can’t find better ways of imagining – and by imagining I mean the whole design process from blank slate to prototype – progress will slow and our customers will be unable to unleash their full potential.

So what does it mean to imagine and how can we do it better?

Imagination starts with a daydream or an idle thought. “Those animations of colliding galaxies are cool. I wonder if we could show a corporate acquisition as colliding org charts. What would that look like?”

What separates a mere daydreamer from an actual designer is the next step: playing. To really imagine a new thing in any meaningful way you have to roll up your sleeves and actually start playing with it in detail. At first you can do this entirely in your mind – what Einstein called a “thought experiment.”  This can take weeks of staring into space while your loved ones look on with increasing concern.

Playing is best done in your mind because your mind is so fluid. You can suspend the laws of physics whenever they get in the way. You can turn structures inside out in the blink of an eye, changing the rules of the game as you go. This fluidity, this fuzziness, is the mind’s greatest strength, but also its greatest weakness.

So sooner or later you have to move from playing to trying. Trying means translating the idea into a visible, tangible form and manipulating it with the laws of physics (or at least the laws of computing) re-enabled. This is where things get interesting. What was vaguely described must now be spelled out. The inflexible properties of time and space will expose inconvenient details that your mind overlooked; dealing with even the smallest of these details can derail your entire scheme – or take it wild, new directions. Trying is a collaboration with Reality.

Until recently, trying was fairly easy to do. If the thing you were inventing was a screen layout or a process flow, you could sketch it on paper or use a drawing program to make sure all the pieces fit together. But what if the thing you are inventing is moving? What if it has hundreds of parts each sliding and changing in a precise way? How do you sketch that?

My first step in the journey to a better way was to move from drawing tools like Photoshop and OmniGraffle to animation tools like Hype and Edge – or to Keynote (which can do simple animations). Some years ago I even proposed a standard “animation spec” so that developers could get precise frame-by-frame descriptions.

The problem with these tools is that you have to place everything by hand, one element at a time. I often begin by doing just that, but when your interface is composed of hundreds of shifting, spinning, morphing shapes, this soon becomes untenable. And when even the simplest user input can alter the course and speed of everything on the screen, and when that interaction is the very thing you need to explore, hand drawn animation becomes impossible.

To try out new designs involving this kind of interaction, you need data-driven animation – which means writing code. This is a significant barrier for many designers. Design is about form, color, balance, layout, typography, movement, sound, rhythm, harmony. Coding requires an entirely different skill set: installing development environments, converting file formats, constructing database queries, parsing syntaxes, debugging code, forking githubs.

A software designer needs a partial grasp of these things in order to work with developers. But most designers are not themselves coders, and do not want to become one. I was a coder in a past life, and even enjoy coding up to a point. But code-wrangling, and in particular debugging, distracts from the design process. It breaks my concentration, disrupts my flow; I get so caught up in tracking down a bug that I forget what I was trying to design in the first place.

The next stage of my journey, then, was to find relatively easy high-level programming languages that would let me keep my eye on the ball. I did several projects using Processing (actually Processing.js), a language developed specifically for artists. I did another project using Python – with all coding done on the iPad so that I could directly experience interactions on the tablet with every iteration of the code.

These projects were successful but time-consuming and painful to create. Traditional coding is like solving a Rubik’s Cube: twist and twist and twist until order suddenly emerges from chaos. This is not the way I want to play or try. I want a more organic process, something more like throwing a pot: I want to grab a clump of clay and just continuously shape it with my hands until I am satisfied.

I am not the only one looking for better ways to code. We are in the midst of an open source renaissance, an explosion of literally thousands of new languages, libraries, and tools. In my last blog post I wrote about people creating radically new and different languages as an art form, pushing the boundaries in all directions.

In “The Future of Programming,” Eric Elliott argues for reactive programming, visual IDEs, even genetic and AI-assisted coding. In “Are Prototyping Tools Becoming Essential?,” Mark Wilcox argues that exploring ideas in the Animation Era requires a whole range of new tools. But if you are only going to follow one of these links, see Bret Victor’s “Learnable Programming.”

After months of web surfing I stumbled upon an interesting open source tool originally designed for generative artists that I’ve gotten somewhat hooked on. It combines reactive programming and a visual IDE with some of Bret Victor’s elegant scrubbing interactions.

More about that tool in my next blog post.

Four Weeks with Nothing on My Wrist

Wed, 2015-07-22 12:06

After wearing the Fitbit Surge for seven weeks, I developed an ugly skin rash. So, I took a break and let my skin breathe for a while.

I’m all better now, thanks for asking.

For most of the year, I’ve been test-driving various fitness bands and super watches and journaling my impressions here as one man’s research. After all, wearables are, and have been, a thing for a while now. So, I need to know as much as possible.

First came three weeks with the Nike+ Fuelband, then four with the Basis Peak, then four and a day with the Jawbone UP24, followed by seven with the Fitbit Surge.

If you’re scoring at home, that’s 18 weeks with something on my wrist, a lot for me after 23 years, give or take, of nothing on my wrist; I’m not really a watch guy.


Here’s a random picture of chairs congregating outside Building 200. Enjoy.

Physical bands aside, I was also tracking and quantifying myself: my fitness and general activity data and my sleep data. I’m a fan of the quantified self and better living through statistics and math. Looking at raw numbers forces introspection that can be very revealing, in good and bad ways.

If you read here, you’ll recall Thao (@thaobnguyen) and Ben both attended QS15, and Ben has an interest in quantified self devices, like Automatic. So, I’m not alone on the team.

Anyway, before I put on another device, I decided to capture the pros and cons of not wearing one, at least in terms of what was missing when I had a naked wrist.

The Pros

Not having something on my wrist all the time is pro enough. I generally don’t like encumbrances, and having my wrist free again is nice.

Typing on a keyboard is another plus. I still don’t know how people with watches do it. A guy I used to work with wore a watch, and his Macbook Pro showed the scratch damage it did to the unibody aluminum.

Being free of data collection is liberating, but it cuts both ways. On the plus side, I don’t obsess about my step count. Wearing a fitness tracker has made it painfully obvious that my life is dangerously sedentary.

If it weren’t for running on a treadmill, there are many days when I wouldn’t reach the 10,000 steps magic number.

Why is this a pro? Now that I know, I can adjust accordingly, without a tracker, and I have a general idea of how much activity generates 10,000 steps.

Taking a break from testing has given me time to reflect on the four devices I’ve used without being too close to the one I’m currently testing. When I finish this research experiment, I should take a similar break to reflect.

The Cons

On the downside, I really got used to having the time on my wrist, which is something I missed when I wore the Jawbone UP24 as well.

Even though I did find myself checking the time as a nervous habit, the utility outweighed the nervous tic.

I really miss the phone and text notifications that the two super watches, the Basis Peak and Fitbit Surge, provide.

On the data collection side, I find myself needing to be pushed by numbers. It’s weird, I know; I’ll recognize something that generates more activity, like walking vs. driving, but I need the extra push to do it.

I also miss my morning data review. It became routine for me to review my night’s sleep and browse through my data each morning, my own personal, daily report.

Now that Google has Your Timeline for Maps, you can begin to see the value of aggregating data summaries; yes, it’s creepy, especially the implications of kismet or whatever the opposite of that is, but I remain in the optimistic camp that hopes to correlate and improve based on personal data sets.

Anyway, figured since I’d been sharing my wearables observations, I might as well share my lack of wearable observations.

Sometime in the next few weeks, I’ll get started on a new one. Stay tuned.

That Time I Killed My Phone

Tue, 2015-07-21 15:19

I don’t particularly like protective cases for phones because they ruin the industrial design aesthetics of the device.

Here at the ‘Lab, we’ve had spirited debates about cases or not, dating back to the original team and continuing to our current team.

I am not careful with phones, and the death of my Nexus 5, which I’ve only had since October 2014, was my fault. It was also very, very bad luck.

I usually run with a Bluetooth headset, the Mpow Swift, which I quite like (hey Ultan, it’s green), specifically because I had a few instances where my hand caught the headset cord and pulled my phone off the treadmill deck and onto the belt, causing the phone to fly off the back like a missile.

Yes, that happened more than once, but in my defense, I’ve seen it happen to other people too.

However, on July 8, I was running on the treadmill talking to Tony on the phone, using a wired headset. I’ve found the Mpow doesn’t have a very strong microphone, or maybe I wasn’t aiming my voice in the right direction. Whatever the reason, the Mpow hasn’t been good for talking on the phone.

While talking to Tony, possibly mid-sentence, I caught the cord and pulled the phone off the deck.

Unlike the other times, this time, the phone slipped under the treadmill belt, trapping it between the belt and housing, sliding it the length of the belt, and dragging it over the back drum.

I stopped the treadmill and looked under, but it was trapped inside the machine. After sheepishly asking for help, we were able to get the machine to spit up my mangled phone.


Interestingly, the screen is completely intact, which gives an idea of how tough it really is. The phone’s body is sadly bent in a way that describes its journey over that drum. Luckily, its battery hasn’t leaked.

The device didn’t die right away. While it wouldn’t boot, when I connected it to my Mac via USB, it was recognized, although it wouldn’t mount the storage like it normally would. Something about the device consuming too much power for USB.

I tried with a powered USB hub, but I think the battery gave up the ghost.

Happily for me, I had recently bought a second generation Moto X on sale, and I’d been postponing the switch.

Unhappily, every time I switch phones, I lose something, even though I keep backups. When my Nexus 4 died mysteriously, I lost all my photos. This time, I lost my SMS/MMS history.

Like I said, I’m careless with phones.

The Pen is Mightier with the User’s Experience

Wed, 2015-07-15 10:04

If you’re involved in enterprise user experience (UX) it will come as no surprise that the humble pen and paper remains in widespread use for everyday business.

Sales reps, for example, are forever quickly scribbling down opportunity info. HR pros use them widely. Accountants? Check. At most meetings you will find both pen and paper and digital technology on the table.

That’s what UX is all about: understanding all the tools, technology, job aids, and the rest that the user touches along the journey to getting the task done.

Although Steve Jobs famously declared that the world didn’t need another stylus, innovation in digital styli, or digital pens (sometimes called smartpens), has never been greater.

Microsoft is innovating with the device, h/t @bubblebobble. Apple is ironically active with patents for styli, and the iPen may be close. Kickstarter boasts some great stylus ideas such as the Irish-designed Scriba (@getbscriba), featured in the Irish Times.

It is the tablet and the mobility of today’s work that has reinvigorated digital pen innovation, whether it’s the Apple iPad or Microsoft Surface.


Livescribe Echo smartpen and notebook

I’ve used digital pens, or smartpens, such as the Livescribe Echo for my UX work. The Echo is a great way to wireframe or create initial designs quickly and to communicate the ideas to others working remotely, using a pencast.


Livescribe Echo pencast viewed from the desktop

Personally, I feel there is a place for digital pens, but that the OG pen and paper still takes some beating when it comes to rapid innovation, iteration, and recall, as pondered on UX StackExchange.

An understanding of users demands that we not try to replace pen and paper altogether but enhance or augment their use, depending on the context. For example, using the Oracle Capture approach to transfer initial strokes and scribbles to the cloud for enhancement later.


You can read more about this in the free Oracle Applications Cloud User Experience Innovations and Trends eBook.

Sure, for some users, a funky new digital stylus will rock their world. For others, it won’t.

And we’ll all still lose the thing.

The pen is back? It’s never been away.

Cross-posted from Ültan’s Über Üsable Apps, thanks Ultan (@ultan).

On Oracle Corporate Citizenship

Tue, 2015-07-14 19:39

Yesterday, our entire organization, Oracle Applications User Experience (@usableapps) got a treat. We learned about Oracle’s corporate citizenship from Colleen Cassity, Executive Director of the Oracle Education Foundation (OEF).


I’m familiar with Oracle’s philanthropic endeavors, but only vaguely so. I’ve used the corporate giving match, but beyond that, this was all new information.

During her presentation, we learned about several of Oracle’s efforts, which I’m happy to share here, in video form.

First, there’s the OEF Wearable Technology Workshop for Girls, which several of our team members supported.

Next Colleen talked about Oracle’s efforts to support and promote the Raspberry Pi, which is near and dear to our hearts. We’ve done a lot of Raspi projects here. Expect that to continue.

Next up was Wecyclers, an excellent program to promote recycling in Nigeria.

And finally, we learned about Oracle’s 26-year-old, ongoing commitment to the Dian Fossey Gorilla Fund.

This was an eye-opening session for me. Other than the Wearable Technology Workshop for Girls, I hadn’t heard about Oracle’s involvement in these charitable causes, and I’m honored that we were able to help with one.

I hope we’ll be able to assist with similar, charitable events in the future.

Anyway, food for thought and possibly new information. Enjoy.

More Kscope15 Impressions

Wed, 2015-07-08 09:03

Kscope15 (#kscope15) was hosted at the Diplomat resort along beautiful Hollywood Beach, and the Scavenger Hunt from the OAUX AppsLab infused a hint of fun and excitement between the packed, busy, and serious sessions.


The Scavenger Hunt was quite a comprehensive system that let people win points in various ways and keep track of events, points, and a leaderboard. And of course, we had one Internet of Things (IoT) component that people could search for and tap to win points.

And here is the build, with a powerful battery connected to it, complete with an anti-theft feature, which is double-sided duct tape. :) Altogether, it is a stand-alone, self-contained, and definitely mobile computer.

Isn’t it cool? I overheard on multiple occasions people say it was the coolest thing at the conference.


One of the bartenders at the Community Night reception wanted to trade me the “best” drink of the night for my Raspberry Pi.

I leased it to him for two hours, and he gave me the drink. The fact is that I would have put the Raspberry Pi on his table anyway for the community night event, and he would have given me the drink anyway if I had known how to order it.


On the serious side, APEX (Oracle Application Express) had a good showing with many sessions. Considering our Scavenger Hunt web admin was built on APEX, I am interested in learning it too. After two hands-on sessions, I did feel that I’d use it for quick web apps in the future.

On the database side, the most significant development is ORDS (Oracle REST Data Services) and the ability to call a web endpoint from within the database. This opens up the possibility of monitoring data/state changes at the data level and triggering events into a web server, which in turn can trigger client reactions via WebSocket.
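
To sketch the web-server half of that pipeline (leaving the Oracle-side setup aside), a small Node.js server could expose an endpoint for the database to call and fan each event out to connected browsers over WebSocket. Express, the ws package, and the /notify route are my assumptions here:

```js
// The database (or ORDS) POSTs a change event to /notify; the server
// broadcasts it to every connected WebSocket client.
var express = require('express');
var WebSocket = require('ws');

var wss = new WebSocket.Server({ port: 8081 });

var app = express();
app.use(express.json());

app.post('/notify', function (req, res) {
  var event = JSON.stringify(req.body);
  wss.clients.forEach(function (client) {
    if (client.readyState === WebSocket.OPEN) {
      client.send(event);
    }
  });
  res.sendStatus(204);
});

app.listen(8080);
```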

Again, Kscope15 was a very fruitful event for us, as we demonstrated the Scavenger Hunt game and provoked lots of interest. It has potential for large events and enterprise applications, so stay tuned while we put some new twists on it in the future.

Editor’s note: Raymond (@yuhuaxie) forgot to mention how much fun he had at Kscope15. Pics because it happened:


ODTUG (@odtug) commissioned a short film, which was shot, edited and produced during the week that was Kscope15. It debuted during the Closing Session, and they have graciously shared it on YouTube. It’s 10 minutes, but very good at capturing what I like about Kscope so much.

Noel appears to talk about the Scavenger Hunt at 7:29. Watch it here.

Possibly Related Posts:

Lots of OAUX Updates

Tue, 2015-07-07 11:35

While I spent June wrapping up conference season at OHUG and Kscope, Ultan (@ultan), Misha (@mishavaughan) and company (@usableapps) have been busily publishing content.

This here is a wrap-up of that content, but let’s be honest. If you like OAUX content, you really should follow the official blogs of OAUX: Usable Apps in the Cloud, VoX, and user experience assistance: cloud design & development.

Oh and follow @usableapps too. That’s done, so let’s get recapping.

Strategy Anyone?

Over on VoX, you can read all about Oracle’s Cloud Application user experience strategy in three short posts.

In the first part, read about how we apply Simplicity, Mobility, Extensibility to Cloud Applications. In part two, read about big-picture innovation and how it drives our Glance, Scan, Commit design philosophy. Finally, in the big finish, read about how we apply all this to designing and building experiences for our cloud users.

As a bonus, our team, our projects and our strategic approach to emerging technologies are mentioned in each post. So, yay us.


More Apple Watch

You’ve read our takes, and Ultan’s, on the Apple Watch, and now our GVP, Jeremy Ashley (@jrwashley) has shared his impressions. Good stuff in there, check it out if you’re looking for reasons to buy a smartwatch.


Not convinced of the value? Longtime friend of the ‘Lab, David Haimes (@dhaimes) might have what you need to go from cynic to believer.

We Heart APIs

Channeling his inner Mark (@mvilrokx), Ultan has a two-minute tech tip for Bob Rhubart of OTN (@OTNArchbeat) about APIs, how valuable they are, and how good ones make all the difference.

We love us some APIs, especially the good ones. Developers are users too.

Speaking of APIs and developers, check out two videos that tie developer use cases with PaaS4SaaS.

Big Finish, ERP Cloud and Cake

And finally, let’s finish with some ERP Cloud goodness, a post on UX, ROI and cake and a post on cake starring David.


Told you they’ve been busy.

Kscope15 Impressions

Mon, 2015-07-06 10:34

As per Jake’s post, we got to spend a few days in Florida to support the Scavenger Hunt that we created for the Kscope15 conference. Since it ran pretty smoothly, we were able to attend a few sessions and mingle with the attendees and speakers. Here are my impressions of the event.


This was my first time at Kscope. Jake hyped it up as a not-to-miss conference for Oracle developers, and despite my high expectations of the event, it did not disappoint. The actual conference started Sunday, but we arrived Saturday to set up everything for the Scavenger Hunt, dot a few i’s and cross some t’s.

We also ran a quick training session for the organizers helping with the administration of the Scavenger Hunt, and later that night we started actually registering players for the hunt. We signed up about 100 people on the first evening. Registration continued Sunday morning, and we picked up about 50 more players for a grand total of 150, not bad for our first Scavenger Hunt.


The number of sessions was a bit overwhelming, so I decided to focus on the Database Development and Application Express tracks and picked a few sessions from them. The first one I attended was called “JSON and Oracle: A Powerful Combination,” where Dan McGhan (@dmcghan) from Oracle explained how to produce JSON from data in an Oracle Database, how to consume JSON in the Oracle Database, and even how to use it in queries.

It turns out that Oracle has some new, really cool features for working with JSON, so be sure to check those out. Interestingly, our Scavenger Hunt backend uses some of these techniques, and we got some great tips from Dan on how to improve what we were doing. So thanks for that, Dan!

Next I went to “A Primer on Web Components in APEX,” presented by my countryman Dimitri Gielis (@dgielis). In this session, Dimitri demonstrated how you can easily integrate Web Components into an APEX application. He showed an impressive demo of a camera component that took a picture right from the web application and stored it in the database. He also demoed a component that integrated voice control into an APEX application; this allowed him to “ask” the database for a row, and it would retrieve that row and show it on the screen. Very cool stuff.

That night also featured the infamous “APEX Open Mic,” where anybody can walk up to the mic and get five minutes to show off what they’ve built with APEX. No judging, no winners or losers, just sharing with the community, and I must say, some really impressive applications were shown, not least one by Ed Jones (@edhjones) from Oracle, who managed to create a Minecraft-like game based on Oracle Social Network (OSN) data where treasure chests in the game represent OSN conversations. Opening the chest opens the conversation in OSN. Be sure to check out his video!

The next day, I attended two more sessions, one by our very own Noel Portugal (@noelportugal) and our Group Vice President, Jeremy Ashley (@jrwashley). I am sure they will tell you all about it through this channel or another, so I am leaving that one for them.


The other session was called “An Introduction to JavaScript Apps on the Oracle Database,” presented by Dan McGhan. Dan demonstrated how you can use Node.js to enhance your APEX application with, among other things, WebSocket functionality, something not natively offered by APEX. Here I also learned that Oracle 12c has a feature that allows you to “listen” for particular changes in the database and then broadcast these changes to interested parties (Node.js and then WebSockets in this case). This is for sure something that we are going to be using in the future in some of our demos.

The third day was hands-on day, and I attended two more sessions: first “Intro to Oracle REST Data Services” by Kris Rice (@krisrice) from Oracle, and then “Soup-to-Nuts of Building APEX Applications” by David Peake (@orcl_dpeake) from Oracle.

In the first one, we were introduced to ORDS, a feature of the Oracle DB that allows you to create REST services straight on top of the database, no middle tier required! I’ve seen this before in MySQL, but I did not know you could also do this in an Oracle DB. Again, this is a super powerful feature that we will be using for sure in future projects.

The second, two-hour session was a walk-through of a full-fledged APEX application from start to finish by the always entertaining David Peake. I must admit that by that time I was pretty much done, and I left the session halfway through building my application. However, Raymond (@yuhuaxie) managed to sit through the whole thing, so maybe he can give some comments on this session.

All I can say is that APEX 5.0 was extremely easy to get started with and build a nice web application with.

And that was Kscope15 in a nutshell for me. It was an awesome, exhausting experience, and I hope I can be there again in 2016.


Mark.

Possibly Related Posts:

Seven Weeks with the Fitbit Surge

Thu, 2015-07-02 14:35

As my wearables odyssey continues, it’s time to document my time with the Fitbit Surge.

I ended up wearing the Surge a lot longer than I’d worn the Nike+ Fuelband, the Basis Peak and the Jawbone UP24 because June was a busy month, and I didn’t have time to switch.

For comparison’s sake, I suggest you read Ultan’s (@ultan) review of the Surge. He’s a hardcore fitness dude, and I’m much more a have-to-don’t-like-to exercise guy, which makes for a nice companion read.

As usual, this isn’t a review, more loosely-coupled observations. You can find lots of credible reviews of the Surge, billed as a “Super Watch” by the recently IPO’ed Fitbit, e.g. this one from Engadget.

Here we go.

The watch

As with most of the other wearables I’ve used, the Surge must be set up with software installed on a computer. It also requires a weird USB doohickey for pairing, after which the watch firmware updates.


I get why they provide ways for people to sync to software installed on computers, but I wonder how many users really eschew the smartphone app or don’t have a smartphone.

Anyway, although Fitbit Connect, the software you have to install, says the firmware update will take five to ten minutes, mine took much longer, more like 30 minutes.

Physically, the Surge is chunky. Its shape reminds me of a doorstop, like a wedge. While this looks weird, it’s really a nice design idea, essentially tilting the display toward the user and making it easier to read at a glance.


I found wearing the device to be comfortable, although the rubber of the band did make my skin clammy after a while, see the Epilogue for more on that.

The display is easy to read in any light, and the backlight comes on automatically in low light conditions.

The Surge carries a water-resistance rating of 5 ATM, good to a depth of 50 meters, but for some reason, Fitbit advises against submerging it. Weird, right?

Not one to follow directions, I took the Surge in a pool with no ill effects. However, once or twice during my post-workout steam, the display did show some condensation under the glass. So, who knows?

The device interface is a combination of touches and three physical buttons, all easy to learn through quick experimentation.

The watch screens show the day’s activity in steps, calories burned, miles, and floors climbed. It also tracks heart rate via an optical heart rate sensor.

In addition, you can start specific activity tracking from the device including outdoor running with GPS tracking, which Ultan used quite a lot, and from what I’ve read, is the Surge’s money feature. I only run indoors on a treadmill (lame), so I didn’t test this feature.

The Surge does have a treadmill activity, but I found its mileage calculation varied from the treadmill’s, e.g. 3.30 miles on the treadmill equated to 2.54 on the Surge. Not a big deal to me, especially given how difficult tracking mileage would be for a device to get right through sensors.

Speaking of, the Surge packs a nice array of sensors. In addition to the aforementioned GPS and optical heart rate sensor, it also sports a 3-axis accelerometer and a 3-axis gyroscope.

The Surge tracks sleep automatically, although I’m not sure how. Seemed to be magically accurate though.

Fitbit advertises the Surge’s battery life as seven days, but in practice, I only got about four or five days per charge. Luckily, Fitbit will inform you when the battery gets low via app notifications and email, both of which are nice.

Happily, the battery charges very quickly, albeit via a proprietary charging cord. Lose that cord, and you’re toast. I misplaced mine, which effectively ended this experiment.

The app and data

As Ultan mentioned in his post, the Fitbit Aria wifi scale makes using any Fitbit device better. I’ve had an Aria for a few years, but never really used it. So, this was a great chance to try it with the Surge.

Fitbit provides both mobile and web apps to track data.

I mostly used the mobile app which shows a daily view of activity, weight and food consumption, if you choose to track that manually. Tapping any item shows you details, and you can swipe between days.


It’s all very well-done, easy to use, and they do a nice job of packing a lot of information into a small screen.

From within the app, you can set up phone notifications for texts and calls, a feature I really liked from wearing the Basis Peak.

Noel, send me a text message.

Unfortunately, I only got notified about half the time, not ideal, and I’m not the only one with this issue. Danny Bryant (@dbcapoeira) and I chatted about our Surge experiences at Kscope, and he mentioned this as an issue for him as well.

Fitbit offers Challenges to encourage social fitness competition, which seems nice, but it’s not for me. There are badges for milestones too, like walking 500 miles, climbing 500 floors, etc. Nice.

Sleep tracking on the mobile app is pretty basic, showing number of times awake and number of times restless.

Fitbit’s web app is a dashboard showing the same information in a larger format. They hide some key insights in the Log section, e.g. the sleep data in there is more detailed than what the dashboard shows.

Fitbit Dashboard

Fitbit Log: Track My Sleep

Fitbit Log: Track My Activities

I have to say I prefer the Jawbone approach to viewing data; they only have a mobile app which dictates the entire experience and keeps it focused.

Fitbit sends weekly summary emails too, so yet another way to view your data. I like the emails, especially the fun data point about my average time to fall asleep for the week, usually zero minutes. I guess this particular week I was well-rested.


I did have some time zone issues when I went to Florida. The watch didn’t update automatically, so I did some digging and found a help article about traveling with your Fitbit that includes this tip:

Loss of data can occur if the “Set Automatically” timezone option in the app’s “Settings” is on. Toggle the “Set Automatically” timezone option to off.

So for the entire week in Hollywood, my watch was three hours slow, not a good look for a watch.

And finally, data export out of Fitbit’s ecosystem is available, at a cost. Export is a premium feature. “Your data belongs to you!” for $50 a year. Some consolation though: they offer a free one-week trial, so I grabbed my data for free, at least this time.

Overall, the Surge compares favorably to the Basis Peak, but unlike the Jawbone UP24, I didn’t feel sad when the experiment ended.


Epilogue

Perhaps you’ll recall that Fitbit’s newer devices have been causing rashes for some users. I’m one of those users. I’m reporting this because it happened, not as an indictment of the device.

I wore the Surge for seven weeks, pretty much all the time. When I took it off to end the experiment, my wife noticed a nasty red spot on the outer side of my arm. I hadn’t seen it, and I probably would never have noticed.


It doesn’t itch or anything, just looks gnarly. After two days, it seems to be resolving, no harm, no foul.

The rash doesn’t really affect how I view the device, although if I wear the Surge again, I’ll remember to give my skin a break periodically.

One unexpected side effect of not wearing a device as the rash clears up is that unquantified days feel weird. I wonder why I do things if they’re not being quantified. Being healthy for its own sake isn’t enough. I need that extra dopamine from achieving something quantifiable.

Strange, right?

Find the comments.

Possibly Related Posts:

The Week That Was Kscope15

Tue, 2015-06-30 10:11

Noel (@noelportugal), Raymond (@yuhuaxie), Mark (@mvilrokx) and I traveled to sunny Hollywood, Florida last week to attend Kscope15 (#kscope15), the annual conference of the Oracle Development Tools User Group (@odtug).

Check out some highlights of our week.


If you read here, you probably know that this year, Noel had cooked up something new and different for the conference, a scavenger hunt.

This year was my fourth Kscope, and as we have in past years, we planned to do something fun. At the end of Kscope14, Monty Latiolais (@monty_odtug), the President of the ODTUG Board of Directors, approached us to collaborate on something cool for Kscope15.

We didn’t know what exactly, but we all wanted to do something new, something fun, something befitting of Kscope, which is always a great conference. So, we spent the next few months chatting with Crystal (@crystal_walton), Lauren (@lprezby) and Danny (@dbcapoeira) intermittently, developing ideas.

We eventually settled on a scavenger hunt, which would allow attendees to experience all the best parts of the conference, almost like a guided tour.

Once we had a list of tasks, Noel developed the game, and with Mark and Raymond pitching in, they built it over the course of a few months. Tasks were completed in one of three ways: by checking in at a Raspberry Pi station via NFC, by staff confirmation, or by tweeting a picture or video with the right hashtags.
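
The tweet-based tasks are simple enough to sketch. Here's a toy version of the kind of hashtag check a hunt backend might run on incoming tweets; this is not our actual code, the tweet shape follows Twitter's classic API, and the hashtags are invented.

```javascript
// Toy sketch of a hashtag check for tweet-based tasks (not our actual code).
// The tweet shape follows Twitter's classic API ('entities.hashtags'); the
// task hashtags are invented.
function completesTask(tweet, requiredTags) {
  const tags = (tweet.entities?.hashtags ?? []).map((h) => h.text.toLowerCase());
  return requiredTags.every((t) => tags.includes(t.toLowerCase()));
}

const tweet = {
  text: 'Found the booth! #kscope15 #huntTask3',
  entities: { hashtags: [{ text: 'kscope15' }, { text: 'huntTask3' }] }
};

console.log(completesTask(tweet, ['kscope15', 'huntTask3'])); // true
```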

We arrived in Hollywood unsure of how many players we’d get. We didn’t do much promotion in advance, and we decided to limit the game to 500 players to ensure it didn’t get too crazy.

Over the first few days, we registered nearly 150 players, and of them, about 100 completed at least one task, both well above my conservative expectations.

During the conference, we had a core of about 10-20 dedicated players who made the game fun to watch. They jockeyed back and forth in the top spots, trolling each other on Twitter and holding off on completing tasks to give the other players fleeting hope.

In the end, we had a tie that we had to break at the conference’s closing session. Here are the final standings:


Congratulations winners, and thank you to everyone who played for making the game a success.

And finally an enormous thank you to ODTUG and the Kscope15 organizers for allowing us this opportunity. We’re already noodling ways to improve the game for Kscope16 in Chicago.

Stay tuned for other Kscope15 posts.

Possibly Related Posts:

QS15: Measurement with Meaning

Sun, 2015-06-28 10:02

Walking into something as a newcomer is always an adventure of reality interacting with expectations. Though I wasn’t quite sure what to expect at the Quantified Self conference, it wasn’t what I expected. But in a good way.

Tweet-painting robot at QS15

The conference was structured around three main activities: talks given on the main stage, breakout sessions, which took place at different smaller areas during the talks, and break times, where one might check out the vendors, grab a snack, or chat with fellow attendees.

The talks, about ten minutes each, were mostly about the speaker’s successes in changing some aspect of their life via quantifying and analyzing it. This is partly what I wasn’t expecting—the goal-focused and very positive nature of (most) everyone’s projects.

True, some of the presenters might be tallied on the obsessive side of the spectrum, but by and large, it was all about improving your life, and not recording everything as a method of self-preservation.

On this last point, one presenter even provided this quote from Nabokov, which generated a touch of controversy: “the collecting of daily details … is always a poor method of self-preservation.”

One important theme I saw, however, is the role of measuring itself—that the very act of quantifying your behaviors, whether it’s diet, exercise, TV watching, or your productivity, can change your behavior for the better.

Granted, there can also be profound personal insights from analyzing the data, especially when combining multiple sources, but it’s possible some of these benefits come from simply tracking. Especially when it’s done manually, which takes a great deal of persistence, with many people petering out after a few weeks at the most.

This presents an interesting question about technology’s increasing proficiency at passive tracking, and the aim to provide insights automatically. For instance, the Jawbone UP platform’s Smart Coach is supposed to look at your exercise and activity data together with your sleep data and give you advice about how to get better sleep.

If someone had tracked this manually, and done the analysis themselves, they may not only be a lot more familiar with the facts about their own sleep and exercise, but any insights derived might be more likely to be absorbed and translate to genuine change.

When insights are automatically provided, will they lead to just as much adoption?

Probably not, but they could reach a lot more people who may not be able to keep up with measuring. So it’s probably still a good thing in the end.

The other important theme was something that I’ve also been encountering in other areas of my work—the importance of good questions.

For most of the QS projects, this took the form of achieving a personal goal, but sometimes it was simply a specific inquiry into a realm of one’s life. Just looking at data can be interesting, but without a good question motivating an analysis, it’s often not very useful.

In the worst case, you can find spurious connections and correlations within a large set of data that may get you off in the wrong direction.

And while at the beginning of the conference it was made clear that QS15 was not a tech conference, there was plenty of cool technology in the main hall to check out and discuss.

There are too many to cover in much detail, but here are a few that intrigued me:

  • Spire, a breath tracking device that says it can measure focus by analyzing your breathing pattern. If someone is interested in examining their productivity, this could be a promising device to check out. Also, it can let you know when you need a deep breath, which has various physiological and emotional benefits.
  • Faurecia manufactures seats for automobiles, and they were showing off a prototype that uses piezoelectric bands within the chair itself to measure heart rate and breathing patterns. This is great because it can do this through your clothing, and detect when you’re falling asleep, and possibly institute some countermeasures. The data could also sync up with your phone, say through Apple’s Healthkit, if you want to add it to your logs.
  • Oura is an activity and sleep tracker that uses a ring form factor, which for some people may be easier to sleep with than a wrist band. Their focus is on sleep and measuring how restorative your rest is. I look forward to seeing how this one develops.

The conference had a lot to offer—some inspiration, some cool technologies, surprisingly good lunches, and quite a bit to think about.

Possibly Related Posts: