
Oracle AppsLab

Driving Innovation

Another Take on Maker Faire 2015

Wed, 2015-05-20 09:05

Editor’s note: Here’s another Maker Faire 2015 post, this one from Raymond. Check out Mark’s (@mvilrokx) recap too for AppsLab completeness.

I went to the Maker Faire 2015 Bay Area show over the weekend. Much of it was similar to last year, but there were a few new things.

In place of our spot last year were HP Sprout demo stations. I guess HP was the main sponsor this year.

hp-sprout

Sprout is an HP product that pairs a large touch mat and a projector as an attachment to an HP computer. It is a kind of combination of projector, extended screen, touch screen, and work surface that blends physical things with virtual computer objects, such as capturing real objects as 3D graphics.

TechHive’s Mole-A-Whack is quite a good station too; it is a reversal of the classic Whack-A-Mole.

mole-a-whack

Here’s a video of it in action:

They use an Arduino-controlled mole to whack kids who hide in the mole holes; to earn points, the kids need to raise their heads out of the hole covers (which are Arduino-monitored) and reach out to push a button (connected to a MaKey MaKey).

The signals go into a Scratch program on a computer that tallies up the winner.
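The tally logic itself is simple enough; here's a rough sketch of what the Scratch program presumably does (the player ids and scoring rule are made up for illustration):

```javascript
// Count one point per button press and pick the player with the most points.
// (Assumed logic: each MaKey MaKey press arrives tagged with a player id.)
function tally(presses) {
  const scores = {};
  for (const player of presses) {
    scores[player] = (scores[player] || 0) + 1;
  }
  let winner = null;
  for (const [player, score] of Object.entries(scores)) {
    if (winner === null || score > scores[winner]) {
      winner = player;
    }
  }
  return { scores, winner };
}

console.log(tally(['kid1', 'kid2', 'kid1']).winner); // kid1, with 2 points
```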

This pipe organ is an impressive build:

fire-pipe-organ

As usual, lots of 3D printers, CNC mills, etc. and lots of drones flying.

Also, I saw many college groups attending the event this year, bringing in all kinds of small builds for various applications.

Maker Faire 2015

Tue, 2015-05-19 09:17

This weekend the 10th Annual Maker Faire Bay Area took place in my backyard, and rather than fighting traffic for 2 days with the 130,000+ attendees, I decided, as I have for the last 9 years, to join them.

Unlike last year, Oracle had no presence at the Maker Faire itself, so I had plenty of time to walk around the grounds and attend sessions.  This post is an overview of what I saw and experienced in the 2 day madness that is called the Maker Faire.

For those of you who have never been to the Maker Faire, the easiest way to describe it is as a mix of Burning Man and a completely out-of-control hobbyist’s garage, where the hobbyist’s hobbies include, but are not limited to: everything tech related, everything food related, everything engineering related and everything art related, all wrapped up in a family-friendly atmosphere. My kids love the Maker Faire.

You can find the tech giants of the world next to the one-person startup, beer brewers next to crazy knitting contraptions, bus-sized, fire-breathing rhinos next to giant cardboard robots, etc.  And nobody takes themselves too seriously, e.g. Google was handing out Google Glasses to everybody … Google Safety Glasses, that is :-)

Google Safety Goggles

My new Google Glasses :-)

The first thing I noticed was that the Faire had expanded . . . again.  A huge tent housing the Make:Labs was erected on what was a parking lot last year. I didn’t actually get to spend any time in there, but it contained an exploratorium, startup stuff and a section for Young Makers.

Which brings me to the first trend I observed: makers are getting younger and younger, and the faire is doubling down on these young folks.

Don’t get me wrong, the faire has always attracted young kids, and some of them were making stuff, but there seem to be more and more of them, the projects they bring are getting more and more impressive, and the faire’s expansions all seem to cater to these younger makers.

One of the sessions I attended was called “Meet Some Amazing Young Makers,” where a 14-year-old girl showed off a semi-autonomous robot that could map the inside of caves.  She was showing us the second iteration; she built the first version . . . when she was 8!  Another young man, 13, built a contraption that solved a Rubik’s cube in under 90 seconds.  It wasn’t just that they built these things; they gave solid presentations to a mostly adult audience, talking about their builds and future plans.

Another trend that was hard to ignore is that the Internet of Things (IoT) is getting huge, and it’s definitely here to stay.  There weren’t just many, many vendors promoting their brand of IoT hardware; a whole ecosystem is developing around them.

From tools that let you visualize all the data collected by your “things” to remote configuration and customization.  This trend will not just Cross the Chasm, it’s going to rocket right past it.

I attended a panel discussion with Dominic Pajak (Director IoT Segments, ARM), Paul Rothman (Director of R&D at littleBits Electronics), Andrew Witte (CTO, Pebble), Alasdair Allan (scientist, tinkerer) and Pierre Roux (Atmel) about the current state of IoT and the challenges that lay ahead.

One of the interesting points raised during the discussions is that there currently is no such thing as the Internet of Things!  All these “things” have to be tethered to a phone or other internet capable device (typically using BLE), they cannot connect to the internet directly.

Furthermore, they cannot communicate with each other directly.  So it’s not really an IoT, rather the regular “human internet” with regular computers/phones connecting to it, which in turn happen to have some sensors attached to them that use the internet as a communication vehicle, but that doesn’t really roll off the tongue that well.

There is no interoperability standard at the moment, so you can’t really have one device talk to a random other device.  This is one of the challenges the panel felt has to be solved in the short term.  This could happen with the adoption of IP in BLE or some other mechanism like Fog Computing.

Another challenge brought up was securing IoT devices, especially given that some of the devices could be broadcasting extremely personal information.  This will have to be solved at the manufacturing level as well as at the application level.

Finally, they also mentioned that lowering power consumption needs to be a top priority for these devices.  Even though they have already come a long way, there still is a lot of work to be done.  The ultimate goal would be self-sufficient devices that need no external power at all but can harvest the energy they need from their environment.

One such example mentioned is a button/switch that, when pressed, uses the energy you put into pressing it to generate enough power to send an on/off signal to another device.

Massimo Banzi, co-founder of the Arduino Project, also gave a talk (as he does every year) about the State of Arduino.  It seems that a lot of that state is in legal limbo at the moment, as there are now seemingly two Arduino companies (arduino.cc and arduino.org) with different views of the future of the project.

As part of his vision, Massimo introduced a partnership with Adafruit to let them produce Arduinos in the USA.  Also, as a result of the legal issues with the Arduino brand name, he introduced a new “sister” brand called Genuino (Get it? Genuine Arduino), which will allow them to keep producing, at least in the US.

Other announcements included the release of the Arduino Gemma, the smallest Arduino ever; the Modulino, an Arduino-like product designed and produced in their Bangalore, India, office; and a focus on online tools to manage and program Arduinos.

I also attended a few sessions that talked about the BeagleBone board.  I am interested in this board because it bridges the gap between the Raspberry Pi and the Arduino: on the one hand it runs a Linux OS, but on the other hand it also has real-time GPIO pins, making it interesting for IoT projects that require them.

It can also be easily programmed using JavaScript (it comes with a node server built in), which is something I am currently working with. I’ll probably write up another blog post about my findings with that board when I get some time to play with it (yes, I got one at the Maker Faire :-).
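For a taste of that style of programming, here's a minimal stand-in sketch. The real library on the board is BoneScript, whose API this only loosely mimics; the pin handling below is a stub rather than actual GPIO access, and the pin names are illustrative.

```javascript
// Stand-in stub mimicking the BoneScript call shape (the real library
// talks to actual GPIO hardware on the BeagleBone).
const pins = {};

function pinMode(pin, mode) {
  pins[pin] = { mode, value: 0 };
}

function digitalWrite(pin, value) {
  pins[pin].value = value;
}

function digitalRead(pin) {
  return pins[pin].value;
}

// Toggle an LED-style pin, the "hello world" of boards like this.
pinMode('USR3', 'OUTPUT');
digitalWrite('USR3', 1);
console.log(digitalRead('USR3')); // 1
```

The appeal is that the same event-driven JavaScript you'd write for a node server drives the hardware directly.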

And finally, some other things you can find at the Maker Faire:

Game of Drones:

Fire and Art:


Robots that solve Rubik’s cubes:

Cheers,

Mark.

Design Time @ Run Time: Apple Watch Put Through Its Paces in Beijing

Mon, 2015-05-18 10:50

Observations on UX research and road-testing wearable tech in the wild. The vehicle for today’s message is Ultan O’Broin (@usableapps), taking advantage of Oracle Applications User Experience events and outreach to evaluate the fitness and health option on the Apple Watch—and to continue his Fitbit Surge exploration—this time in China.

Emirates Apple Watch app used during the OAUX Asia trip.

The Watch Ethnography (say what?)

All the warnings about running in Beijing proved wrong: that my clothes would turn black; my skin would turn grey; I’d need a facemask; I wouldn’t see any other runners; I’d attract the attention of security personnel with my blue hair.

None of this happened.

I shoulda guessed. Running is one of the most “unasked-for-advice” activities out there, usually from non-runners or “joggers.”

Instead, I saw lots of other runners in Beijing’s parks and streets, mostly locals, with a small number of “ex-pats.” At times there were so many runners—and power walkers—early in the morning that I had to weave hard to get by them. On the long, straight streets of Beijing, I saw hardcore runners in action, percentage-wise more than, say, in Dublin.

Running in Beijing. Scene from Temple of Sun Park.

I saw lots of runners sporting colorful running gear; more than I’ve seen in San Francisco, though the styling was far short of the effortless funky co-ordination of the lemons, oranges, and blacks of the Nordic scene. Yes, I’m a running fashion snob. It was kinda hard to tell what fitness devices the Beijing crowd was packing, but I did see some Garmins: a sure sign of serious runners.

I did one run to the Forbidden City and Tiananmen Square, a 10-miler, and hauled myself around the Central Business District and diplomatic zones on other days. The eyes of Chinese security guards swiveled to follow me as I strode by, but generally they seemed unfazed by my blue hair and obvious Apple Watch. I was kinda disappointed I didn’t end up on CNN.

Running to the Forbidden City. Alas, selfie sticks were not forbidden.

The best time to run in Beijing is clearly in the early morning. Public parks were open by 5:30 AM and full of runners and walkers by the time I arrived. There is very bad air pollution in Beijing, but growing up in pre-smokeless-coal-carbon-fuel-ban Dublin, it really didn’t seem that menacing. However, I did detect a markedly poorer air quality later in the day. Your mileage may vary on that one, I guess.

The Device Findings

These runs in Beijing were another opportunity to test out the Fitbit Surge but really to try out the newer Apple Watch in another location. There are other comparisons between these two devices.

Both performed flawlessly, though I preferred the superior build quality of the Apple Watch, which is outstanding, and its UX with configurable glances display and superior styling. Henry Ford’s “Any Color As Long As It’s Black” as applied to smartwatches and fitness bands is #fashtech #fail by this stage.

Again, I was particularly impressed with the rapid GPS acquisition and holding capability of the Surge. I’ve used it on three continents now, and I love its robustness and long battery life.

Fitbit Surge GPS recording from Tiananmen Square run (on iOS)

The Apple Watch’s built-in Workout app proved easy to use for my runs. It has indoor and outdoor options for other activities too, whether with target metrics, distance, time, or calories, or you can use it for an “open” hustle. I was a little disappointed that the watch app doesn’t enable wearers to recall more basic run details from the last activity, but being able to see real-time progress was great. I also enjoyed the Apple Watch’s built-in Activity app. Its simple and colorful progress analytics for exercise, moving, and standing were fun to glance at throughout the day, though the data is not for any serious runners or QS fanbois out there.

Using both of these Apple Watch apps together provided a compelling health and fitness experience.

Apple Watch Activity App

Being able to use both devices without carrying a smartphone with me on a run was the real UX joy. Being freed from dodgy Bluetooth pairing and GPS signal worries, and that tricky music selection procedure required by a smartphone, saved me 5 mins (about three-quarters of a mile at my speeds) at the start of each run. Being able to see my performance in real time—on the go—without having to fish out a smartphone, was awesome.

That’s what a smartwatch glance UX is all about: being kept in the moment.

The battery life of the Apple Watch didn’t stretch past 10 hours on days I ran, though without that kind of exertion it seemed to last most of my waking day, which is reasonable.

What’s Next?

I normally carry a smartphone when running as my music platform, but increasingly to take Instagram images during my journey. The Strava app GPS integration with Instagram is a fave running experience. I did carry my Apple iPhone 5 in Beijing, to take pictures—no, I don’t really carry a selfie stick—and to try out the Strava app for comparison. The Instagram integration seemed to be DOA though.

So, my thoughts on wearable tech super watch evolution, and the emergence of the standalone wearable device as the way to go for smartwatches, were reinforced from my Beijing experience.

However, a super watch UX needs to be flexible and offer more capability. I’d like to see onboard music and image capture capability on the watches themselves somehow. Audio notifications for time, speed, distance and geographic points would also enhance the experience immensely. However, what such enhancements would mean for the bane of wearable tech UX right now—battery life—let alone device size, remains just another challenge to be solved. And it will be.

And what UX research methodology lessons might be gleaned from running in Beijing with wearable tech? Firstly, don’t assume anything about your ethnographic experience upfront. Try it yourself on a dry run first to iron out any possible kinks. Run at different times of the day, over different distances and routes, in varying weather conditions, and, of course, with different devices along the way. Most importantly, find real native runners to follow around, and record what they do from start to finish, what they do offline as well as online, and with what tools, on their runs.

Running, just like user experience, is about the complete journey, a total contextual experience, not just where your rubber meets the road.

Amazon Echo Official SDK

Sun, 2015-05-17 16:03

Image from wired.com

Back in February I was invited to participate in a pre-beta release of the Amazon Echo SDK. I was under NDA, so I couldn’t share any of my findings here. But now that the NDA has expired, I can share some of the integrations I did with this interesting device.

First of all, I want to comment on the fact that none of the OS-level voice assistants on the market quite get it right when it comes to interacting with third-party integrations. Let me explain: neither Google Now, Siri, nor Amazon Echo will let you interact with a voice “app” unless you “open” or “start” that app first. For example, to start an app in any of the OSes mentioned above, I have to do the following:

“[Ok Google], [Hey Siri], or [Alexa] open [name of application]”…”close” or “exit” [name of application]

Then I can start interacting with that application. This interaction paradigm belongs to a desktop model, where you are used to opening and closing programs. Furthermore, these actions are not even part of the mobile experience.

My proposed solution to fix this problem would be for these systems to create an “intent” model where a user could decide what to do with certain defined utterances. For example:

“[Ok Google], [Hey Siri], or [Alexa] do I have any new mail?”

In this case, the user should have the option to decide which application will be the default to handle “mail,” either through settings or on the program’s first run.

When you install an app for the first time, the system should ask:

“Would you like to use this app to handle your voice command for mail?”
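A minimal sketch of what such an intent model could look like (the app names, utterances, and patterns here are all hypothetical, not any real assistant's API):

```javascript
// Map each intent to the user's chosen default handler app.
const defaultHandlers = { mail: 'AcmeMail', weather: 'SkyCast' };

// Defined utterance patterns, checked in order.
const intentPatterns = [
  { intent: 'mail', pattern: /\bnew mail\b/i },
  { intent: 'weather', pattern: /\bweather\b/i },
];

// Route a raw utterance to an intent and its default handler app,
// or null so the assistant can fall back to built-in behavior.
function route(utterance) {
  for (const { intent, pattern } of intentPatterns) {
    if (pattern.test(utterance)) {
      return { intent, app: defaultHandlers[intent] || null };
    }
  }
  return null;
}

console.log(route('Alexa, do I have any new mail?'));
// → { intent: 'mail', app: 'AcmeMail' }
```

The key point is that the user, not the assistant vendor, decides which app answers each intent.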

Voice as the next user interface

Voice recognition and natural language processing (NLP) algorithms have advanced exponentially. These systems are getting truly ready for primetime. The use cases are only limited by our futuristic view of interacting with our systems with just our voice.

This is where the Amazon Echo shines. The idea of picking up my phone and commanding it with my voice feels unnatural to me.  The Amazon Echo just sits there on my desk and is always ready for my commands. One could argue that Google Now and Siri could do the same, but their lack of the Echo’s rich sound presence and visual cues (the RGB ring around the top) is enough to give the Echo the better experience.

Demos

Without further ado, here are two demos of service integrations I did with the Echo.  I used Temboo libraries for the Facebook, Twitter and Uber integrations; for IMAP mail, iCal and Philips Hue, I created my own. All of this, of course, was done in Java.

Office Automation

Internet of Things demo

So, would you get an Amazon Echo?

Four Weeks and a Day with the Jawbone UP24

Wed, 2015-05-13 12:46

After three weeks with the Nike+ Fuelband and four weeks with the Basis Peak, I moved on to the Jawbone UP24.

The UP24 has been out for quite a while now. Back in January 2014, Noel (@noelportugal) and Luis (@lsgaleana) did a cursory evaluation, and not much has changed in the Jawbone lineup since then.

At least, not until recently when the new hotness arrived, the UP2, UP3 and soon, the UP4, pushing the venerable UP24 into retirement. Honestly, I would have bought one of the new ones (because shiny objects), but they had yet to be released when I embarked on this journey of wearables discovery.

After starting out with a fitness band and moving to a super watch, going back to the comparatively feature-poor UP24 was a bit shocking initially. I had just become accustomed to having the time on my wrist and all that other stuff.

However, what it lacks in features, the UP24 more than makes up for in comfort. Makes sense, fewer features, smaller form factor, but even compared to the other fitness bands I’ve worn (the Fuelband and Misfit Shine), the rubbery industrial design makes it nice to wear.

Aside from comfort, surprisingly, one feature that made the UP24 sticky and enjoyable was the Smart Coach, which I expected to dislike. Jawbone has a very usable mobile app companion that all its devices share, and inevitably, that is what retains users, not the hardware on the wrist.

Overall, despite its relative age, I enjoyed wearing the UP24. I even decided to wear it a bit longer, hence the extra day.


Here are my observations.

The band

Yes, there’s yet another initial software install required to configure the UP24 for use the first time. Yes, that still annoys me, but I get why it’s needed.

As I’ve said, the band is comfortable to wear, mainly because of its flexible, rubber material. Smart Coach reminded me a few times to be gentle with the band, saying something about there being a bunch of electronics packed in there.

I’m not sure if this was a regular reminder or if the band somehow detected that I was being too rough, hoping for the former. The Coach also reminded me that the band isn’t waterproof. While I did get it wet, I wasn’t brave enough to submerge it.

These reminders made me curious about the sensors Jawbone packed inside the UP24, and while looking for a teardown, I found this cool X-ray of the band.


Image from Creative Electron

Impressive industrial design. One minor correction, the audio plug is 2.5 mm, not the standard 3.5 mm, something Noel and Luis found out quickly. From my use, it didn’t really matter, since the UP24 comes with a custom USB-2.5 mm audio adapter for charging.


The UP24 uses a button to set specific modes, like Stopwatch (for exercise) and Sleep. These took a bit of learning, like anything new. I expected to have push-sequence failure, i.e. using the wrong push and hold combination, but no.

Aside from being red, which seemed to fade to orange, the band is unobtrusive. I found myself wearing it upside down to allow for scratch-free typing, a very nice plus.

The fit did seem to loosen over time, probably just the rubber losing some of its elasticity. Not a big deal for a month, but not a good long-term sign.

The battery life was nice, about nine days initially, but the app seems to misrepresent the remaining charge. One night, it reported five days of charge left, and overnight, the band died. The same thing happened a week later, when the app reported seven days of charge.

Because the UP24 isn’t constantly connected to Bluetooth (to save battery, I guess), maybe the charge wasn’t reported accurately. Although when the app opens, the band connects and dumps its data right away.

Bit of a mystery, but happily, I didn’t lose my sleep data, which tells me the band still had some charge. The sleep data it collected on those nights wasn’t as detailed as the other nights. Maybe the band has some intelligence to preserve its battery.

Sleep data from a low battery. Sleep data from a charged battery

The UP24 didn’t attract the same amount of curious attention that the Basis Peak did, thank you Apple Watch, but a few people did ask what Fitbit I had, which tells me a lot about their brand recognition.

Is Fitbit the Kleenex of facial tissue? The Reynolds wrap of aluminum foil?

The app and data

Jawbone only provides the data collected by its bands and the Smart Coach through its mobile apps. The web app only manages account information, which is fine, and as a bonus, you can download your device data in CSV format from it.

There are, however, several different Jawbone UP mobile apps, so finding the right one was key.

The app is quite nice, both visually and informationally. I really like the stream approach (vs. a dashboard), and again, Smart Coach is nice. Each day, I checked my sleep data and read the tips provided, and yeah, some were interesting.

The stream is easily understood at a glance, so kudos to the UX. Orange shows activity, purple sleep. There are other things you can add, weight, mood, etc. I did those for a few days, but that didn’t last, too lazy.


Each item in the stream can be tapped for details.

Unlike the Fuelband and the Peak, the UP24 uses very minimal game mechanics. The Smart Coach did congratulate me on specific milestones and encourage me to do more, but beyond that, the entire experience was free from gamified elements.


Did I mention I liked the Smart Coach? Yeah, I did.

In addition to the stream, the UP24 provides historic data as days and aggregated into months and years, which is equally nice and easy to understand.


Jawbone has an integration with IFTTT among many other apps, making its ecosystem attractive to developers. I didn’t find any IFTTT recipes that made sense for me, but I like having the option.

There’s social stuff too, but meh.

Data sync between the band and app was snappy. As I mentioned above, the band isn’t always connected to Bluetooth, or at least, you won’t see it in the Bluetooth settings. Maybe it’s connected but not listed, dunno, but Noel would.

One minor downside I noticed: sleep tracking is an absolute mystery. The UP24 lists both light and deep sleep, but who knows how it can tell. Not that I really need to know, but looking at its guts above, what combination of sensor data would track that?

Speaking of sensors, nearly every run I completed on a treadmill showed a wide variance, e.g. the treadmill says 3.25 miles, whereas UP24 says 2.45 miles. I tried calibrating the band after each run, but that didn’t seem to help.

I saw the same variance with steps.
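For what it's worth, calibration conceptually boils down to a simple scale factor; here's a hypothetical sketch using the numbers from my treadmill run (who knows what Jawbone actually does under the hood):

```javascript
// Derive a correction factor from a known distance, then apply it to
// future band readings. (Illustrative only, not Jawbone's real algorithm.)
function calibrationFactor(actualMiles, reportedMiles) {
  return actualMiles / reportedMiles;
}

function correctedDistance(reportedMiles, factor) {
  return reportedMiles * factor;
}

const factor = calibrationFactor(3.25, 2.45); // ≈ 1.33 from the run above
console.log(correctedDistance(2.45, factor).toFixed(2)); // "3.25"
```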

Not a big deal to me, and definitely a difficult nut to crack, but some people care deeply about the accuracy of these devices, like this guy who filed a lawsuit against Fitbit for overestimating sleep.

What I’m finding through personal experience and stories like that is that these little guys are very personal devices, much more so than a simple watch. I actually felt a little sad to take off my UP24.

I wonder why. Thoughts?

Find the comments.

Watch-First Design and Development

Tue, 2015-05-12 17:36

 

So as you might already know, it has been all about THE Watch these past days.

Launcher Home

So, having this new toy on my wrist made me want to explore the possibilities, and I set out to push my skill boundaries and dove right into WatchKit development. To kick the tires, I spent this past weekend doing what I like to call “Noel’s Apple Watch weekend hackathon,” my favorite kind of event, because somehow I always end up as a finalist.

Detail Glance

So, as the title suggests, I focused on watch-first design (remember mobile-first? That’s so 2014!). My goal was to start with a Watch app as the main feature and not even worry about a mobile companion app. As it stands now, Apple Watch, as well as Android Wear, rely on “parent” mobile apps.

The result of my weekend fun was an app that I simply called “MyFamily.” The idea is to add simple reminders, tasks, goals, etc., for each individual member of my little family (whose names, btw, have been changed). The app includes an Apple Watch “Glance,” which is a sort of widget or live tile with very limited dynamic content and interaction.

Having such limited real estate and features really makes you think twice about how you want to present your data. The WatchKit interface objects are limited to a small subset of their parent iOS counterparts. Most of the design layout can be done by grouping labels (WKInterfaceLabel), images (WKInterfaceImage), and a couple of other available interface objects (table, separator, and button).


Having no keyboard (thank goodness!), one needs to rely on voice input to enter new data. During my tests, the voice recognition worked as advertised. Also during this exercise, I finally realized that apps can display a “contextual” menu when you “force touch” the screen. I opted to put in a text hint (to delete an item), because even after a couple of weeks of wearing the watch, I hadn’t realized this feature was available.

Speech Menu

After creating my Storyboard layouts, it was almost trivial to add data to them. I created custom classes to bind to each Interface Controller, overrode the lifecycle events (awakeWithContext, willActivate, didDeactivate), created a “member” object and an “event” object, and finally added data to the tables with something like this:

- (void)setupTable
{
    // Load the family members backing the table.
    _membersData = [Member membersList];
    // Tell WatchKit how many rows to create and which row controller to use.
    [tableView setNumberOfRows:_membersData.count withRowType:@"MemberRow"];
    for (NSInteger i = 0; i < tableView.numberOfRows; i++)
    {
        NSObject *row = [tableView rowControllerAtIndex:i];
        Member *member = _membersData[i];
        MemberRow *memberRow = (MemberRow *) row;
        // Bind each member's image, name, and event count to the row's outlets.
        [memberRow.memberImage setImage:[UIImage imageNamed:member.memberImage]];
        [memberRow.memberName setText:member.memberName];
        [memberRow.memberEventCount setText:member.memberEventCount];
    }
}

In conclusion, the WatchKit DX (development experience) is pretty smooth. This is due to the limited and minimalistic set of tools available to you. I suspect I will add more functionality to this app in the future by adding “mobile-second and web-third” design. Oh, and maybe even going “public” and putting it in the App Store.


Photo Proof
