
Oracle AppsLab

Driving Innovation

Saying Wearables in Spanish

Fri, 2014-05-23 08:53

Friend of the ‘Lab, Bob Rhubart (@otnarchbeat) recently recorded a segment with our own Noel (@noelportugal) and Sarahi Mireles (@sarahimireles), a UX developer from our Mexico Development Center.

The topic was wearables, but I only know this because they told me. Google Translate wasn’t very helpful, unless “Manos libres y vista al frente: Con el futuro puesto” means “Handsfree and front view: With the future since.” A closer rendering would be “Hands free and eyes forward: wearing the future.”

Anyway, enjoy.

Update: Noel pointed me to an English version on the same topic.


Whereable is the Killer Wearable Translation Use Case? Glass, Word Lens and UX

Wed, 2014-05-21 08:45

Editor’s note: Here comes a guest post from our old pal and colleague, Ultan O’Broin (@ultan). You can read more of his thoughts and musings at User Experience Assistance and Not Lost in Translation. Enjoy.

Whereable is the Killer Wearable Translation Use Case? Glass, Word Lens and UX
By Ultan O’Broin

I just can’t escape the Word Lens-AppsLab vortex. I blogged about the Quest Visual Word Lens augmented reality (AR) mobile translation app for Jake (@jkuramot) a while ago. I’m involved in user experience (UX) outreach with Noel (@noelportugal) or Anthony (@anthonyslai) demoing the Google Glass version of Word Lens. Blogged about that, too.


Noel does his translation thang with Word Lens on Google Glass in the UK.

Now, Google have just announced an acquisition of Word Lens. That’s good news for the Glass version. The current version “works” but hardly at the level UX aspires to. The AR translation is impacted by stuff like how much the Glass wearer’s head moves (or how often you drink in certain bars in San Francisco), the fonts used in the target image, and so on. Likely this acquisition will mean Google Translate’s overall UX improves, offering an upped experience in terms of optimized UIs on different devices, all pivoting around cloud-based, real-time translation of a wide range of language combinations.

Up to now, these mobile translation apps (there’s a ton of them) seem like a hammer in search of a nail. Finding a consumer nail, let alone an enterprise one, often leaves me scratching at the bottom of the toolbox. Besides translation quality, contextual factors are forgotten: stuff like the cost of operation or of the device itself, or the very environment you’ve got to work in.

Take Word Lens on Glass. Great for wearables ideation, the promise of an immersive UX, and the AR potential, and people just love the awesome demos. But would you ever use it for real, right now?

Consider this: I’m a Glass Explorer and a runner. I recently did a 10 miler in Tel Aviv using Strava Run Glassware. Yeah, more of our global experiment to see if normal people would notice or care about Glass being in their faces (they didn’t).


Strava Dashboard with that Tel Aviv run using Google Glass

It was a great user experience, sure.  But the cost of using my Glass tethered to my EU-registered Samsung Galaxy SIII for data on the move forced me back to reality: nearly 33 EUR (45 USD, today) in roaming charges. Over 3 Euros a mile.

Of course, there’s also the cost of Glass itself. Effectively, with taxes and bits added, it’s 1700 USD (1250 EUR). Not cheap. So, consider adding another real-world problem: running around sweating on your 1700 bucks. Nothing that some handy tape and a plastic bag can’t deal with, in a sort of Nerdy 2.0 duct-tape eyeglasses repair way. But not a UX for the stylish.


Customizing Google Glass for runners on the go. Cut tape and a plastic bag over the bone-conducting part of the device to keep out sweat.

I’ve no idea what Word Lens on Glass would cost to translate a foreign-language dinner menu, billboard or sign when away on vacation. But I’d bet that if you’re going to try more serious translation tasks and stay connected while doing so, it’ll likely be cheaper and a lot easier to just man up and ask someone local. Unless the app is usable offline … and works outside in the rain.

Time will tell where Google Glass and Word Lens end up. The message from all this is that in the enterprise, when it comes to gathering user requirements for wearable (and other) tech, it’s about more than just the end user, and about more than taking technology at face value. Context is king.

Oh, we’ve a course on that, too.

Raspi Shutdown Key

Tue, 2014-05-20 02:11

Noel (@noelportugal) is a clever dude. He’s also passionate. If you’ve ever met him, you already know these things.

Although I haven’t yet jumped into Raspberry Pi, despite Noel’s unbridled passion for the little-computer-that-could, I have captured some metadata about it, just from being around him and his enthusiasm.

For example, I know the Raspi needs to be shut down a certain way, or bad things happen. I recall Noel being very specific about this at the Raspi hackday he ran in January.

Since he created a Maker Faire DIY activity based around the Raspi and Embedded Java, Noel needed an easy and standardized way to shut down his Raspis without the benefit of any standard peripherals: no keyboard, no mouse, no monitor.

So, he made a key because of course he made a key. Duh.


One of Noel’s Raspis and the shutdown key

I saw the guys use this key a few times during Maker Week, and I’m not entirely sure what’s on it, e.g. scripts or magic bits, but it certainly made shutting down more than a dozen Raspis a breeze.

I hope he’ll share the magic, fingers crossed.

Update: Noel shared the secret sauce in the comments, and I’ll add it here for completeness.

So here is the secret sauce:

#Get Device Id and Vendor Id from USB Wifi adaptor.
$ lsusb

Bus 001 Device 004: ID 0bda:8176 Realtek Semiconductor Corp. RTL8188CUS 802.11n WLAN Adapter

#Create a new udev rule
$ sudo vi /etc/udev/rules.d/10-local.rules


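# When the USB device with this vendor/model ID is removed (the key is pulled), run shutdown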
ACTION=="remove", ENV{ID_VENDOR_ID}=="0bda", ENV{ID_MODEL_ID}=="8176", RUN+="/sbin/shutdown -h now"


$ sudo udevadm control --reload-rules

If you want to make a shutdown key, make sure you use a USB drive and not a USB wifi/bluetooth dongle. Hot-plugging a powered device will cause the Raspberry Pi (Model B) to restart.

AppsLab at the Maker Faire

Mon, 2014-05-19 11:10

Last week was Maker Week for us, and the culmination was the Maker Faire over the weekend.

This was my first Maker Faire, and wow was it an absolute blast. I highly recommend taking in one of their events if there’s one in your town. Perhaps the best attribute of the Maker Faire is that it includes not just technology, but also handicrafts, woodcraft, pretty much every kind of craft. Walking around, you can feel the passion of everyone there, and it’s infectious.

Anyway, Oracle and Java were present in a big way at the Faire, and our team’s role was to build the DIY activity for the Java area in the Expo Center. Noel (@noelportugal) created an activity he calls “If this, then Pi” which pays homage to one of our favorite services, IFTTT, and highlights the power of Raspberry Pi and Embedded Java.

IFTTPi takes an input:

  • Twitter hashtag
  • Text message
  • Proximity sensor
  • Light sensor

And does something:

  • Tweets
  • Sends a text
  • Makes a call
  • Takes a picture
  • Turns on a fan
  • Rolls a Sphero
  • Moves a robot arm
  • Changes the color of a Philips Hue bulb

Noel’s if-this-then-pi console

Noel and a few of the guys (Anthony, Raymond, Tony) put all this together in less than a month with some inspiration and guidance from Hinkmond Wong. Aside from showcasing Embedded Java on the Raspi, the overall goal was to show how actions can trigger IoT events with limited human interaction.
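
To make the pattern concrete, here’s a rough sketch of how such a trigger-to-action loop might look in plain Java. The names (Trigger, Action, Rule) and the fake sensor are purely my own illustration, not the actual IFTTPi code, which Noel plans to share.

import java.util.List;

// Minimal, illustrative sketch of an if-this-then-that style loop (not the real IFTTPi code).
public class IfThisThenPi {

    interface Trigger { boolean fired(); }   // e.g. hashtag seen, text received, sensor tripped
    interface Action  { void run(); }        // e.g. tweet, roll a Sphero, take a picture

    static class Rule {
        final Trigger trigger;
        final Action action;
        Rule(Trigger trigger, Action action) { this.trigger = trigger; this.action = action; }
    }

    public static void main(String[] args) throws InterruptedException {
        // One stand-in rule: a fake "sensor" that occasionally fires and rolls the Sphero.
        List<Rule> rules = List.of(
                new Rule(() -> Math.random() < 0.1,
                         () -> System.out.println("Rolling the Sphero!")));

        while (true) {
            for (Rule rule : rules) {
                if (rule.trigger.fired()) {
                    rule.action.run();
                }
            }
            Thread.sleep(1000); // poll the triggers once a second
        }
    }
}

In the real activity, each trigger would poll something like a Twitter hashtag, a text message or a proximity sensor, and each action would drive one of the outputs above, say a Hue bulb, the robot arm or the camera.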

Aside from some network hiccups, the DIY activity was a big hit. We also brought along Noel’s Raspi target to entertain kids of all ages.

Noel also gave a short talk on Saturday; check it out here or on YouTube.

As he promises in the video, Noel expects to share the code he and the guys wrote for IFTTPi soon, and I’m going to nag him for a technical post on the whole shebang, sooner rather than later.

A nostalgic note: both Paul (@ppedrazzi) and Rich (@rmanalan) were at Maker Faire on Saturday, and they happened to come visit us at the Java DIY area, creating an impromptu AppsLab reunion. Great to see those guys. They helped lay the groundwork for what we’re doing today.

Update: Almost forgot my favorite story from Saturday. We were in the Expo Hall, no AC, and it was hot and stuffy. Someone asked facilities if the doors could be opened to let some breeze into our area. Facilities said they had no more doorstops, and a guy said he could print some up on his 3D printer. I don’t know if he did, but the doors were eventually opened.

You wouldn’t get that kind of answer at any other event. Hilarious and very typical of the Maker Faire. No problems, only solutions.

Another update: We plan to take a scaled-down version of IFTTPi on the road with us to conferences, so look for it, possibly as soon as Kscope14 next month.

Enjoy some shots I took around the Faire and find the comments.

[Slideshow of photos from the Faire]

Randomly Humorous Correlations

Fri, 2014-05-16 11:15

Love this so much: Tyler Vigen’s Spurious Correlations randomly compares data sets and surfaces correlations between them. Frequently, hilarity ensues.


As with The Onion, I’m sure its “findings” will be misrepresented as serious news soon.

h/t Flowing Data


Tweets from Jeremy’s MakerCon Keynote

Thu, 2014-05-15 14:18

Yesterday, our fearless leader, Jeremy Ashley (@jrwashley), gave a keynote at MakerCon. Unfortunately, I had to miss it, but the guys reported a positive reaction. Noel (@noelportugal) pointed me to tweets for rapid reaction.

Here’s a sample:

[Three embedded tweets]

Yeah, I cherry-picked, but you can read the reactions yourself if you don’t believe me.

The keynote was recorded, and if it becomes freely available, I’ll embed it here. We also have awesome pictures on the way.

So, stay tuned for more Maker Week content.

OK Google, Where’s My Car?

Thu, 2014-05-15 02:24

Google Now recently added a Parking Location card to help you solve the classic dude-where’s-my-car problem. According to The Verge:

The company’s Google Now assistant will now recognize when you’ve left a moving vehicle and automatically keep track of your car’s last location. There’s no magic happening here: Google does all of this using your smartphone’s bevy of sensors. It’s essentially guesswork, and the company readily admits that it may sometimes guess wrong. “You may see parking location cards even if you didn’t park your car,” the company says on a help page addressing the new feature. “For example, these cards could show up after you exit a bus or a friend’s car.”

A directionally challenged person like me will enjoy this feature. Of course, Google can get me to the parking structure, but I’ll probably still have to walk around clicking the alarm, playing Marco Polo with the car.

I accidentally used this feature today when I dropped off a rental car at San Jose International Airport.


Although it hasn’t yet, I sincerely hope Google adds the ability to recall the Parking Location with the phrase “OK Google, where’s my car?”, which would make for any number of fantastic Easter eggs. OK, so maybe only Chet (@oraclenerd), Jeff (@thatjeffsmith) and I would enjoy them, but still, fantastic to us.

Google Now continues to impress me, and it’s becoming a necessary travel assistant.

Thoughts? Comments, find them.

A Week with the Amazon Fire TV

Wed, 2014-05-14 02:18

Even though I’m more than content with the Chromecast, the Amazon Fire TV caught my eye for a very simple reason: Amazon Prime content.

I’ve been buying digital content from Amazon since they launched their MP3 Store, the first place to buy music without any DRM, back in 2007, and Amazon is the only place to get stuff like Dora and other Nickelodeon shows. Yeah, that’s a parenting problem.

Amazon doesn’t support the Chromecast and probably won’t anytime soon, and they don’t have an Instant Video app for Android, which is limiting for my household. So, I’ve been stuck using the Amazon app for my Smart TV, which isn’t ideal.


Aside from the app being slow to launch and laggy, I can guess how much development support Amazon gives its Smart TV apps compared to its Android, iOS and Fire OS apps.

So, when the Fire TV was announced, it immediately intrigued me. The capper came a couple of weeks later, when Amazon announced that much of HBO’s library would stream exclusively to Prime Instant Video. I’ve been putting off watching “The Wire” for years, and I can’t wait to rewatch “Oz.”

Another interesting aspect of the Fire TV is that Amazon mined comments on other, similar set-top boxes (e.g. Apple TV, Roku), using them as a large focus group of sorts. It’s rumored Amazon did this prior to launching the Fire HD as well.

This makes a ton of sense. Why wouldn’t Amazon look to compete in strategic areas where demand is strong and where they have a willing group of test subjects? This has to be in the fast-follower handbook.

Anyway, it’s been about a week since I got my Fire TV, and it’s about time to share some impressions.

Pros

Amazon claimed the Fire TV was fast, and the Fire TV is fast, screaming fast, like instantaneous fast. Compared to other ways I’ve used Instant Video, including the web app, it’s a joy to rummage through Amazon’s streaming collection, and videos play immediately. No snake chasing on a turntable, no spinning beach ball.

The Fire TV has Netflix and Hulu Plus apps, which means I can consolidate all my viewing on a single device. That’s nice, but the best part for me is that these apps are also significantly faster to load than the Chromecast or Smart TV versions.

Cons

I’ve read some poor reviews of the Fire TV; in fact, most of the reviews I’ve seen have been mixed at best. I suppose this is because the reviewers are comparing it to other set-top streaming boxes. I’ve only used the Chromecast, which I still love, and the Fire TV measures up well against it.

The two devices actually fit nicely into my household, with the Fire TV on the primary, living room TV, and Chromecasts on the smaller TVs in other spots.

My only con is Voice Search, which Amazon is pushing hard as an innovative, differentiating feature. I tried it for kicks, and it took a couple of tries for it to understand me. I may search for Gary Busey for giggles, but mostly, this just feels like a gimmick.

So, for me at least, the Fire TV is awesome. It fills a need and does so very nicely.

Thoughts? Find the comments.

The Nymi Nears Production

Tue, 2014-05-13 02:31

Noel (@noelportugal) chatted with the Bionym guys at SXSW earlier this year, and I know I speak for all of us here in the ‘Lab when I say I’m very stoked to get my Nymi and start playing with it.

Check out Techcrunch’s writeup and demo video.

Since I can’t embed TC’s video for some annoying reason, here is ninja cat instead.

Just pretend really hard, or watch the video over at TC.

Aside from just being an end user (please, AgileBits, integrate Nymi with 1Password), I’m super excited about the cool demos we can do with the wristband and the SDK. Already noodling ideas for next year’s cool demo.

Miscellaneous Cool Stuff

Mon, 2014-05-12 02:46

I’m finally harvesting my open browser tabs, sharing some of the cool stuff that has been patiently waiting to be shared with you, gentle reader.

Chrome Remote Desktop for Android

Chrome Remote Desktop has been quietly awesome for a few years, and Google recently extended it to Android. So now, I can troubleshoot my parents’ computer from the road. Yay.

Project Ara Gets Closer

Squee! I’m not really sure why, but I’m so geeked for Project Ara phones, i.e. Google’s upcoming modular smartphones. Design your phone from the sensors up to the screen size, or something like that.

Everything is DIY now, so why not?

PiPhone

Speaking of everything being DIY now, some clever bloke built a Raspberry Pi smartphone.

The UX Drum

Longtime friend of the ‘Lab, Floyd Teter (@fteter) wrote a post about the importance of UX. I concur.

Linksys WRT1900AC

And finally, because everyone gets excited about networking gear, especially this time of year, I give you my latest bit of nerd pr0n, the Linksys WRT1900AC. Short version: it’s a really fast wifi router, something every telecommuter should covet. Want the long version? Techcrunch did a review.

Speed comes at a price, namely $250, but I’m asking myself: why pay for a big pipe when wifi has always been the choke point?

And Finally

Things Fitting Perfectly into Other Things

How about you? Care to share your open browser tab nuggets?

You know what to do.

Projecting Multiple Android Device Screens

Fri, 2014-05-09 02:30

On this team, we all carry Android devices, lots of them, including phones. Even Noel (@noelportugal) has finally been converted.

Everyone on the team, minus me, is an Android developer, and as they build for new devices like Google Glass and the upcoming Android Wear watches, the ability to project screen images becomes more essential.


Case in point, at a recent sales event, I was showing a Glass app and companion tablet app that Anthony (@anthonyslai) and Raymond built as a concept demo for Taleo interview evaluations.

Using Glass for the first time requires training, so I typically use the screencast option of the MyGlass app to see what the wearer sees. In this case, I was also showing an app on the tablet, so I couldn’t keep the screencast running.

Similarly, when I’m showing Glass or any Android apps to a room of people, projecting the screen images is a bit of an adventure.

Necessity being the mother of invention, Anthony decided to address our collective need for better Android projecting by modifying Android Projector, an open source Java project, to support projecting from multiple Android devices.

You can find his code on GitHub.

Android Projector requires adb, part of the Android SDK. If you have adb, run:

adb devices

And copy the device ID you want to project. Then from the directory where you downloaded Anthony’s version of Android Projector, run:

./android-projector <device ID>

Want to show two devices? Open another terminal session, copy the other device ID, rinse, repeat.

And voila, you can see both devices’ screens. If you’re giving a demo, you can now project your laptop’s screen to show all the screens.


Google Glass and Nexus 7 screencasts on the same machine.

Pretty cool, eh? Find the comments.

It’s Almost Maker Week

Thu, 2014-05-08 09:39

Maker Faire Bay Area (@makerfaire) is coming up quickly, May 17 and 18, and we’re excited to be participating.


Here’s the backstory on why. Ultan (@ultan) and Justin (@kestelyn) spoke at Maker Faire 2012 (video), and the Java (@java) team is a major sponsor of the Maker Faire.

Unfortunately for him, Ultan won’t be around to attend, so when the Java team came looking for ideas for this year’s Faire, he asked if we’d help. Noel (@noelportugal), a longtime maker as well as a past Maker Faire attendee, jumped at the chance to represent Java and Applications User Experience at this year’s installment.

But wait, there’s more. This year, on May 13 and 14, just before the Maker Faire, there’s a new two-day conference, aptly named MakerCon. It focuses on the business of making and will be hosted at the Oracle Conference Center.

Our fearless leader, Jeremy Ashley (@jrwashley), himself an avid maker and tinkerer, will be delivering a keynote on May 13 to kick off the event.

So, for us at least, next week is Maker Week.

Noel has been feverishly assembling a DIY activity for the Java Embedded Playground at the Maker Faire involving some Internet of Things things and a bunch of Raspis. He teased these pictures to give a taste.

[Three teaser photos]

Not to spoil the fun entirely, but what he’s building is a set of triggers and results (a la IFTTT), all automated. Visitors will choose an input (e.g. a sensor), a condition (e.g. keyword, hashtag) and an output (e.g. robot arm, Sphero) and watch the magic of IoT happen.

I’m excited to try this myself, especially the Sphero, which looks like outrageous fun, h/t to Tony for that one.

If you’ll be attending the Bay Area Maker Faire next week, stop by, say hi and try out Noel’s activity. Bonus: we’ll be hanging with Hinkmond Wong (@hinkmond), he of tweeting turkey fame.

Update: Worth noting that longtime friend and honorary member of the ‘Lab, David Haimes (@dhaimes) will be joining us in the Maker Faire tent to help over the weekend. Come by and see us in all our IRL glory.