
Oracle AppsLab

Driving Innovation

Autonomous Quadcopters Playing Some Catch

Wed, 2014-09-17 16:04

Tony recently went to a talk by Salim Ismail (@salimismail), the Founding Executive Director of Singularity University. He may/may not post his thoughts on the talk, which sounds fascinating, but this video is worth sharing either way, and not just because we have quadcopter fever.

Yeah, that’s autonomous flight, so refer to the list of horrifying things that should not be allowed.

Filler or Curated Content?

Wed, 2014-09-17 15:30

I consider these types of posts to be filler, but I suppose you could look at it as curated content or something highbrow like that. Take your pick.

10 Horrifying Technologies That Should Never Be Allowed

I scanned this post first, thought it would be interesting and left it to read later. Then I read it, and now, I’m terrified. Here’s the list; make sure to hit the link and read all about the sci-fi horrors that aren’t really sci-fi anymore.

  • Weaponized Nanotechnology
  • Conscious Machines
  • Artificial Superintelligence
  • Time Travel
  • Mind Reading Devices
  • Brain Hacking Devices
  • Autonomous Robots Designed to Kill Humans
  • Weaponized Pathogens
  • Virtual Prisons and Punishment
  • Hell Engineering

xkcd on watches

This is exactly how I feel about watches.

This is Phil Fish

I only know who Phil Fish is because I watched Indie Game: The Movie. This short documentary by Ian Danskin is quite good and is newsworthy this week thanks to Markus Persson’s reference to it in his post about why he’s leaving Mojang, the makers of Minecraft, after Microsoft completes its acquisition of the company (h/t Laurie for sharing).

I have often wondered why so many people hate Nickelback, and now I have a much better understanding of why, thanks to Ian. Embedded here for your viewing pleasure.

https://www.youtube.com/watch?v=PmTUW-owa2w

Wearables Should be Stylish

Tue, 2014-09-09 13:18

To no one’s surprise, Apple announced the Apple Watch today.

Very apropos because I just read Sandra Lee’s (@SandraLee0415) post over on Usable Apps about fashionable tech, one of Ultan’s (@ultan) main talking points about wearables.

Ultan, our wearables whisperer, has style and flair; if you’ve ever met him, you know this. His (and Sandra’s) point about wearable tech needing to be stylish is one that Apple has made, again, to precisely no one’s surprise. Appearance matters to people, and smartwatches and other wearables are accessories that should be stylish and functional.

The market has spoken on this. Case in point: the Android Wear smartwatch people want is the round Moto 360, which sold out in less than a day earlier this week.

The Apple Watch looks very sleek, and if nothing else, the array of custom bands alone differentiates it from smartwatches like the Samsung Gear Live and the LG G Watch, both of which are also glass rectangles, but with boring rubber wristbands.

I failed to act quickly enough to get a Moto 360 and settled instead on a Gear Live, which is just as well, given I really don’t like wearing watches. We’ve been building for the Pebble for a while now, and since the announcement of Android Wear earlier this year, we’ve been building for it as well, comparing the two watches and their SDKs.


Like Google Glass, the Gear Live will be a demo device, not a piece of personal tech. However, for Anthony, his Android Wear watch has replaced Glass as his smartphone accessory of choice. Stay tuned for the skinny on that one.

I haven’t read much about the Apple Watch yet, but I’m sure there will be coverage aplenty as people get excited for its release early in 2015. Now that Apple’s in the game, wearables are surely even more of a thing than they were yesterday.

And they’re much more stylish.

Find the comments.

On Disney Parks, Data Science, Drones and Wearables

Fri, 2014-09-05 09:16

As the parent of a toddler, I have no choice but to pay attention to Disney and its myriad of products and services.

Case in point, this summer we took our daughter to Disneyland for the first time, which was a whole thing. Pause to h/t Disneyland expert, Friend of the ‘Lab and colleague Kathy for all her park and travel protips.

Being who I am, I found myself wandering around Disneyland and California Adventure thinking about how many hardcore analytics geeks they must employ to come up with systems like FASTPASS.

For the unfamiliar, FASTPASS is a system that allows you to skip some, if not all, of the line-standing for the most popular attractions in the parks. Although it’s difficult to explain in words, the system is rather simple once you get your first pass.
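To make the idea a little more concrete, here is a toy Python sketch of how a virtual queue might hand out timed return windows based on demand. The capacity numbers and window logic are invented purely for illustration; nothing here describes how Disney’s actual system works internally.

from datetime import datetime, timedelta

# Purely hypothetical sketch of a virtual-queue return window.
# Hourly ride capacity and the share reserved for passes are made-up numbers.
HOURLY_CAPACITY = 1800
FASTPASS_SHARE = 0.6  # fraction of capacity reserved for pass holders

def next_return_window(passes_issued_today, park_open=datetime(2014, 9, 5, 9, 0)):
    """Return the (start, end) of the next available one-hour return window."""
    per_window = int(HOURLY_CAPACITY * FASTPASS_SHARE)
    window_index = passes_issued_today // per_window
    start = park_open + timedelta(hours=window_index)
    return start, start + timedelta(hours=1)

start, end = next_return_window(passes_issued_today=2500)
print("Return between {:%H:%M} and {:%H:%M}".format(start, end))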

Being in the park, you can feel all the thought and craft that has gone into the experience. Disney is a $45 billion company, and it’s no surprise their R&D is cutting edge. But what makes it so successful?

Attendees of Disney parks are in a position very similar to that of enterprise employees in that they will gladly opt in to new technologies because the value they receive in return is clear and quantifiable.

To put it in concrete terms: if Google Glass helps me do my job more effectively, I’ll wear it. If I receive discounted benefits for wearing a fitness tracker, I’ll do it.

If a MagicBand allows me to leave my wallet in my room, not worry about losing the room keycards, and use FastPass+, I’ll wear it, even though it will allow Disney World to track my location at a very fine-grained level. Who cares? FastPass+ is worth it, right?

Odd branding note: the official ways to write these two terms are indeed FASTPASS and FastPass+, according to Disney’s web site.

If you’re interested in reading more about the MagicBand, what’s inside and how Disney uses it at Disney World, check out Welcome to Dataland. Imagine all the data science that goes into creating and iterating on these enormous data sets; this is embiggened Big Data when you consider Disney parks occupied the top eight spots in the 2012 Theme Park Index, comprising well over 100 million visits.

It boggles my mind, although for someone like Bill, it would be Christmas every day.

The post also recounts Walt Disney’s futurist vision, which seems to drive their R&D today. It also encapsulates my point nicely:

Rather, because Disney’s theme parks don’t have the same relationship to reality that Google and Costco and the NSA do. They are hybrids of fantasy and reality.

I read Welcome to Dataland only because I’d just been to Disneyland myself. Then came news that Disney had filed several patents concerning the use of drones for its park shows, one for floating pixels, one for flying projection screens, one for transporting characters, h/t Business Insider.


We’ve been experimenting (ahem, playing) with quadcopters, and it struck me that Ultan (@ultan) had sent me a Disney video about customized wearables. This one:

That was posted in August 2012.

So beyond casual interest as the father of a daughter who loves Disney Princesses, suddenly it’s obvious that I need to watch Disney much more carefully to see how they’re adopting emerging technologies.

Oh, and become a willing data point in their data set.

Find the comments.

Behold: The Simplified UI Rapid Development Kit

Wed, 2014-09-03 14:49

Editor’s note: The recent release of the Oracle Applications Cloud Simplified User Interface Rapid Development Kit represents the culmination of a lot of hard work from a lot of people. The kit was built, in large part, by Friend of the ‘Lab, Rafa Belloni (@rafabelloni), and although I tried to get him to write up some firsthand commentary on the ADF-fu he did to build the kit, he politely declined. 

We’re developers here, so I wanted to get that out there before cross-posting (read, copying) the detailed post on the kit from the Usable Apps (@usableapps) blog. I knew I couldn’t do better, so why try? Enjoy.

Simplified UI Rapid Development Kit Sends Oracle Partners Soaring in the Oracle Applications Cloud

A glimpse into the action at the Oracle HCM Cloud Building Simplified UIs workshop with Hitachi Consulting by Georgia Price (@writeprecise)

Building stylish, modern, and simplified UIs just got a whole lot easier. That’s thanks to a new kit developed by the Oracle Applications User Experience (OAUX) team that’s now available for all from the Usable Apps website.

The Oracle Applications Cloud Simplified User Interface Rapid Development Kit is a collection of code samples from the Oracle Platform Technology Solutions (PTS) Code Accelerator Kit, coded page templates and Oracle ADF components, wireframe stencils and examples, coding best practices, and user experience design patterns and guidance. It’s designed to help Oracle partners and developers quickly build—in a matter of hours—simplified UIs for their Oracle Applications Cloud use cases using Oracle ADF page types and components.


A key component of the simplified UI Rapid Development Kit—the Simplified User Experience Design Patterns for the Oracle Applications Cloud Service eBook—in use. Pic: Sasha Boyko, all rights reserved.

The kit was put to the test last week by a group of Hitachi Consulting Services team members at an inaugural workshop on building simplified UIs for the Oracle HCM Cloud that was hosted by the OAUX team in the Oracle headquarters usability labs.

The results: impressive.

During the workshop, a broad range of participants—Hitachi Consulting VPs, senior managers, developers, designers, and architects—learned about the simplified UI design basics of glance, scan, commit and how to identify use cases for their business. Then, they collaboratively designed and built—from wireframe to actual code—three lightweight, tablet-first, intuitive solutions that simplify common, everyday HCM tasks.

Sona Manzo (@sonajmanzo), Hitachi Consulting VP leading the company’s Oracle HCM Cloud practice, said, “This workshop was a fantastic opportunity for our team to come together and use the new Rapid Development Kit’s tools and techniques to build actual solutions that meet specific customer use cases. We were able to take what was conceptual to a whole different level.”


Great leadership. Hitachi Consulting’s Sona Manzo gets the whole team into the spirit of building simplified UIs. Pic: Martin Taylor, all rights reserved.

Workshop organizer and host Ultan O’Broin (@ultan), Director, OAUX, was pleased with the outcome as well: “That a key Oracle HCM Cloud solution partner came away with three wireframed or built simplified UIs and now understands what remains to be done to take that work to completion as a polished, deployed solution is a big win for all.”


OAUX Principal Interaction Designer Anna Budovsky (left) and Ultan O’Broin (right) facilitate Hitachi Consulting team members in working out solutions for customer use cases. Pics: Martin Taylor, all rights reserved.

Equally important, said Ultan, is what the OAUX team learned about “what such an Oracle partner needs to do or be able to do next to be successful.”

According to Misha Vaughan (@mishavaughan), Director of the OAUX Communications and Outreach team, folks are lining up to attend other building simplified UI workshops.

“The Oracle Applications Cloud partner community is catching wind of the new simplified UI rapid development kit. I’m delighted by the enthusiasm for the kit. If a partner is designing a cloud UI, they should be building with this kit,” said Misha.

Ultan isn’t surprised by the response. “The workshop and kit respond to a world that’s demanding easy ways to build superior, flexible, and yet simple enterprise user experiences using data in the cloud.”

The Oracle Applications Cloud Simplified User Interface Rapid Development Kit will now be featured at Oracle OpenWorld 2014 OAUX events and in OAUX communications and outreach worldwide.

Context in UX – What It Is, What It Isn’t, and Why It’s Important

Mon, 2014-09-01 19:54
Big Brown Bat (Eptesicus fuscus) in Flight

Copyright © 2012 Bill Kraus, all rights reserved.

Our location is relentlessly tracked by our mobile devices. Our online transactions – both business and social – are recorded and stored in the cloud. And reams of biometric data will soon be collected by wearables. Mining this contextual data offers a significant opportunity to enhance the state of human-computer interaction. But this raises the question: what exactly is ‘context’?

Consider the following sentence:

“As Michael was walking, he observed a bat lying on the ground.”

Now take a moment and imagine this scene in your mind.

Got it? Good.

Now a few questions. First, does the nearby image influence your interpretation of this sentence? Suppose I told you that Michael was a biologist hiking through the Amazonian rain forest. Does this additional information confirm your assumptions?

Now, suppose I told you that the image has nothing to do with the sentence, but instead it’s just a photograph I took in my own backyard and inserted into this post because I have a thing for flying mammals. Furthermore, what if I told you that Michael actually works as a ball boy at Yankee Stadium? Do these additional facts alter your interpretation of the sentence? Finally, what if I confessed that I have been lying to you all along, that Michael is actually in Australia, his last name is Clarke, and that he was carrying a ball gauge? Has your idea of what I meant by ‘bat’ changed yet again? (Hint – Michael Clarke is a star cricket player.)

The point here is that contextual information – the who, what, where, and when of a situation – provides critical insights into how we interpret data. In pondering the sentence above, providing you with context – either as additional background statements or through presumed associations with nearby content – significantly altered how you interpreted that simple sentence.

At its essence, context allows us to resolve ambiguities. What do I mean by this? Think of the first name of someone you work with. Chances are good that there are many other people in the world (or at your company, if your company is as big as Oracle) with that same first name. But if I know who you are (and ideally where you are) and what you are working on, and I have similar information about your colleagues, then I can make a reasonably accurate guess as to the identity of the person you are thinking of without you having to explicitly tell me anything other than their first name. Furthermore, if I am wrong, my error is understandable to you, precisely because my selection was the logical choice. Were you thinking of the Madhuri in Mumbai whom you worked with remotely on a project six months ago, while I guessed the Madhuri who has an office down the hall from you in Redwood City and with whom you are currently collaborating? OK, I was wrong, but my error makes sense, doesn’t it? (In intelligent human-computer interactions, the machine doesn’t always need to be right as long as any errors are understandable. In fact, Chris Welty of IBM’s Watson team has argued that intelligent machines will do very well to be right 80% of the time – which of course was more than enough to beat human Jeopardy champions.)

So why is the ability to use context to resolve ambiguities important? Because – using our example – I can now take the information derived from context and provide you with a streamlined, personalized user experience that does not require you to explicitly specify the full name of your colleague – in fact, you might not need to enter any name at all if I have enough contextual background about you and what you are trying to do.
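To make that concrete, here is a minimal Python sketch of how contextual signals could rank colleagues who share a first name. The people, projects, and weights are entirely hypothetical, invented for illustration rather than drawn from any real system:

# Hypothetical sketch: rank colleagues named "Madhuri" by contextual signals.
me = {"location": "Redwood City", "active_projects": {"HCM Cloud UX"}}

candidates = [
    {"name": "Madhuri", "location": "Redwood City", "projects": {"HCM Cloud UX"}},
    {"name": "Madhuri", "location": "Mumbai", "projects": {"Archived Project"}},
]

def context_score(candidate):
    score = 0
    if candidate["location"] == me["location"]:
        score += 2  # physically nearby
    score += 3 * len(candidate["projects"] & me["active_projects"])  # collaborating right now
    return score

best_guess = max(candidates, key=context_score)
print(best_guess["location"])  # "Redwood City": the logical, if fallible, choice

Even when the guess is wrong, the scoring makes the error explainable, which is the point of the paragraph above.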

When it comes to UX, context is actually a two-way street. Traditionally, context has flowed from the machine to the user, where layout and workflow – the consequence of both visual and interaction design – have been used to inform the user as to what something means and what to do next. But as the availability of data and the complexity of systems have grown to the point of overwhelming the user, visualizations and interactions alone are not sufficient to stem the tide. Rather, context – this time flowing from the user to the machine – is the key to achieving a simpler, more personalized user experience.

Context allows us to ask the right questions and infer the correct intentions. But the retrieval of the actual answers – or the execution of the desired task – is not part of context per se. For example, using context based on user identity and past history (demographic category, movies watched in the past) can help a recommendation engine provide a more targeted search result. But context is simply used to identify the appropriate user persona – the retrieval of recommendations is done separately. Another way to express this is that context is used to decide which view to put on the data, but it is not the data itself.

Finally, how contextual information is mapped to appropriate system responses can be divided into two (not mutually exclusive) approaches, one empirical, the other deductive. First, access to Big Data allows the use of machine learning and predictive analytics to discern patterns of behavior across many people, mapping those patterns back to individual personas and transaction histories. For example, if you are browsing Amazon.com for a banana slicer and Amazon’s analytics show that people who spend a lot of time on the banana slicer page also tend to buy bread slicers, then you can be sure you will see images of bread slicers.
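As a rough illustration of the empirical approach, here is a toy co-occurrence model in Python. The browsing sessions and items are made up, and real recommendation engines are far more sophisticated, but the basic pattern is the same: items seen together in the past get surfaced together in the future.

from collections import Counter
from itertools import combinations

# Invented sessions: sets of items people looked at together.
sessions = [
    {"banana slicer", "bread slicer"},
    {"banana slicer", "bread slicer", "egg slicer"},
    {"banana slicer", "garlic press"},
]

co_counts = Counter()
for session in sessions:
    for a, b in combinations(sorted(session), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, top_n=2):
    """Return the items most often seen alongside the given item."""
    scored = [(other, n) for (i, other), n in co_counts.items() if i == item]
    return [other for other, _ in sorted(scored, key=lambda x: -x[1])[:top_n]]

print(recommend("banana slicer"))  # bread slicer comes out on top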

But while Big Data can certainly be useful, it is not required for context to be effective. This is particularly true in the enterprise, where reasonable assumptions can be made from a semantic understanding of the underlying business model, and where information-rich employee data can be mined directly by the company. Are you a salesperson in territory A with customers X, Y, and Z? Well, then it is safe to assume that you are interested in the economic climate in A as well as news about X, Y, and Z without you ever having to explicitly say so.
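And a correspondingly simple sketch of the deductive approach: deriving a feed of topics purely from facts the business model already knows about an employee. The role, territory, and customer names below are invented for illustration.

# Deductive sketch: infer interests from what the business model already says.
employee = {
    "role": "salesperson",
    "territory": "Pacific Northwest",
    "customers": ["Acme Corp", "Globex", "Initech"],
}

def derive_interests(emp):
    """Map known facts about an employee to feed topics, no Big Data required."""
    topics = []
    if emp["role"] == "salesperson":
        topics.append("economic climate in " + emp["territory"])
        topics += ["news about " + customer for customer in emp["customers"]]
    return topics

for topic in derive_interests(employee):
    print(topic)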

So in closing, the use of context is essential for creating simple yet powerful user experiences – and like the term ‘user experience’ itself, there is no single implementation of context – rather, it is a concept that should pervade all aspects of human-computer interaction in its myriad of forms.