User Interface Design

When we began work on Aido’s UI, we realized it was a unique challenge: the interaction design had to factor in multiple elements at once – behavioral patterns, speech commands, motion, and a large visual interface.

Our challenge with Aido’s interface was to balance likeability and utility. We decided to use a faceplate as Aido’s primary visual interface. Our consumer research showed that this gave Aido a warm and friendly look. However, the research also highlighted that Aido should not appear ‘too close to being human’ – a threshold beyond which people begin to find robots creepy.

Designing the faceplate UI

Our UI design team, led by Divesh Jaiswal, iterated on multiple designs with prospective users to arrive at the faceplate UI. We finally split the faceplate visual design into three clusters:

1. The face

Eyes, they say, are windows to the soul. We subconsciously seek and meet the eyes of those we speak to, humans or pets. It would be natural to seek the same for a robot designed for the home.

Aido has a pair of friendly eyes as the primary communication medium. Aido blinks at regular intervals to indicate that he’s awake and ready to hear you.

The visual style of the eyes was chosen to reflect friendliness, but maintain a visual distance from being too human.

Aido has an iris that moves to indicate he has registered some visual input. A little bridge between the eyes balances them visually and serves as a voice guide when Aido speaks or listens. This reassures users that Aido is actively engaged with them during conversations.


2. Notifications

Aido can monitor and manage a lot of your smart home devices as well as connected services. In addition to controls, many of these send notifications that Aido has to display.

Aido uses large flat icons against a high-contrast background to display notifications. These have been tested to ensure that they can be seen at a distance or when the device is moving, both essential for the environment that Aido will be used in.

While users may ‘see’ a notification, they will often then ‘speak’ the next command. The visual interface accommodates both voice and touch commands, so you can spot a notification icon at a distance and simply ask Aido to act on it (e.g. see an email notification and ask Aido to read out the message).
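The see-then-speak flow can be sketched as a single handler that maps a short voice command onto the notification currently on screen. This is a minimal illustration only – the notification model, command phrases, and function names here are our own assumptions, not Aido’s actual software:

```python
# Hedged sketch: acting on a visible notification by voice.
# The Notification class and the command verbs are illustrative assumptions.

class Notification:
    def __init__(self, kind: str, body: str):
        self.kind = kind  # e.g. "email"
        self.body = body  # the message payload to read aloud

def act_on_notification(notification: Notification, spoken_command: str) -> str:
    """Map a short spoken command to an action on the on-screen notification."""
    if spoken_command == "read it":
        return f"Reading {notification.kind}: {notification.body}"
    if spoken_command == "dismiss":
        return f"Dismissed {notification.kind} notification"
    return "Sorry, I didn't catch that"
```

The same handler would sit behind a touch target as well, so tapping the icon and speaking the command converge on one code path.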


3. App Interface

In addition to running regular Android apps, Aido comes preloaded with a set of apps specially designed for it. It also has skins for commonly used functions like playing music, setting an alarm and more.

We consciously designed these interfaces to show only key visual elements on the UI. While Aido can run entirely on voice commands, having some key controls on screen brings an element of comfort to users who are accustomed to other technology devices.

Users can use voice commands to activate more functionality (e.g. “show me the complete playlist”), but most of this is hidden behind a menu icon – a planned redundancy. Once people get comfortable using voice commands with Aido, we expect them to rarely (or never) make a trip to the menu.

Putting the interface elements together

Given the three distinct visual elements that make up Aido’s UI, we needed a way to allow for seamless transitions between states. This transition needed to be simple to learn, but powerful enough to factor in complex interactions in day-to-day usage.

After trying out multiple visual transitions, we hit upon the solution – a Toblerone-inspired switch menu with one of the three visual elements (Face, Notifications, App) on each side.

A simple clockwise flip across the central axis would switch between Face and the App UI and an anti-clockwise flip from Face to a notification. The transition from the App UI to a notification is also just a flip away.
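The flip model above is effectively a three-state cyclic state machine. Here is a minimal sketch of it – the state names and method names are our own illustration, not Aido’s actual code:

```python
# Hedged sketch: the three-sided "Toblerone" switch as a cyclic state machine.
# Clockwise order follows the text: Face -> App, anti-clockwise: Face -> Notifications.

STATES = ["face", "app", "notifications"]  # clockwise order around the prism

class FaceplateSwitch:
    def __init__(self):
        self.index = 0  # Aido starts on the Face

    @property
    def state(self) -> str:
        return STATES[self.index]

    def flip_clockwise(self) -> str:
        # Face -> App -> Notifications -> Face
        self.index = (self.index + 1) % len(STATES)
        return self.state

    def flip_anticlockwise(self) -> str:
        # Face -> Notifications -> App -> Face
        self.index = (self.index - 1) % len(STATES)
        return self.state
```

Note that any state is at most one flip (in one direction or the other) from any other, which is what makes the App-to-notification transition “just a flip away”.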

Our preliminary testing with users indicates that this simple model helps them quickly get the hang of Aido’s states and navigate the UI with ease.

Factoring in behavioural elements

As Aido breaks new ground as a product category, we also considered the behavioral aspects of human-robot interaction.

Research shows that there is a ‘desirable interfacing distance’ – roughly, the distance at which a mobile home robot should stop while approaching a human. This ensures that the robot doesn’t violate a person’s zone of comfort. Aido maintains this distance from you unless you walk over to it.
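The stopping rule reduces to a simple distance check in the robot’s approach loop. The threshold value and names below are purely illustrative assumptions – the text does not state Aido’s actual distance:

```python
# Hedged sketch: stopping at a comfortable approach distance.
# The 1.2 m threshold is an assumed placeholder, not Aido's specification.

DESIRABLE_INTERFACING_DISTANCE_M = 1.2

def should_keep_approaching(distance_to_person_m: float) -> bool:
    """Keep moving only while still outside the person's comfort zone."""
    return distance_to_person_m > DESIRABLE_INTERFACING_DISTANCE_M
```

In practice the person may close the remaining gap themselves, which matches the text: Aido holds the distance unless you walk over to it.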

Voice is a key interface element with Aido. While having a conversation, people need an acknowledgement that they’ve been heard. Humans have various visual cues to demonstrate this – direct eye contact and head shakes, for instance.

Aido plays a short ‘ding’ sound to confirm it has finished hearing you. Cues like these make Aido appear more connected, rather than just a gadget.

Aido also has haptic sensors that let you tap it to close a task or wake it up from the sleep state. We found that this simple gesture made Aido appear friendly and connected.

We’ve put in a lot of work thinking through Aido’s interface elements and are waiting for users to start welcoming Aido to their homes. The dream of every designer is to have a happy user. We believe that with Aido, we will have many happy users who see robots as friendly companions at home.