OKO Home AI

Year

2023

Theme

HCI Case Study; User research

Type

6-week Course Project (ArtCenter College of Design)

Team

Munchy W, Dian G, Jiacheng F

My role

Research; Prototype; UI Design; Video Editing

This is a research-based project from the Human-Computer Interaction course taught by Julian Scaff in 2023 at ArtCenter College of Design. Our team used research methods to envision future AI functions in home systems and tested our designs, focusing on user scenarios, projected interfaces, and gesture and voice interactions. I collaborated with my teammates on the research and was responsible for creating the story dialogue prototypes and interface prototypes.

OKO HOME

Future Casting

In 2030, what smart home experiences and interfaces will AI provide?

The future of smart home

In the future, technology in home living spaces will be more integrated, and the importance of smart home terminals will increase.
Image source: https://www.cbinsights.com/research/report/smart-home-future-trends/

Research Outcome

Our project deliverables include a series of user scenarios, home dashboard interfaces, and dashboard interactions.

User Scenario 1

Communicate with AI on the way home

After school, the user talks to the AI assistant, which helps him order dinner to his home.

User Scenario 2

Heads-up interface on home devices

A delivery courier arrives while the user is in the bathroom, so the AI relays the interaction through the bathroom mirror.

User Scenario 3

AI learning and proactive requests

The smart home offers to refine room settings as it learns the user's behaviors.

Design

Dashboard interfaces

Sleep mode & views (home app)

Our team decided that our AI would interact using dynamic, gradient circles, so I designed this expandable interface. It can adapt to all surfaces and screens in the home, including mirrors, walls, and floors. This example shows how the interface displays in different modes.

Room settings

Function variations by room (home app)

The outer-circle function buttons change with each room, so I created three room interface examples to showcase this functionality. For the icons, I chose simple, clear designs that align with habits users already have from mobile app icons.

Operation interfaces

Expanded content (home app)

I designed two functional interfaces: lighting control (common) and student study room settings (uncommon). When users press a function button, the rest of the content fades out to focus the interface on the current step.

Left: The lighting interface is a secondary screen, so its content appears on either side of the dashboard console.

Right: The student study room settings require entering the app, placing it on a tertiary screen with a back button on one side of the interface.

Mobile interfaces

Ordering food (home app)

Chatting mode (default): We added a mobile app to the smart home system, allowing users to talk to the AI assistant or control their home remotely when they're not at home. We placed the AI avatar at the center of the screen to encourage users to interact primarily with the AI for home settings. Users can access operations and settings by tapping the hamburger menu.

Dialogue mode: For interactions with the AI, we designed an expandable conversation interface, accessed by swiping up on the home page. We highlighted the speaking and listening states, as well as real-time content updates in the dialogue.

Popup component: We designed an interface example for when the AI retrieves restaurant and delivery information. This information stays visible in the conversation until the order is delivered.

Prototype

Dashboard interaction

I used Figma to prototype this use case and projected it onto the wall.

Research Approaches

We set up three groups of experiments, plus a user test of the dialogue design, to evaluate our work. We tested:
1. Color theory and the mental models evoked by the interface in different colors;
2. How notification sounds feel and how they connect to scenarios;
3. Gestures for interaction.

1. Color Study

Gradient and color choices for an AI feeling

Cool-toned colors were considered closer to the feeling of AI, especially blue.

2. Prompt Sounds Study


Mental models of sound effects led test participants to choose sounds that felt similar to the scenario.

1. Something is ready.

"Ding~"
Joy of Ready

2. Your delivery has arrived.

Joy of Food
Dropped

3. A guest is at the front door.

Door Bell Ring
Dong Dong~

3. Gesture Study


Users' preferences for interface feedback hinge on intuitive mappings, such as buttons spreading outward to mirror fingers spreading, and on the practicality of each gesture, considering accidental touches and input frequency.

4. AI Conversation Study

Dialogue Scripting

Users' choices mapped to how they expect to receive feedback from interfaces. For example, buttons spreading out from the center correspond to fingers opening, which is a strong mapping. At the same time, testers considered the quality of the result a gesture achieves to be an important selection criterion, such as the likelihood of accidental touches or how often the input method is used.