HAPPILEE, or how to represent computer feelings

2016 was a big year for me. I graduated from ECAL, and I got my first job as a design intern at PAN Studio, London. Among the various things I worked on there was a project called HAPPILEE, a physical movement game in which players must rescue a small, obsolete robot that is prone to dying when exposed to “data hotspots”. In the game, the only way to save HAPPILEE is to take it in your hands and walk through an invisible maze formed by the clouds of data produced by people’s mobile phones. The more people around you, the trickier the game becomes.
My former boss Sam Hill describes the narrative we developed around the project:

“Many years ago, HAPPILEE (High Aggregate Peer-to-Peer In-Location Empathy Emulator) was developed by a government think tank and optimistic machine learning laboratory for use in public spaces. The objective was to communicate the emotional state of an area by intercepting data from nearby phones and re-enacting any emotions conveyed, in real-time.
Unfortunately, nobody predicted the sheer volume of traffic that would exist today. Now HAPPILEE, broken and abandoned, exists on the brink of overload - desperately processing hundreds of emotional states at once.” (see full blog post here)

HAPPILEE is part of the Being There programme, which gathers several projects focusing on the relationship between humans and robots in public spaces. HAPPILEE is a collaborative experiment that brought together PAN Studio, creative technologist David Haylock, and Traian Abrudan from the Department of Computer Science, University of Oxford. While we worked on the game logic, behaviour and design of HAPPILEE, some awesome people worked on the location system, the programming and the communication protocol. In this article, I will mainly focus on PAN Studio’s part.

REFERENCES

So, how do you design a robot that is supposed to be obsolete, that processes emotions, and, more importantly, that you would actually want to engage with? To find the right design, we first made some visual investigations into various fields, and particularly into the ways robots mimic human emotions.

Baymax from the movie Big Hero 6. His face remains unchanged, but he can communicate via a screen located on his chest.

Mira, the small robot you can play peek-a-boo with. Even though it has no facial expressions, it can communicate through color changes and some really cute movements.

A really weird wearable robotic ring that reduces facial expression to its basics: eyes and a mouth.

FINDING THE RIGHT EMOTIONS TO DISPLAY

We wanted a simple, direct way to picture feelings. After some research on emotions, we selected a few that were significant enough to be integrated into the robot.

All the emotions we finally chose to display, as well as the color attributed to each.

A first attempt at overlaying the word with an image.

A FACE AND SOME COMPUTER-SUFFERING

We quickly decided that it was essential for HAPPILEE to have a face, because it would make it easier for people to engage with it.

One of our influences was GERTY from the movie Moon, the only companion of a lonesome astronaut. To make sure GERTY is clearly and unmistakably understood, it displays cartoony, exaggerated emotional expressions. We decided HAPPILEE would use the same medium of communication.


We also thought about “hardware pain”: mimicking signals that are universally interpreted as “pain” or “major issue” in the computer world. That involves the BSOD (blue screen of death), desperately blowing fans, overheating, and high-pitched beeping noises.

Anyone over 20 can relate to this blue screen. Displaying one is an easy way to tell the player that the robot / computer is in deep trouble and needs serious help.

FIRST DESIGNS

HAPPILEE is supposed to have been designed in the ’90s, so we had to take that into account, too. We did some research into what I call retro-futuristic aesthetics: colors, shapes and mechanisms that would suggest HAPPILEE was designed two decades ago.
The first part of the job was to design the shell of the robot. I produced a series of doodles exploring what HAPPILEE could be. We tried to make it look cute, broken and obsolete. We wanted to indicate that, when HAPPILEE was fully functional, it could move on its own, but that now it somehow no longer could.

Some doodles of HAPPILEE's appearance. It had to look broken, indicating that it was once able to move on its own.

Then we focused on HAPPILEE’s face (a screen), which is supposed to display the emotions the robot is “scanning” around it. It had to display emotions, but also start going really wrong when scanning too much information.

A video showing all of HAPPILEE's emotions. As the video progresses, HAPPILEE's state worsens, until it finally crashes and dies. This is the behaviour HAPPILEE would exhibit during the game: doing well when not much data is around, and getting worse during a data overload. Following these hints, the player would need to find safe zones, and avoid lethal ones, by walking around in the physical space.
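The core of that behaviour can be sketched as a simple mapping from the density of nearby data traffic to an emotional state. This is only an illustrative sketch, not the project's actual code: the function name, the device count as a stand-in for "data hotspots", and the thresholds are all assumptions.

```python
def happilee_state(nearby_devices: int) -> str:
    """Map the amount of surrounding data traffic (here approximated as a
    count of detected devices) to one of HAPPILEE's emotional states.
    Thresholds are made up for illustration."""
    if nearby_devices < 5:
        return "happy"       # safe zone: calm face, normal colors
    elif nearby_devices < 15:
        return "stressed"    # warning: glitchy face, fan spins up
    elif nearby_devices < 30:
        return "overloaded"  # critical: blue screen flickers, beeping
    else:
        return "dead"        # game over: full blue screen of death
```

In the game, the player's job is essentially to keep this function returning "happy" by physically moving HAPPILEE away from crowds.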

FIRST PROTOTYPE

We built a first prototype that you could actually hold in your hands. The screen would display emotions, and a fan would start blowing whenever the robot was in a state of “emotional overload”. We used a Raspberry Pi to control it, and added a power bank to make it wireless.
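The fan logic itself is tiny. Here is a hedged sketch of how it could look on the Pi: the class name and states are assumptions, and the GPIO pin is abstracted behind a callable so the logic stays hardware-independent (on a real Pi you would pass in something like a gpiozero output device's on/off control).

```python
class OverloadFan:
    """Drive the prototype's fan from HAPPILEE's emotional state.
    `set_pin` is any callable that switches the fan's GPIO pin on/off."""

    def __init__(self, set_pin):
        self.set_pin = set_pin
        self.running = False

    def update(self, state: str) -> bool:
        # Blow the fan only when the robot is in emotional overload
        # (or already dead) -- mimicking a desperately overheating machine.
        self.running = state in ("overloaded", "dead")
        self.set_pin(self.running)
        return self.running
```

Wiring this to the actual fan is then just a matter of supplying a pin-toggling function from whichever GPIO library the build uses.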

WHAT THE PROJECT IS NOW

After some (in fact a lot of) additional work, HAPPILEE was finally showcased at an event at the Bearpit, Bristol, on Friday 15 July 2016.

FURTHER INFO:

http://being-there.org.uk/
http://being-there.org.uk/project/happilee-0
http://www.watershed.co.uk/studio/events/2016/07/15/happillee