Kinesics: Designing charades

In this series of blog posts, we take you through the creative process behind Kinesics, an artistic research project that explores the future role of body language in Virtual Reality. This is the sixth post in a series of ten.


As we pointed out in our previous post, we decided to divide our workload over three prototypes in order to experiment in three different directions. One of the three prototypes is an evolution of the popular game Charades. But how to evolve it and translate it into a virtual world?


Decisions made
The core thing we tested with Charades is how people communicate emotions to each other non-verbally. We tested this at A.maze 2019. Emotions touch upon the essence of this project: we make sense of the world with our body, and often unconsciously we translate the sensations and emotions we feel into body language. Of course, portraying emotions through body language in VR has consequences and limitations, such as the absence of facial expressions, lower-body tracking, and finger tracking. However, these limitations made playing the game more fun: they push players to really think about how to translate an emotion into body language, which is an enjoyable mental and physical exercise. The emotions we chose for the game ranged from easy (happy and angry) to complex (pensive and cheeky). For the instructions, we experimented with both text (the word ‘happy’) and emoticons (a happy smiley).
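To make this concrete, here is a minimal sketch of how such an emotion-prompt deck could be structured, with a difficulty range and a switch between text and emoticon instructions. All names and values are hypothetical illustrations, not our actual implementation.

```python
import random
from dataclasses import dataclass
from enum import Enum

class InstructionMode(Enum):
    TEXT = "text"          # show the word, e.g. 'happy'
    EMOTICON = "emoticon"  # show a smiley instead

@dataclass
class EmotionPrompt:
    word: str
    emoticon: str
    difficulty: int  # 1 = easy ... 3 = complex

# Hypothetical deck mirroring the easy-to-complex range described above
DECK = [
    EmotionPrompt("happy", ":)", 1),
    EmotionPrompt("angry", ">:(", 1),
    EmotionPrompt("pensive", ":|", 3),
    EmotionPrompt("cheeky", ";)", 3),
]

def draw_prompt(max_difficulty: int) -> EmotionPrompt:
    """Pick a random prompt no harder than the current difficulty cap."""
    candidates = [p for p in DECK if p.difficulty <= max_difficulty]
    return random.choice(candidates)

def render_instruction(prompt: EmotionPrompt, mode: InstructionMode) -> str:
    """Show the acting player either the word or the emoticon."""
    return prompt.word if mode is InstructionMode.TEXT else prompt.emoticon
```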

Decisions made
The game switched roles after each turn, so each player experienced the task from both perspectives: acting and guessing. As the game advanced, the spatial perspective changed for both players: they would see each other sideways, upside down, or even on opposite sides of a glass ceiling. This was mainly done to convey a sense of progression and surprise on top of the core acting-and-guessing gameplay. We hoped it would also add a small increase in difficulty, but it turns out people are surprisingly good at interpreting another person’s upside-down body.
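As a small illustration, here is a sketch of what such a turn loop could look like: roles swap every turn while the shared spatial perspective advances each round. The perspective names and their ordering are hypothetical stand-ins, not the prototype’s actual code.

```python
# Hypothetical sequence of spatial perspectives, loosely following the post:
# players start facing each other, then see each other sideways,
# upside down, and finally through a glass ceiling.
PERSPECTIVES = ["facing", "sideways", "upside_down", "glass_ceiling"]

def run_game(player_a: str, player_b: str) -> None:
    """Swap actor/guesser every turn; change the shared perspective each round."""
    for turn, perspective in enumerate(PERSPECTIVES):
        actor, guesser = (player_a, player_b) if turn % 2 == 0 else (player_b, player_a)
        # In the prototype, this is where the environment would be reoriented
        # and a new emotion prompt shown to the acting player.
        print(f"Turn {turn + 1}: {actor} acts, {guesser} guesses ({perspective})")

run_game("Ann", "Ben")
```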

Lessons learnt
Players from different cultural backgrounds use different body language. Playtesting repeatedly showed that people who speak different languages have trouble understanding each other’s body language. When two players with different cultural backgrounds (in this case French and Russian) played the game, they visibly had more difficulty understanding each other; it took them about twice as long to complete the game. They explained this by the fact that some gestures for emotions do not mean the same thing in other cultures. This suggests that the way we use our body for communication is culturally driven. It would be anthropologically interesting to experiment with these cultural differences in body language in another game. It might even lead to the emergence of new cross-cultural body language, if people from different cultures were to adopt each other’s ways of using the body for communication.

Lessons learnt
People shared their own body language when they knew each other. We saw that couples and close friends were able to communicate far more quickly and effectively with each other than strangers. Friends and family members often had a shared understanding of their body language that outsiders couldn’t follow.

Lessons learnt
Facial expressions are particularly tightly bound to emotions. Even though players can’t see each other’s faces, because facial expression is not tracked in VR, almost all players instinctively kept using exaggerated facial expressions during their ‘acting’ turn. It seems emotion and facial expression are so intertwined for us that we cannot help but combine them.

Lessons learnt
Incorporating emoticons distracted the players from using their body language. This was the most interesting thing that did not work well in the prototype. We experimented with using emoticons instead of communicating the emotions as text. The result was that players tried to visually mimic the emotion with their face instead of trying to find their own embodied expression of it.

Lessons learnt
The open virtual environment invited spontaneous play and personal expression. This was illustrated by couples hugging each other in VR while standing five feet apart in real life. Also, when communicating ‘right’ or ‘wrong’, players instinctively did so through body language instead of using the buttons built into the game design. As a result, we replaced the mechanical ‘right’ or ‘wrong’ answers with a more physical action: choosing a ball and handing it to the other player. This allowed players to put more expression and nuance into their body language; a frustrated and angry expression is so much funnier after your partner has repeatedly failed to grasp your emotion.

Lessons learnt
Allow people to become accustomed to their virtual body. Players seemed to need space and time to settle into VR and the avatar they were using. This led us to add a lobby room at the beginning of the game, so that getting to know your virtual body was not part of playing the game itself.


In our next post we’ll talk about the design of our second prototype, ‘Switched Hands’. This research was a collaboration with ImproVive and funded by Creative Industries NL.