Experimental Camera

View the p5.js sketch

Description

This camera helps hearing people communicate with people who are deaf or mute. The hand gestures shown on the screen are hints that teach users sign language. By selecting an everyday sentence in the box, the hand gestures change to match the words, so users can follow the gestures on screen to communicate with deaf or mute people through video chat.

Design Process

My inspiration comes from my experience visiting a children's welfare home; recently I got back in touch with a deaf girl I met there. I found it hard to communicate with her through such a small screen on video chat, since I knew nothing about sign language.

First of all, I did some research on sign languages and found that they differ from country to country, so I chose the sign language used in China. When people learn a new language, they usually start with simple everyday expressions, so I began by writing down the most frequently used daily sentences and phrases, such as "hello" and "sorry". Then I drew the gestures that match each phrase. After uploading the images to p5.js, I created a small selection box at the bottom of my camera for users to choose the words they need.
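The camera-plus-selection-box setup described above can be sketched in p5.js roughly as follows. This is a minimal illustration, not the project's actual code: the phrase list and the image filenames (e.g. "hello.png") are placeholder assumptions, and the layout values are arbitrary.

```javascript
// Hypothetical everyday phrases — placeholders for the project's real list.
const PHRASES = ["hello", "thank you", "sorry", "goodbye"];

let video;              // live webcam feed
let gestureImages = {}; // phrase -> p5.Image of the drawn gesture
let selector;           // dropdown (the "selection box") under the camera

// Map a phrase to an assumed image filename, e.g. "thank you" -> "thank_you.png".
function gestureFile(phrase) {
  return phrase.replace(/\s+/g, "_") + ".png";
}

function preload() {
  // Load one gesture drawing per phrase before the sketch starts.
  for (const p of PHRASES) {
    gestureImages[p] = loadImage(gestureFile(p));
  }
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.hide(); // draw the feed onto the canvas ourselves

  // Selection box positioned at the bottom of the camera view.
  selector = createSelect();
  selector.position(10, height - 30);
  for (const p of PHRASES) selector.option(p);
}

function draw() {
  image(video, 0, 0, width, height); // camera as the background

  // Overlay the gesture drawing for the currently selected phrase,
  // so the user can mirror it while on a video chat.
  const img = gestureImages[selector.value()];
  if (img) image(img, width - 170, 10, 160, 160);
}
```

The key design point is that `preload()` fetches every gesture image up front, so switching phrases in the dropdown swaps the overlay instantly instead of waiting for a load.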

Reflection

Although I only chose the sign language in Chinese for the sketch, I think about the necessity of different sign languages for people from different areas over the world. The camera could be more interactive and useful if the camera can detect the movements of users. In this way, the camera can estimate if the users do the right gestures and show the following gestures to make longer sentences. But I haven’t figure out how to track and identify movements, especially hand gestures in p5js. I will keep exploring in this area.