Do Sign Language Users Understand Tones? The Short Answer Is Yes

By: Cheryl Lu, Social Media Coordinator

A few weeks ago, we came across a discussion on a Chinese social media page about sign language. Like many anecdotes that go viral online, this story has a peculiar opening and a heart-warming ending: A woman ordered food delivery online. When the food arrived, she was contacted by the delivery driver, who identified himself as deaf and communicated with her through texting. Although she understood the driver's difficulties, she felt his tone was impolite. Wondering whether this was normal, she posted the conversation and her concerns online, and her post was soon flooded with comments from people with experience working with the deaf community. They explained that the grammar and sentence structure of sign language differ from those of spoken language, and that conveying tone in written text is hard for people who are deaf. The story was then shared across various platforms to inform the public about sign language and has received thousands of likes.

If people who are deaf don't use tones the way hearing people do, how do they express their emotions? Intrigued by the role of tone in sign languages, we explored further online. An educational video soon answered our questions. Made by Dudu, a Chinese Sign Language (CSL) teacher and vlogger from the deaf community, the video was dedicated to explaining to hearing people the linguistics behind the story above.

According to Dudu, sign languages can convey emotions, but most of the "tones" are carried by facial expressions, while the actual "words" that are signed remain the same. When people who are deaf translate their signing into written language, they often render vocabulary by its literal meaning and omit the facial expressions, which can make their sentences sound unpolished and sometimes blunt. For example, "You don't like it? How about giving it to me?" can be signed in CSL exactly the same as "You don't like it? Give it to me" — the difference in tone and politeness is conveyed entirely by facial expression. For deaf people who are less familiar with written language, it's common to carry the conventions of one into the other.

(“Good for you! (Sincere)” in CSL.)

(“Good for you. (Sarcastic)” in CSL.)

The use of facial expression is also essential for making jokes and being sarcastic. In 2015, researchers in Manitoba looked into how adults who use American Sign Language (ASL) produce and understand sarcasm. As in CSL, sarcasm in ASL is expressed through facial expressions. According to the study, the use and understanding of sarcasm in sign language evolve with age: while children frequently fail to recognize sarcasm, college students sometimes overthink and mistakenly interpret literal language as sarcastic. However, the exact age at which children become able to distinguish between these tones has not yet been determined.

(“Good job! (Sincere)” in ASL.)

(“Good job. (Sarcastic)” in ASL.)

Research on tone in sign language has gone into detail as fine as the use of specific facial features. In the book American Sign Language Tone and Intonation: A Phonetic Analysis of Eyebrow Properties, the author breaks down how eyebrows convey meaning in ASL. The author concludes that the height of an ASL user's eyebrows signals the properties of a sentence: lowered eyebrows indicate an open-ended question, while raised eyebrows usually signal the expectation of a yes/no answer.

As crucial as tone is to human communication, the gap between emotional expression in signed and written languages has yet to be filled. Recently, tone indicators have been introduced for digital communication in text-based online communities. They work similarly to emojis or emoticons, aiming to convey a user's tone and connotations. Though not yet widely used, tone indicators can help clarify text-based conversations and could be useful to the deaf community. But as the researchers in Manitoba noted, there is still a lot of research to be done before all the features of tone and facial expression in sign languages are fully understood.

The best way to understand tone and connotation in sign languages, to date, is still through the work of an experienced professional interpreter. For over 30 years, MCIS has been bridging the deaf and hearing communities through our ASL and LSQ interpretation services. To read about ASL interpretation from one of our ASL interpreters' perspectives, click here: Part 1 | Part 2

To request interpretation services, click HERE.