Signed languages
Source: Aaron Aupperle
Advancements in natural language processing (NLP) enable computers to understand what humans say and help people communicate through tools like machine translation, voice-controlled assistants and chatbots.
But NLP research often focuses only on spoken languages, excluding the more than 200 signed languages used around the world and the roughly 70 million people who may rely on them to communicate.
Kayo Yin, a master's student in Carnegie Mellon University's Language Technologies Institute, wants that to change. Yin co-authored a paper calling for NLP research to include signed languages.
"Signed languages, even though they are a significant part of the languages used in the world, aren't included," Yin said. "There is a demand and an importance in having technology that can handle signed languages."
The paper, "Including Signed Languages in Natural Language Processing," won the Best Theme Paper award at this month's 59th Annual Meeting of the Association for Computational Linguistics. Yin's co-authors included Amit Moryossef of Bar-Ilan University in Israel; Julie Hochgesang of Gallaudet University; Yoav Goldberg of Bar-Ilan University and the Allen Institute for AI; and Malihe Alikhani of the University of Pittsburgh's School of Computing and Information.
The authors wrote that communities relying on signed languages have fought for decades for the right to learn and use those languages, and for them to be recognized as legitimate.
"However, in a predominantly oral society, deaf people are constantly encouraged to use spoken languages through lipreading or text-based communication," the authors wrote. "The exclusion of signed languages from modern language technologies further suppresses signing in favor of spoken languages."
Yin first became interested in sign language through outreach work at a homeless shelter while she was an undergraduate at École Polytechnique in Paris. There, she met a deaf woman and saw how difficult it was for her to establish social connections with others. Yin began learning French Sign Language and pursued sign language translation as part of her undergraduate research.
Once at the LTI, she noticed that almost all NLP research addressed only spoken languages. Computer vision research sought to understand signed languages but often overlooked the linguistic properties they share with spoken languages.
Signed languages use hand gestures, facial expressions, and head and body movements, and they can convey multiple words at once. For example, a signer could sign "I am happy" while shaking their head to indicate that they are not happy. Signed languages also employ shortcuts similar to pronouns in spoken languages. Natural language processing tools are better equipped than computer vision methods alone to handle these kinds of complexities.
"We need researchers in both fields to work hand in hand," Yin said. "We can't fully understand signed language if we only look at the visuals."
Hochgesang, a deaf linguist who studies signed languages, said that when she was earning her degree, signed languages were barely mentioned in the literature, in her linguistics classes or in research fields like NLP. Language was speech; other ways of expressing language were ignored.
"On a personal scale, this hurt. It completely ignored my way of being," Hochgesang said. "When I was a student, I didn't see myself in the data being described and that made it really hard for me to connect. That it still hasn't improved much these days is unfortunate. The only way this kind of thing will change is if we are included more."
Yin said the paper was well received by both natural language processing researchers and people studying and using signed languages -- the two groups she sought to bring together.
"It's really exciting to see a paper the I wrote motivate people, and I hope can make a change in these communities," Yin said.