Getting Eyebrows Involved in ASL Translation
The Y Report

Eyes on Accessibility

A BYU graduate researcher is improving ASL translation—by focusing on eyebrows.

Photos by Bradley Slade

American Sign Language is all about the hands, right? Actually, ASL signers form ideas and concepts using the whole body—including facial expressions. Nathan O. Browne (BA ’22, MA ’25), a computer science PhD student, is developing AI models that can “read” body movements to better decipher sign language and hopefully improve ASL translation. He’s starting with eyebrows, which provide important grammatical markers.

Browne saw potential for improved translation while taking ASL classes for his linguistics master’s program at BYU. A major obstacle for ASL translation is the lack of available data for the models to train on. The largest dataset for ASL has only 98,000 sentences. (For comparison, similar datasets in Spanish, Portuguese, French, and Chinese have hundreds of millions of sentences.)

“I wanted to build something that could level the playing field,” says Browne, who works in BYU’s MATRIX Lab, which aims to improve machine translation.

Browne’s research breaks down ASL sentences into separate physical components—such as eyebrow movement—rather than analyzing all of the body’s movement as a whole. His results show that his AI model can distinguish a yes/no question from an open-ended question more accurately than models that track the whole body at once. It’s a model, he says, “that can do more with less data.”
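The component-based idea can be illustrated with a toy sketch—this is not Browne’s actual model, just a hypothetical rule of thumb built on a real fact of ASL grammar: raised eyebrows typically mark yes/no questions, while furrowed (lowered) eyebrows mark wh- (open-ended) questions. The feature here, per-frame eyebrow height relative to a neutral baseline, is an assumption for illustration only:

```python
def classify_question(eyebrow_deltas, threshold=0.2):
    """Guess an ASL sentence's question type from eyebrow movement alone.

    eyebrow_deltas: hypothetical per-frame eyebrow heights relative to a
    neutral baseline (positive = raised, negative = furrowed).
    Returns 'yes/no', 'wh', or 'statement'.
    """
    if not eyebrow_deltas:
        return "statement"
    # Average the eyebrow signal over the sentence.
    mean_delta = sum(eyebrow_deltas) / len(eyebrow_deltas)
    if mean_delta > threshold:
        return "yes/no"      # raised brows: yes/no question marker
    if mean_delta < -threshold:
        return "wh"          # furrowed brows: wh-question marker
    return "statement"

# Example: a sustained brow raise reads as a yes/no question.
print(classify_question([0.5, 0.6, 0.4]))  # → yes/no
```

A trained model replaces this hand-written threshold with parameters learned from pose data, but the principle is the same: one well-chosen component can carry grammatical signal on its own, which is why it needs far less training data than whole-body tracking.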

The research “[leverages] existing data to show what’s possible but also highlights the gap . . . to motivate [researchers] to collect more data,” says computer science professor Eric K. Ringger (BS ’92), Browne’s faculty mentor. “It’s great to work with Nathan on this project. He brings a lot of great talent and insight and passion.”

Browne has long been drawn to improving accessibility. He was helped by BYU’s accessibility center after he was diagnosed with narcolepsy. Over the years he’s felt the guidance of the Lord pushing him toward the field of machine translation.

Browne has felt that help again during work on his dissertation. “It was inspiration,” he says. “There would be nights where I’d wake up in the middle of the night, have an idea, write it down, and then it’s turned into this.”

With this success, Browne’s goal is to train AI models to read the other sign components, “all of them working together to actually do translation, like we see with Google Translate.” In so doing he is “making the world a more accessible place for Deaf and hard of hearing people.”
