Smart Device Translates American Sign Language To English

By Kim Bussing on October 30, 2015


American Sign Language (ASL) has been one of the primary means of communication for the deaf in the United States and many parts of Canada since the 1800s. It is estimated that between 500,000 and 2 million people use the language on a regular basis.

But popular as it is, there are millions of people with normal hearing who do not understand the language. And while most deaf Americans learn to speak with the help of speech therapy, many find it easier to communicate through sign language. Now a team of researchers at Texas A&M University has created wearable technology that will make it easy for ASL and non-ASL users to converse.

Photo Credit: Texas A&M University

The smart device is the brainchild of a team led by Biomedical Engineering Associate Professor Roozbeh Jafari. It uses two separate sensors to translate intricate ASL gestures into simple English. The first, fitted with an accelerometer and gyroscope, keeps track of the significant movements of the user's hand and arm as he or she tries to communicate.

The second sensor helps distinguish the smaller movements that follow the larger ones. Called an electromyographic sensor (sEMG), it can recognize various hand and finger movements based on muscle activity. The two sensors working in tandem provide an accurate interpretation of the gesture.


For example, when an ASL user gestures the words "please" or "sorry," the first sensor picks up the hand drawing circles at the chest, while the second ascertains whether the hand is open ("please") or closed into a fist ("sorry").
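The "please"/"sorry" example above can be sketched in a few lines of Python. This is purely illustrative: the feature values, the tiny template set, and the nearest-neighbor matching are invented for the example and are not the Texas A&M team's actual algorithm, which has not been published in detail.

```python
# Hypothetical sketch: combine a motion feature (from the accelerometer/
# gyroscope) with a muscle-activity feature (from the sEMG sensor) and
# pick the sign whose stored template is closest. All numbers are invented.
import math

# Templates the device might have learned during "training":
# each sign maps to a (motion_feature, muscle_feature) pair.
TEMPLATES = {
    "please": (0.8, 0.2),   # circular motion, open hand (low muscle tension)
    "sorry":  (0.8, 0.9),   # same motion, closed fist (high muscle tension)
}

def classify(motion_feature: float, muscle_feature: float) -> str:
    """Return the sign whose template is nearest to the measured features."""
    def dist(sign: str) -> float:
        m, e = TEMPLATES[sign]
        return math.hypot(m - motion_feature, e - muscle_feature)
    return min(TEMPLATES, key=dist)

print(classify(0.78, 0.15))  # -> please
print(classify(0.82, 0.95))  # -> sorry
```

Note how the motion feature alone cannot tell the two signs apart (both are 0.8); only the muscle feature from the second sensor breaks the tie, which is why the device needs both sensors working in tandem.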

Once the device, which is worn on the user's right wrist, has captured the gesture, it transmits the appropriate signals to a laptop via Bluetooth. A complex algorithm translates them into English and displays the word on the computer screen.

Jafari, who unveiled the prototype at the Institute of Electrical and Electronics Engineers (IEEE) 12th Annual Body Sensor Networks Conference this past June, says there is still some work to be done before the technology can be used in the real world.

For one, it currently recognizes just 40 primary ASL signs, which means it has thousands more to learn. Also, the smart device translates only one word at a time, making ordinary conversations painfully slow.

The research team also realizes that not all communication takes place around a laptop. They hope to eliminate the need for one by incorporating a computer into the wearable. The computer will then send the translation to a smart device, allowing two people "speaking" different languages to have a coherent conversation.

Photo Credit: Texas A&M University

Additionally, each device has to be custom programmed, which means that the individual has to "train" the wearable by repeating every ASL sign a few times. This is a time-consuming process and can only get worse as the translator's vocabulary expands. Jafari hopes to reduce the learning time or eliminate this requirement altogether in the product's next release. Despite the numerous challenges, the researcher is not worried. After all, it took his two graduate students just a few weeks to come up with the first impressive prototype.
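The per-user "training" step described above might work something like the following sketch: the wearer repeats each sign a few times, and the device averages the readings into a template it can match against later. The function name, data layout, and numbers are all invented for illustration.

```python
# Hypothetical sketch of per-user calibration: average repeated
# (motion_feature, muscle_feature) readings into one template per sign.
from statistics import mean

def train(sign_samples: dict[str, list[tuple[float, float]]]) -> dict:
    """Build one (motion, muscle) template per sign from repeated samples."""
    templates = {}
    for sign, samples in sign_samples.items():
        motions = [m for m, _ in samples]
        muscles = [e for _, e in samples]
        templates[sign] = (mean(motions), mean(muscles))
    return templates

# The user repeats "please" three times; readings vary slightly each time.
templates = train({"please": [(0.79, 0.18), (0.81, 0.22), (0.80, 0.20)]})
print(templates["please"])  # roughly (0.8, 0.2)
```

This also makes clear why training gets worse as the vocabulary grows: every new sign the translator learns means several more repetitions for the user to perform.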

Photo Credit: Motion Savvy/Indiegogo

The Texas team is not the only one working on making conversation between ASL and non-ASL users easier. MotionSavvy has a product that uses a smart device camera to translate gestures into speech. In China, researchers have created a motion sensing device that translates Chinese Sign Language into both spoken and written words. With so many brilliant minds focused on finding a solution, communication difficulties experienced by ASL users may soon be a thing of the past!


