Priyanjali Gupta, an engineering student, doesn't have an elaborate origin story for the AI model that translates American Sign Language (ASL) into English in real time.
Her mum was the real driving force: she urged her daughter to do something worthwhile now that she was studying engineering, a sentiment echoed by many Indian mothers. Gupta, a third-year computer science student at the Vellore Institute of Technology in Tamil Nadu, specializes in data science.
Fast forward to February 2022: Gupta's AI model, built on the TensorFlow Object Detection API, uses transfer learning from a pre-trained model called ssd_mobilenet. Her LinkedIn post about it went viral, drawing more than 58,000 reactions and 1,000 people applauding an idea that bridges a gap in inclusive technology.
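To make the approach concrete, here is a minimal sketch of how detections from a TensorFlow Object Detection API model are typically post-processed. The label map (a small ASL vocabulary) and the `filter_detections` helper are illustrative assumptions, not Gupta's actual code; the commented `tensorflow_hub` lines show how a pre-trained SSD MobileNet detector would normally be loaded.

```python
import numpy as np

# The TF Object Detection API returns detections as parallel arrays:
# boxes (N, 4) in normalized [ymin, xmin, ymax, xmax] coordinates,
# class ids (N,), and confidence scores (N,).
# Hypothetical label map for a small sign vocabulary (an assumption --
# the classes Gupta actually trained on are not specified here).
LABELS = {1: "hello", 2: "thanks", 3: "yes", 4: "no", 5: "iloveyou"}

def filter_detections(boxes, classes, scores, threshold=0.5):
    """Keep detections scoring above the threshold and map class ids
    to human-readable sign labels."""
    keep = scores > threshold
    return [
        (LABELS.get(int(c), "unknown"), float(s), b.tolist())
        for b, c, s in zip(boxes[keep], classes[keep], scores[keep])
    ]

# In a full pipeline (sketch, assuming a TF Hub checkpoint) one would load
# a pre-trained SSD MobileNet model and run it on webcam frames, e.g.:
#
#   import tensorflow_hub as hub
#   detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")
#   result = detector(input_tensor)  # dict of boxes / classes / scores
#
# Transfer learning means starting from such a checkpoint and fine-tuning
# it on a small labelled dataset of sign images, rather than training a
# detector from scratch.

# Example with synthetic detections: only the first box clears the threshold.
boxes = np.array([[0.1, 0.1, 0.4, 0.4], [0.5, 0.5, 0.9, 0.9]])
classes = np.array([1, 3])
scores = np.array([0.92, 0.31])
print(filter_detections(boxes, classes, scores))
```

Thresholding like this is what turns raw detector output into a single readable label per frame, which is all a real-time sign translator needs to display.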
Gupta attributes her model inspiration to Nicholas Renotte, a data scientist who created the video Real-Time Sign Language Detection.
Gupta admits that building a deep learning model for sign detection from scratch is no easy task.
ASL ranks third among languages used in America, behind English and Spanish, yet technology and applications for translating it into other languages lag far behind. The Zoom boom brought on by the pandemic has pushed sign language into the spotlight.
Google AI researchers have presented a real-time sign language detection model that can identify people signing with up to 91 percent accuracy.