News

New tech translates American Sign Language gestures into text in real time using deep learning and hand tracking.
American Sign Language (ASL) recognition systems often struggle with accuracy due to similar gestures, poor image quality, and inconsistent lighting. To address this, researchers developed a system ...
The resulting annotations played a critical role in enhancing the precision of YOLOv8, the deep learning model the researchers trained, allowing it to better detect subtle differences in hand gestures.
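As an illustration only (the article does not describe the researchers' actual pipeline), YOLO-family models like YOLOv8 are typically trained on annotations that store each bounding box as normalized center coordinates and size. A minimal sketch of converting a pixel-space hand box to that label format, with a hypothetical class id and box:

```python
def to_yolo_label(cls_id, box, img_w, img_h):
    """Convert a pixel-space box (x_min, y_min, x_max, y_max) into a
    YOLO-format label line: "class x_center y_center width height",
    with all coordinates normalized to [0, 1]."""
    x_min, y_min, x_max, y_max = box
    xc = (x_min + x_max) / 2 / img_w   # normalized box center (x)
    yc = (y_min + y_max) / 2 / img_h   # normalized box center (y)
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{cls_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Hypothetical example: a hand sign occupying a 160x200-pixel region
# of a 640x480 frame, labeled with class id 3.
print(to_yolo_label(3, (240, 140, 400, 340), 640, 480))
# → 3 0.500000 0.500000 0.250000 0.416667
```

Dense, precisely placed boxes of this kind are what let a detector learn the small positional and shape differences that distinguish similar signs.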
That data could also enable Nvidia to develop new ASL-related products down the road — for example, to improve sign recognition in video-conferencing software or gesture control in cars.
On Thursday, Nvidia launched "Signs," an AI-powered language learning platform for American Sign Language learners.