
Typealike: From Gestures to Commands

Abhinav Raj, Writer
@uxconnections

VR is changing the way we interact with technology every day. Researchers at the University of Waterloo are developing a method to extract commands from hand gestures.

Where words can’t reach, gestures can. 

As humans, we instinctively rely on forms of nonverbal expression—like gestures. 

Whether hailing a taxi, pointing to an object, raising our hands in a meeting, or simply waving them at a concert, we rely extensively on gestures to express ourselves. In fact, entire sign languages, built on gestures, have been developed to serve the deaf and hard of hearing.

Gestures are slowly finding their way into human-computer interactions as well. 

A team of researchers at the University of Waterloo’s Cheriton School of Computer Science has designed a prototype that can extract commands from simple hand gestures.

Dubbed “Typealike”, the prototype works with a standard laptop or desktop webcam alongside an affixed mirror. It reads and recognizes simple hand gestures, which it interprets as commands.

Different gestures map to different commands. For example, placing a hand beside the keyboard with the thumb pointing upwards is recognized as a command to raise the media volume.

VR has moved into the gaming, medical, and defence industries. (Image: Unsplash)

“It [Typealike] started with a simple idea about new ways to use a webcam,” commented Nalin Chhibber, a recent graduate of the university and a Software Development Engineer at Amazon Toronto.

“The webcam is pointed at your face, but the most interaction happening on a computer is around your hands. So we thought, what could we do if the webcam could pick up hand gestures?”

The prototype is built on a neural network model trained on gesture datasets. As with most machine-learning models, the richer the training data, the more accurately the program performs its tasks.
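To make the gesture-to-command idea concrete, here is a minimal illustrative sketch, not the Typealike implementation. The gesture names, feature vectors, and command labels are all hypothetical placeholders, and a toy nearest-prototype classifier stands in for the paper's trained neural network.

```python
import math

# Toy "training set": one prototype feature vector per gesture class,
# e.g. normalized hand-landmark positions extracted from webcam frames.
# (All values here are made up for illustration.)
GESTURE_PROTOTYPES = {
    "thumb_up":   [0.1, 0.9, 0.2],
    "thumb_down": [0.1, 0.1, 0.2],
    "flat_palm":  [0.5, 0.5, 0.9],
}

# Hypothetical mapping from recognized gesture to system command.
COMMANDS = {
    "thumb_up": "volume_up",
    "thumb_down": "volume_down",
    "flat_palm": "pause_media",
}

def classify(features):
    """Return the gesture whose prototype is closest to the observed
    feature vector (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_PROTOTYPES, key=lambda g: dist(GESTURE_PROTOTYPES[g], features))

def gesture_to_command(features):
    """Full pipeline step: observed features -> gesture label -> command."""
    return COMMANDS[classify(features)]

# A feature vector near the "thumb_up" prototype maps to "volume_up".
print(gesture_to_command([0.12, 0.85, 0.25]))  # -> volume_up
```

A real system would replace the hand-coded prototypes with a neural network trained on many labeled webcam frames, which is why richer datasets yield more precise recognition.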

The researchers postulate that more advanced iterations of the Typealike prototype might replace handheld controllers altogether. This could be a revolutionary juncture for user experience and user interface design.

As human-computer interaction takes more forms, UX design is gaining new relevance. From buttons to screens and, soon enough, the 3D space around devices, the idea of what defines an ‘interface’ is in constant flux.

Backed by AI, VR may well be the next big thing in the consumer tech revolution.

It may prove more than just a tool in shaping our world.
