Soon, robots could read your mood and actions through your body language

Researchers develop computer code to help robots decode your actions through your gestures

IANS New York
Last Updated : Jul 07 2017 | 7:36 PM IST

Researchers have developed a computer code that could help robots understand body poses and movements, allowing them to perceive what people around them are doing, what moods they are in and whether they can be interrupted.

"We communicate almost as much with the movement of our bodies as we do with our voice," said Yaser Sheikh, Associate Professor of Robotics at Carnegie Mellon University in Pittsburgh, Pennsylvania, US.

"But computers are more or less blind to it," Sheikh said, adding that the new methods for tracking 2D human form and motion open up new ways for people and machines to interact with each other, and for people to use machines to better understand the world around them.

The computer code was developed with the help of the university's Panoptic Studio, a two-storey dome embedded with 500 video cameras.

The insights gained from experiments in that facility now make it possible to detect the pose of a group of people using a single camera and a laptop computer, the researchers said.

To encourage more research and applications, the researchers have released their computer code for both multi-person and hand-pose estimation.

The researchers will present reports on their methods at CVPR 2017, the Computer Vision and Pattern Recognition Conference to be held in Honolulu, Hawaii from July 21-26.

Tracking multiple people in real time, particularly in social situations where they may be in contact with each other, presents a number of challenges.

Simply using programmes that track the pose of an individual does not work well when applied to each individual in a group, particularly when that group gets large.

Sheikh and his colleagues took a bottom-up approach, which first localises all the body parts in a scene -- arms, legs, faces, etc. -- and then associates those parts with particular individuals.
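To make the bottom-up idea concrete, the toy Python sketch below assumes body parts have already been detected in a frame and then greedily groups them into people by proximity. It is only an illustration of the general approach described in the article, not the researchers' released code; the names (Part, group_parts_into_people, max_link_dist) and the simple distance rule are hypothetical stand-ins for what a real system would learn.

# Minimal sketch of bottom-up pose grouping: localise parts first,
# then associate parts with individual people. Toy example only.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Part:
    kind: str                      # e.g. "head", "left_wrist"
    xy: Tuple[float, float]        # pixel coordinates
    score: float                   # detector confidence

def group_parts_into_people(parts: List[Part],
                            max_link_dist: float = 200.0) -> List[Dict[str, Part]]:
    """Greedily assign detected parts to people.

    Real systems learn how likely two parts are to belong to the same
    person; here we simply attach each part to the nearest partially
    built person that still lacks that part type, or start a new person.
    """
    people: List[Dict[str, Part]] = []
    # Process high-confidence detections first.
    for part in sorted(parts, key=lambda p: -p.score):
        best, best_dist = None, max_link_dist
        for person in people:
            if part.kind in person:
                continue  # this person already has, say, a head
            # Distance from the part to the person's existing parts (centroid).
            cx = sum(p.xy[0] for p in person.values()) / len(person)
            cy = sum(p.xy[1] for p in person.values()) / len(person)
            d = ((part.xy[0] - cx) ** 2 + (part.xy[1] - cy) ** 2) ** 0.5
            if d < best_dist:
                best, best_dist = person, d
        if best is None:
            people.append({part.kind: part})
        else:
            best[part.kind] = part
    return people

# Example: two heads and two wrists detected in one frame.
detections = [
    Part("head", (100, 50), 0.9), Part("head", (400, 60), 0.85),
    Part("left_wrist", (110, 200), 0.8), Part("left_wrist", (390, 210), 0.75),
]
for i, person in enumerate(group_parts_into_people(detections)):
    print(f"person {i}: {sorted(person)}")

Run on the example frame above, the sketch prints two people, each with a head and a left wrist, showing how part detections are assembled into individuals rather than each individual being tracked separately.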

This method helped them build the computer programme that the researchers believe may have several other applications.

The ability to recognise hand poses, for instance, will make it possible for people to interact with computers in new and more natural ways, such as communicating with computers simply by pointing at things.

It could also help a self-driving car get an early warning that a pedestrian is about to step into the street by monitoring body language.

--IANS


