With new AI technology, robots can detect human touch by analyzing shadows

Cornell University scientists have developed a way for robots to identify physical interactions by analyzing only the shadow of a user.

Their ShadowSense system uses an off-the-shelf USB camera to capture the shadows that hand gestures cast on a robot's skin. Classification algorithms then analyze the movements to infer the user's specific interaction.

Lead author Guy Hoffman said the method offers a natural way to communicate with robots without relying on large, expensive sensor setups:

Touch is such an important means of communication for most organisms, but it was virtually absent in human-robot interaction. One of the reasons is that touching the body previously required a large number of sensors and was therefore not practical to implement. This research offers a cheap alternative.

The researchers tested the system on an inflatable robot with a camera under the skin.


They trained and tested the classification algorithms on shadow images of six gestures: palm touch, punch, two-hand touch, embrace, point, and no touch.
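To make the idea concrete, here is a minimal sketch of shadow-based gesture classification. It is purely illustrative: the gesture labels come from the article, but the feature extraction (shadow area and centroid), the nearest-centroid matching, and all function names are assumptions, not the researchers' actual learned classifiers.

```python
# Illustrative sketch of ShadowSense-style classification.
# Assumed: frames arrive as 2D lists of grayscale values in [0, 1];
# dark pixels are treated as shadow. The real system uses trained
# classifiers on camera images, not this hand-rolled heuristic.

GESTURES = ["palm touch", "punch", "two-hand touch",
            "embrace", "point", "no touch"]

def shadow_features(frame, threshold=0.5):
    """Binarize a grayscale frame and return simple shadow
    descriptors: area fraction plus normalized centroid row/col."""
    rows, cols = len(frame), len(frame[0])
    shadow = [(r, c) for r in range(rows) for c in range(cols)
              if frame[r][c] < threshold]  # dark pixels = shadow
    if not shadow:
        return (0.0, 0.0, 0.0)
    area = len(shadow) / (rows * cols)
    cr = sum(r for r, _ in shadow) / len(shadow) / rows
    cc = sum(c for _, c in shadow) / len(shadow) / cols
    return (area, cr, cc)

def classify(frame, prototypes):
    """Match the frame's features against per-gesture prototype
    feature tuples (dict: gesture -> features) by squared distance."""
    f = shadow_features(frame)
    return min(prototypes,
               key=lambda g: sum((a - b) ** 2
                                 for a, b in zip(f, prototypes[g])))
```

In practice the prototypes would be replaced by a trained model, but the pipeline shape is the same: threshold the image, extract shadow features, classify.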

The system distinguished between the gestures with 87.5–96% accuracy, depending on the lighting.

The system was best in daylight (96%), followed by dusk (93%) and night (87%).
Credit: Hu et al.