Scientists have found that robotic vacuum cleaners could allow snoopers to hear household conversations remotely, despite not being equipped with microphones.
US researchers demonstrated a remote eavesdropping attack on a Xiaomi Roborock robot cleaner by remotely accessing its Lidar readings, the sensor data that helps these cleaners avoid bumping into furniture.
Lidar is a method of measuring distances by illuminating a target with laser beams and measuring the reflected light with a sensor.
But Lidar can also capture sound signals by getting reflections from objects in the house, such as a garbage can, which vibrate due to nearby sound sources, such as a person talking.
A hacker could repurpose a vacuum cleaner’s Lidar sensor to sense sounds in the environment, remotely collect the Lidar data from the cloud, and process the raw signal with deep learning techniques to extract audio information.
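As a rough illustration of that signal-processing step, the sketch below turns a stream of raw Lidar reflection-intensity samples into an audio-rate waveform. This is a simplified assumption, not the researchers’ actual pipeline; the function name and sampling rates are hypothetical.

```python
import numpy as np

def lidar_to_audio(intensity, fs_lidar, fs_audio=4000):
    """Convert Lidar reflection-intensity samples into an audio-rate
    waveform (simplified sketch; a real pipeline must also cope with
    the sensor's irregular, rotation-dependent sampling)."""
    # Remove the DC component (the object's steady reflectivity).
    x = intensity - np.mean(intensity)
    # Resample to audio rate by linear interpolation.
    t_old = np.arange(len(x)) / fs_lidar
    t_new = np.arange(0, t_old[-1], 1.0 / fs_audio)
    audio = np.interp(t_new, t_old, x)
    # Normalise to [-1, 1] for playback or feature extraction.
    peak = max(np.max(np.abs(audio)), 1e-12)
    return audio / peak
```

In practice the recovered signal is faint and noisy, which is why the researchers feed it to deep learning models rather than listening to it directly.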
This flaw could reveal the gender of people speaking in a robot vacuum owner’s home, confidential business information from a conference call, or credit card numbers recited during a phone call.
It could also allow hackers to listen to audio from the TV in the same room, “potentially leaking the victim’s political orientation or viewing preferences.”
The researchers used their hacking method on a Xiaomi Roborock robot vacuum cleaner (pictured) and evaluated the dangers of a hack
Experts, who call eavesdropping on private conversations “one of the most common but damaging threats to privacy,” point out that a smart device doesn’t even need a built-in microphone to snoop on private conversations at home.
“We welcome these devices into our homes and think nothing of them,” said Nirupam Roy, assistant professor in the University of Maryland’s Department of Computer Science.
“But we have shown that even if these devices don’t have microphones, we can reuse the systems they use for navigation to spy on conversations and potentially reveal private information.
“This kind of threat may be more important than ever when you consider that we all order food over the phone and have computer meetings, and we often talk about our credit card or bank details.
“But what’s even more troubling to me is that it can reveal a lot more personal information.
“This kind of information can tell you about my lifestyle, how many hours I am working, other things I am doing; and what we watch on TV can reveal our political orientations.
“This is crucial for someone who may want to manipulate political elections or address very specific messages to me.”
Lidar allows vacuum cleaners to build maps of people’s homes, often stored in the cloud.
This can lead to potential privacy breaches that could allow advertisers to access information on things like the size of the home, which suggests the level of income.
This new hacking method involves manipulating vacuum Lidar technology, a remote sensing method involving lasers, which is also used in driverless cars to help them “see”.
Lidar navigation systems in home vacuum robots project a laser beam around a room and perceive the reflection of the laser as it bounces off nearby objects.
Researchers repurposed the laser navigation system on a robot vacuum cleaner (right) to pick up sound vibrations and capture human speech bouncing off objects such as a garbage can placed next to a computer speaker on the floor.
Robot vacuums use reflected signals to map the room and avoid colliding with a dog, a person’s foot, or a dresser as they move around the house.
Laser light’s short wavelength (a few hundred nanometres) allows fine-grained distance measurement, which can be used to detect subtle motion or vibration.
Meanwhile, sound travels through a medium like a mechanical wave and induces small physical vibrations in nearby objects.
The hacking method relies on the same principle as laser microphones, used as spying tools since the 1940s: a laser beam is directed at an object near the sound source, and the vibration it induces in the reflected light is measured to recover the audio.
A laser microphone pointed at a glass window in a closed room can reveal conversations from within the room from over 500 meters away.
In general, sound waves make nearby objects vibrate, these vibrations cause slight variations in the light bouncing off the object, and a receiver can convert those variations back into sound waves.
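That conversion can be illustrated with a toy simulation (all numbers here are invented for the example): a steady reflection is weakly modulated by a 440 Hz vibration, and the vibration frequency is then recovered from the spectrum of the received light signal.

```python
import numpy as np

fs = 8000                      # sample rate of the simulated light sensor, Hz
t = np.arange(0, 1.0, 1 / fs)
tone = 440.0                   # vibration frequency of the reflecting object, Hz

# Reflected light: a constant intensity weakly modulated by the vibration.
light = 1.0 + 0.005 * np.sin(2 * np.pi * tone * t)

# Recover the vibration: remove the DC level, find the dominant spectral peak.
sig = light - light.mean()
freqs = np.fft.rfftfreq(len(sig), 1 / fs)
recovered = freqs[np.argmax(np.abs(np.fft.rfft(sig)))]
print(recovered)  # 440.0
```

A real interception faces far weaker modulation and heavy noise, but the principle of reading sound out of light intensity is the same.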
The figure shows the attack, in which a hacker remotely exploits the Lidar sensor on a victim’s robot vacuum cleaner to capture privacy-sensitive speech (such as credit card numbers) emitted by a computer speaker while the victim participates in a conference call
Experts say a diffuse signal received by the vacuum sensor provides only a fraction of the information needed to recover sound waves.
In the tests, the researchers hacked a robot vacuum to show they could control the position of the laser beam and send the sensed data to their laptops over Wi-Fi without interfering with the device’s navigation.
Subsequently, they conducted experiments with two sound sources.
One source was a human voice reciting numbers played over computer speakers and the other was the audio of a variety of television programs played through a TV soundbar.
Then they captured the laser signal detected by the vacuum navigation system as it bounced off a variety of objects placed close to the sound source.
Items included a kitchen bin, cardboard box, take-away food container, and polypropylene bag – items found on a typical floor.
Using a computer program, the researchers identified and matched the spoken numbers with 90% accuracy.
Deep learning algorithms were able to interpret scattered sound waves, such as the ones above that were captured by a robot vacuum cleaner, to identify spoken numbers and TV audio sequences
They also identified TV shows from one-minute recordings of their audio with the same accuracy.
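The classification step can be sketched with a toy stand-in. The study itself trains deep learning models; the nearest-centroid classifier, the coarse spectral features, and the one-tone-per-digit synthetic data below are all invented for illustration.

```python
import numpy as np

def spectral_features(audio, fs, n_bins=64):
    """Coarsely binned log-magnitude spectrum -- a simple stand-in for
    the spectrogram features a real speech classifier would use."""
    mag = np.abs(np.fft.rfft(audio))
    chunks = np.array_split(mag, n_bins)
    return np.log1p(np.array([c.mean() for c in chunks]))

def nearest_centroid(train_feats, train_labels, query):
    """Toy classifier: return the label whose class centroid lies
    closest to the query (the actual study uses deep networks)."""
    labels = sorted(set(train_labels))
    centroids = {l: np.mean([f for f, y in zip(train_feats, train_labels)
                             if y == l], axis=0) for l in labels}
    return min(labels, key=lambda l: np.linalg.norm(query - centroids[l]))
```

With even this crude setup, clearly distinct sounds separate well in feature space; the deep models in the study achieve around 90 per cent accuracy on real, noisy Lidar-recovered speech.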
The researchers said other hi-tech devices could be subject to similar attacks such as smartphone infrared sensors used for facial recognition or passive infrared sensors used for motion detection.
“I believe this is significant work that will make manufacturers aware of these possibilities and inspire the security and privacy community to find solutions to prevent these types of attacks,” said Professor Roy.
The research, a collaboration with Jun Han at the National University of Singapore, was presented Wednesday at the Association for Computing Machinery’s Conference on Embedded Networked Sensor Systems (SenSys 2020).