Here's how to see people through walls
Using radio waves and artificial intelligence, MIT researchers have created a system that can visualize people on the other side of a wall: it renders skeleton-like stick figures of people and shows them moving in real time as they do normal activities, like walking or sitting down. The system focuses on key points of the body, including joints like the elbows, hips, and feet. When a person takes a step, whether occluded by a wall or not, "you see that skeleton, or stick figure, that you created, takes a step with it," says Dina Katabi, the MIT professor who leads the project.
The radio signal they use is similar to Wi-Fi, but substantially less powerful.
The system works because those radio waves can penetrate objects like a wall, then bounce off a human body (which is mostly water, no friend to radio-wave penetration) and travel back through the wall to the device. But how do you interpret the reflected signal? That's where the AI comes into play, specifically a machine learning tool called a neural network.
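The interpretation step can be pictured as a function that takes one frame of reflected radio measurements and outputs the positions of body keypoints, the raw material for a stick figure. The sketch below shows that idea in PyTorch; it is not the MIT team's actual network, and the architecture, the 64x64 input shape, and the count of 14 keypoints are all illustrative assumptions.

    # Minimal sketch: map a reflected-RF "frame" to body keypoints.
    # Not the researchers' model; shapes and names are assumptions.
    import torch
    import torch.nn as nn

    NUM_KEYPOINTS = 14  # assumed: elbows, hips, feet, etc.

    class RFPoseSketch(nn.Module):
        def __init__(self):
            super().__init__()
            # Treat the reflected radio measurements as a 2D "heatmap"
            # (for example, angle by distance) and process it like an image.
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            # Predict an (x, y) coordinate for each keypoint.
            self.head = nn.Linear(32, NUM_KEYPOINTS * 2)

        def forward(self, rf_frame):
            features = self.encoder(rf_frame).flatten(1)
            return self.head(features).view(-1, NUM_KEYPOINTS, 2)

    model = RFPoseSketch()
    rf_frame = torch.randn(1, 1, 64, 64)   # one simulated reflection frame
    keypoints = model(rf_frame)            # tensor of shape (1, 14, 2)
    print(keypoints.shape)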
Artificial intelligence researchers train a neural network, which deduces its own rules from data in order to learn, by feeding it annotated information, a process called supervised learning. Neural networks are commonly used to interpret images, but they can also carry out complex tasks like translating from one language to another, or even generating new text by imitating the data they are given.
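As a toy illustration of supervised learning, the snippet below trains a small network on randomly generated, pre-labeled examples. The point is the loop itself: the network's predictions are repeatedly compared against the provided annotations, and its weights are adjusted to close the gap. The data and network here are placeholders, not anything from the MIT study.

    # Toy supervised learning: learn from annotated (input, label) pairs.
    # Random data, purely for demonstration.
    import torch
    import torch.nn as nn

    inputs = torch.randn(100, 8)            # 100 annotated examples
    labels = torch.randint(0, 2, (100,))    # each labeled 0 or 1

    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(20):
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)  # compare predictions to labels
        loss.backward()                        # work out how to adjust the weights
        optimizer.step()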
To train the system, the researchers paired it with a camera, then labeled the images the camera captured so the neural network could correlate the radio reflections with the corresponding body movements. With the help of a second neural network, the system could study examples of people walking and then, in new instances involving the same people, identify individuals with an accuracy of more than 83 percent, even through walls.
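The cross-modal setup can be sketched as follows: a camera pipeline, standing in for whatever vision system produced the labels, supplies keypoint coordinates for each moment in time, and a second model learns to predict those same coordinates from the synchronized radio frames alone. Everything below, including the shapes, the camera_keypoints stand-in, and the simple regression loss, is an assumption made for illustration rather than a description of the researchers' implementation.

    # Sketch of cross-modal training: camera-derived keypoints act as
    # labels for a model that sees only the radio reflections.
    import torch
    import torch.nn as nn

    NUM_KEYPOINTS = 14

    rf_model = nn.Sequential(                  # maps an RF frame to keypoints
        nn.Flatten(),
        nn.Linear(64 * 64, 128), nn.ReLU(),
        nn.Linear(128, NUM_KEYPOINTS * 2),
    )
    optimizer = torch.optim.Adam(rf_model.parameters(), lr=1e-3)

    def camera_keypoints(batch_size):
        # Hypothetical stand-in for a vision model that labels
        # synchronized camera frames with keypoint coordinates.
        return torch.rand(batch_size, NUM_KEYPOINTS * 2)

    for step in range(100):
        rf_frames = torch.randn(8, 1, 64, 64)  # simulated RF reflections
        labels = camera_keypoints(8)           # camera supplies the labels
        preds = rf_model(rf_frames)
        loss = nn.functional.mse_loss(preds, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    # Once trained, the RF model no longer needs the camera, which is why
    # it keeps working when a wall blocks the view.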
The researchers have already used the system in a small study with Parkinson's patients. By placing the devices in the patients' homes, they could monitor the patients' movements in a comfortable setting without using cameras; in that sense, it is a less invasive way of learning about someone's body movements than traditional video would be. The study involved seven people and lasted eight weeks.
The results had a "high correlation" with the standard questionnaire used to evaluate such patients, Katabi says, and the system also revealed additional information about the patients' quality of life, behavior, and functional state.
Monitoring patients this way can also help avoid "white coat syndrome," in which patients behave differently in front of doctors during occasional office visits.