Video: MIT’s X-Ray Vision System Can See Straight Through Concrete Walls


The ongoing imperative to turn soldiers into “supersoldiers” has seen some pretty superhero-esque technological enhancements become real-world defense sector pursuits, like the ability to hear through walls, to fly at will, or to harness superhuman strength. Now we can add X-ray vision to that list.

A team of researchers at MIT’s Lincoln Lab has created a radar array that provides a real-time picture of what’s happening on the other side of solid concrete walls, even when they’re eight inches thick and 60 feet away. That’s no simple feat. More than 99 percent of the radar signal is lost passing through the wall–and another 99 percent is lost as the reflected signal passes back through. But by leveraging signal amplifiers, a clever filtering technique, and some powerful digital processing, the new radar system is able to produce what basically amounts to a real-time video of movements on the other side of a solid wall at 10.8 frames per second.
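Those two 99-percent losses compound. A quick back-of-the-envelope check (just the arithmetic implied by the article's figures, not Lincoln Lab's numbers) shows only one part in ten thousand of the transmitted power makes the round trip:

```python
import math

# Per the article: ~99% of the signal is lost on each pass through the wall.
loss_one_way = 0.99
remaining_one_way = 1 - loss_one_way          # ~1% survives one pass
remaining_round_trip = remaining_one_way ** 2  # through the wall twice

# Express the two-way wall loss in decibels.
loss_db = -10 * math.log10(remaining_round_trip)

print(remaining_round_trip)  # ~0.0001 -> only 0.01% of the power returns
print(loss_db)               # ~40 dB of two-way wall attenuation
```

That roughly 40 dB of two-way attenuation is why the transmitter row needs its amplifiers.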

The array consists of two rows of radar antennas–a top row of eight transmitters and a bottom row of 13 receivers–with the transmitter row backed by powerful signal amplifiers to offset the signal loss imposed (twice) by the wall. The transmitters fire strengthened S-band waves at the obstacle, and here the system meets its first real challenge: the wall itself.

Regardless of how strong the waves are or how much wall they can penetrate, the wall will always show up as the brightest element. To circumvent this issue, the system employs an analog crystal filter that can be used to delete the wall from the results. That’s because the wall, at say 30 feet away, returns signals at a different frequency than the people on the other side of it, who are perhaps 35 or 40 feet away. The system establishes where the wall is and filters out those signals, leaving behind only the frequencies that correspond to distances beyond the wall.
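The idea that range maps to frequency is characteristic of frequency-modulated continuous-wave (FMCW) radar, where an echo's beat frequency is proportional to the target's distance. Here is a hedged digital sketch of that range-gating step (the actual system does it with an analog crystal filter, and the chirp slope below is a made-up illustrative parameter):

```python
C = 3e8        # speed of light, m/s
SLOPE = 1e12   # assumed chirp slope in Hz/s -- hypothetical, for illustration

def beat_frequency(range_m):
    """Beat frequency of an echo from a target at range_m meters
    in a simple FMCW model: f_b = 2 * R * slope / c."""
    return 2 * range_m * SLOPE / C

def gate_out_wall(echoes, wall_range_m, margin_m=1.0):
    """Keep only echoes whose beat frequency corresponds to a range
    beyond the wall (plus a small margin) -- a digital analogue of
    notching out the wall's bright return."""
    cutoff = beat_frequency(wall_range_m + margin_m)
    return [(f, r) for f, r in echoes if f > cutoff]

# Wall at ~30 ft (9.1 m), people at ~35-40 ft (10.7 m and 12.2 m).
echoes = [(beat_frequency(r), r) for r in (9.1, 10.7, 12.2)]
visible = gate_out_wall(echoes, wall_range_m=9.1)
print([r for _, r in visible])  # -> [10.7, 12.2]: the wall's return is gone
```

Everything at or nearer than the wall's frequency is discarded, so only returns from beyond the wall survive.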

It also filters out inanimate objects, because the system works based on motion–that is, on comparing each frame to the previous one and seeing what’s changed. So furniture and other obstructions won’t show up in the final results, but humans–even humans trying their best to remain motionless–will register.
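That frame-to-frame comparison can be sketched in a few lines. This is a minimal, assumed model of the idea (the real system's processing is far more sophisticated): keep only the range cells whose echo strength changed between consecutive frames, so static clutter drops out.

```python
def moving_cells(prev_frame, curr_frame, threshold=0.5):
    """Return indices of range cells whose echo strength changed by
    more than `threshold` between consecutive frames -- static returns
    (walls, furniture) cancel out; moving targets remain."""
    return [i for i, (p, c) in enumerate(zip(prev_frame, curr_frame))
            if abs(c - p) > threshold]

# Cell 2 is a strong but static return (e.g. furniture);
# cell 5 is a person whose echo shifts between frames.
prev = [0.0, 0.0, 9.0, 0.0, 0.0, 2.0]
curr = [0.0, 0.0, 9.0, 0.0, 0.0, 4.5]
print(moving_cells(prev, curr))  # -> [5]
```

Even a nominally motionless person breathes and sways enough to register, which is why the article notes that holding still doesn't hide you.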

The end result is an image that offers an overhead view of the area on the other side of the wall. The system is pretty rough as it stands. The array is about 8.5 feet long (not man-portable), and even after all the algorithmic processing the system performs on the receiving end, targets on the other side of the wall are crudely represented by red blobs slowly moving on the screen.

But given some tweaking–the team wants to work on the end-user interface so the imagery is sharper and targets are represented as crosses or squares rather than as blobs like storm systems moving across a weather map–and perhaps some shrinking, a vehicle-mounted version of this could seriously augment situational awareness during urban combat. The Lincoln Lab team offers a much more detailed explanation of the nuts and bolts in the video below.

MIT News
