In ridiculously futuristic technology news, engineers at MIT have come up with a camera that can record things outside its direct line of vision. It does this by timing light down to quadrillionths of a second.
Nothing ruins a photo like a solid wall standing between the camera and the object it needs to photograph. Cameras depend on a quaint thing called 'light', and a wall blocks that light before it can reach the camera lens. A new camera, being refined at MIT, uses a technique that allows it to photograph around the corner of a wall.
There are still conditions that need to be met. Light must reach the camera somehow, so the camera requires some surface, like a door or a nearby wall, angled so that light can bounce off of it and into the room. Even if that surface is dull and non-reflective - no mirror required - the camera can use it to see into the room.
Say you're sitting in a room - as I imagine many of you are - with an open door. The camera emits a beam of light. The beam hits the door, and because the door's surface is rough at a microscopic level, it scatters the light into the room in all directions. Some of that light hits you, and some of it scatters back to the door. The door then bounces some of that light back to the camera, which receives it and creates a picture.
But if it were that simple, the camera would have been invented long ago. All it would have taken is an extraordinarily sensitive camera. There's a wrench in the works. Light scatters off you, but it also scatters off everything else in the room. You, the wall behind you, the life-sized model of R2-D2 between you and the door: it all gets reflected back, and no camera can tell a photon that's bounced off you from a photon that's hit the wall.
Ah, the canny MIT researchers reasoned, but not all the photons will make it back at the same time. The photons that hit R2-D2 will have a shorter path to travel, and will hit the door and be reflected back to the camera first. The photons that hit you will have a slightly longer path, and will travel back to the camera after the first wave has come in. Lastly, the photons bouncing off the wall, which will have had the longest distance to travel, will come back to the camera, panting and exhausted.
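To get a feel for the timing involved, here is a small sketch (my own illustration, not the researchers' code) that computes the round-trip travel time for light going camera → door → object → door → camera. The distances and object names are invented for the example; only the speed of light is real.

```python
# Illustrative sketch: round-trip travel time for scattered light.
# Layout and distances are hypothetical, chosen only to show the ordering.

C = 299_792_458  # speed of light, metres per second

def round_trip_time(camera_to_door, door_to_object):
    """Round-trip path length divided by c, returned in nanoseconds."""
    path = 2 * (camera_to_door + door_to_object)  # out and back again
    return path / C * 1e9

# Hypothetical room: R2-D2 nearest the door, you further in, wall furthest.
for name, dist in [("R2-D2", 1.0), ("you", 2.5), ("back wall", 4.0)]:
    print(f"{name}: {round_trip_time(3.0, dist):.2f} ns")
```

The nearer the object, the sooner its photons return: a few metres of extra path translates into a handful of nanoseconds, which is why the camera's shutter has to be so absurdly fast.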
They designed this camera to take this into account. It has an incredibly fast shutter - its exposure can be as short as one quadrillionth of a second, a femtosecond. It emits a burst of light, then opens for a femtosecond, and counts the number of photons it gets back. It then emits another burst of light and opens for a bit longer. This will increase the number of photons it gets back, and it will record these newcomers as having come from an object behind the first wave of photons. The process is repeated until the entire room has been mapped. The first wave of returning photons will come from R2-D2, the second wave will come from you, and the last, largest number of photons will come from the wall behind you. The camera will have 'mapped' the room.
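The widening-gate idea above can be sketched in a few lines. This is a toy model with made-up photon counts, not the researchers' actual procedure: each exposure opens the shutter one time-slice longer than the last, and the extra photons counted relative to the previous exposure are attributed to whatever depth just came into range.

```python
# Toy model of the widening-gate measurement. The arrival histogram is
# hypothetical: bins 0-1 are R2-D2, bin 2 is you, bin 4 is the back wall.
arrivals = [120, 30, 80, 0, 400, 10]

def photons_within(gate_bins):
    """Photons counted with the shutter open for the first gate_bins slices."""
    return sum(arrivals[:gate_bins])

# Widen the gate one slice at a time; the difference between successive
# exposures isolates the photons coming from each depth in turn.
previous = 0
depth_slices = []
for gate in range(1, len(arrivals) + 1):
    total = photons_within(gate)
    depth_slices.append(total - previous)
    previous = total

print(depth_slices)  # prints [120, 30, 80, 0, 400, 10]: a crude depth map
```

Differencing the successive exposures recovers the per-slice arrivals, which is the toy version of 'mapping' the room by arrival time.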
The process is called Femtosecond Transient Imaging. Although it still has to be refined before it can produce clear images, it has a lot of possible applications. It could be helpful for the military or the police, but it could also be a boon to rescue workers. Researchers hope it will allow rescue workers to scan unstable structures without having to enter them. With luck and time, someday firefighters will be able to declare a burning building clear of people without ever having to step inside.