Camera could help motorists see through fog

A camera that can see through opaque substances like milk could help motorists
navigate their way safely along foggy roads.

Engineers at the Massachusetts Institute of Technology’s Media Laboratory have developed a camera that can measure distances even through translucent material.

The camera uses nanosecond-long pulses of light and measures the time shift that occurs when they are bounced back, allowing it to build up a three-dimensional picture.
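
As a rough sketch of that principle (an illustration only, not the MIT team’s actual processing), the round-trip time recorded at each pixel can be converted into a distance, and the grid of distances forms the three-dimensional picture. The array values below are invented for the example.

import numpy as np

SPEED_OF_LIGHT = 3.0e8  # metres per second, approximately

def depth_map_from_round_trip_times(round_trip_ns):
    """Convert per-pixel round-trip times (nanoseconds) into distances (metres).

    The pulse travels to the object and back, so the one-way distance is
    half the speed of light multiplied by the round-trip time."""
    round_trip_s = np.asarray(round_trip_ns, dtype=float) * 1e-9
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# Hypothetical 2x2 grid of measured round-trip times, in nanoseconds.
times_ns = [[66.7, 66.9],
            [70.1, 70.3]]
print(depth_map_from_round_trip_times(times_ns))  # distances of roughly 10 to 10.5 metres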

This allows the camera to tell the difference between light that is scattered by water droplets, for example in fog or heavy rain, and light reflected by solid objects such as a car bumper.

The engineers behind the device say it could be used to help motorists see the
road and other vehicles through thick fog.

“Using our technique you can generate 3D models of translucent or
near-transparent objects,” said Achuta Kadambi, one of the team who
developed the camera.

“We can measure the range of transparent objects and look through diffusing material.”

The camera uses technology known as Time of Flight, which calculates distance by measuring the time it takes for a light signal to travel to an object and be reflected back.
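
A short back-of-the-envelope calculation (illustrative figures, not taken from the research) shows the scale of the timing involved and why the pulses need to be measured in nanoseconds.

SPEED_OF_LIGHT = 3.0e8  # metres per second, approximately

def round_trip_time_ns(distance_m):
    """Time for a light pulse to reach an object and return, in nanoseconds."""
    return 2.0 * distance_m / SPEED_OF_LIGHT * 1e9

# An obstacle 50 metres down the road returns the pulse in about 333 ns,
# so an error of one nanosecond corresponds to roughly 15 cm of range.
print(round_trip_time_ns(50))  # approximately 333.3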

Cameras on Microsoft’s new Kinect system use this approach, but they struggle when viewing transparent or translucent objects.

This is because these materials scatter the light and “smear” the reflected signal, making the distance measurement inaccurate.
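
One simplified way to picture the problem (hypothetical numbers, not a model of the Kinect hardware): if scattered light from fog arrives mixed with the direct reflection, a camera that assumes a single return effectively averages the arrival times and reports a distance that matches neither the fog nor the obstacle.

SPEED_OF_LIGHT = 3.0e8  # metres per second, approximately

def single_return_estimate(arrival_times_ns, intensities):
    """Distance a naive single-return camera would report when several returns
    overlap: it sees one intensity-weighted, "smeared" arrival time."""
    smeared_ns = sum(t * i for t, i in zip(arrival_times_ns, intensities)) / sum(intensities)
    return SPEED_OF_LIGHT * smeared_ns * 1e-9 / 2.0

# Hypothetical scene: fog scatters light back from about 7.5 m (50 ns round trip)
# while a car bumper sits at 15 m (100 ns round trip).
print(single_return_estimate([50.0, 100.0], [0.6, 0.4]))  # reports about 10.5 m, neither object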

This scattering of light is what often blinds motorists who switch to full beam in foggy conditions, making it hard to see and judge distances.

The new camera, constructed from equipment costing just £300, uses pulses of light of different lengths to create a kind of binary code that is beamed out. The camera then looks for this pattern being reflected back.

In this way the technique, known as nanophotography, is able to tell the difference between scattered light and light reflected directly from a solid object.
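
The article does not spell out the MIT algorithm, but the general idea of beaming out a coded pulse train and searching for that pattern in the return can be sketched with a simple cross-correlation. The code sequence, delay and noise levels below are invented for the illustration.

import numpy as np

# Hypothetical binary code carried by the emitted pulse train (1 = light on, 0 = light off).
code = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1], dtype=float)

def find_direct_return(received, code):
    """Locate the delay at which the transmitted code reappears in the received
    signal.  Diffuse scatter adds a slowly varying background, but only the
    direct reflection reproduces the sharp code pattern."""
    # Correlating against the zero-mean code means a constant or slowly
    # varying haze contributes almost nothing to the match score.
    matched = np.correlate(received - received.mean(), code - code.mean(), mode="valid")
    return int(np.argmax(matched))

# Simulated received signal: smooth fog scatter plus the code echoed at sample 40.
rng = np.random.default_rng(0)
received = 0.5 * np.exp(-np.arange(120) / 60.0)        # diffuse, smeared scatter
received[40:40 + code.size] += 0.8 * code              # direct reflection off a solid object
received += 0.05 * rng.standard_normal(received.size)  # sensor noise

print(find_direct_return(received, code))  # expected to print 40: the delay, and hence range, of the object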

Refael Whyte, another member of the research team at MIT, said: “The light source sends a pulse of light to the object and back to the camera. The camera measures the time it takes for the light to go from the camera to the object and back.

“We can measure the time of flight and calculate the distance light has
travelled and therefore build up a three-dimensional view of the object in
front of us.”