Every so often something happens that takes image recording and processing a major leap forward. Here are two recent examples.

Recording and Playback. Traffic Police dealing with a major crash have two conflicting priorities.

1) Gather all possible evidence to help understand the nature of a crash
2) Get the road cleared as fast as possible to get traffic moving again, thereby reducing the chance of another crash

Instead of a photographer taking a DSLR and recording each element with a numbered marker in the way we see on CSI programs, suitably equipped forces can now use dual 360-degree laser-enabled cameras. Unlike the single car-mounted 360° camera used to record streets for Google Street View, two cameras are placed some distance apart and work together to produce a highly accurate 3D representation of the crash site. Long after the site is cleared, it can be virtually walked around by the crash forensics team.

Anyone who’s watched Blade Runner will remember Harrison Ford’s brilliant performance as Deckard, the burnt-out expert who reluctantly agrees to “retire” one last group of replicants who have come to Earth to meet their maker. Photographers especially will recall him talking to a computer, asking it to zoom and pan around a photograph as he looked for evidence.

Watching him ask the computer to track to the side, bringing previously unseen detail into view, brings us to the second example.

The second example got a lot of people literally gasping (there’s video!).

Motion blur. At some point we’ve all had to shoot hand-held in conditions that have resulted in an image blurred by camera movement. A good tip for such situations is to fire off as many shots as possible in quick succession, so that with luck we record one frame at the point of least movement.
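If you end up with a card full of near-identical frames from such a burst, the culling can even be automated. The short Python sketch below is one illustrative way to do it: it scores each frame with the variance of its Laplacian, a common blur proxy, and reports the sharpest one. The folder name, JPEG assumption, and use of OpenCV are my own choices for the example, not anything from the article.

```python
# Illustrative sketch: pick the least-blurred frame from a burst of hand-held
# shots by scoring each image with the variance of its Laplacian.
# Assumes OpenCV (cv2) is installed and the burst lives in a hypothetical
# "burst/" folder of JPEGs.
import glob
import cv2


def sharpness_score(path: str) -> float:
    """Higher variance of the Laplacian generally means less blur."""
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise ValueError(f"Could not read {path}")
    return cv2.Laplacian(image, cv2.CV_64F).var()


if __name__ == "__main__":
    burst = sorted(glob.glob("burst/*.jpg"))  # hypothetical burst folder
    scores = {path: sharpness_score(path) for path in burst}
    best = max(scores, key=scores.get)
    print(f"Least blurred frame: {best} (score {scores[best]:.1f})")
```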

But what if we could bring some image processing worthy of a Blade Runner-style film to our workflow to help? Adobe are close to doing just that.

Just watch this clip from the Adobe MAX Sneak Peek session.