The "jitter" method (earlier post) can also be used for ultra-high-resolution photography.
People want higher-resolution cameras, but the output resolution of a camera is usually limited by the number of pixels in its sensor. Some digital cameras have a "digital zoom" function, but this is a bit of a cheat: it simply invents extra pixels between the real pixels by smudging the adjacent colour values together. Conventional digital zoom doesn't actually give you any additional information or detail; it just resizes a section of the original image to fill the required size.
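For the curious, here's roughly what that "smudging" amounts to – a minimal Python sketch, assuming the camera's digital zoom is plain bilinear interpolation (real implementations vary, but the principle is the same):

```python
# Digital zoom as interpolation: the zoomed image has four times the
# pixels, but every new pixel is just a blend of its old neighbours --
# no new information about the scene is created.
import numpy as np
from scipy import ndimage

sensor = np.random.rand(100, 100)          # stand-in for a real capture
zoomed = ndimage.zoom(sensor, 2, order=1)  # 200x200, bilinear "smudging"
```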
A second problem with cameras is camera shake. If you're holding the camera in your hand, then a tiny movement of the camera can result in the image being panned across the sensor while the CCD imaging chip is doing its thing, giving a blurred photograph. The smaller the pixel elements, and the greater the optical zoom, the worse this gets. We can try clamping the camera down, or taking a shorter-exposure image (so that the camera doesn't have as much time to move), but shorter exposures lead to more random "noise" per pixel, due to the reduced sampling time.
But with enough processing power, we can use jitter techniques to solve both problems:
In our earlier "audio" example, we deliberately added high-frequency noise to an audio signal to shift the sampling threshold up and down with respect to the signal, and we took multiple samples and overlaid them to achieve sub-sample resolution.
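Here's that trick reduced to a minimal Python sketch – the signal level, quantiser step and noise range are all made-up illustration values:

```python
# A constant signal sitting between two quantiser steps is lost if we
# sample it once, but adding noise before quantising and then averaging
# many samples recovers it to sub-step accuracy.
import numpy as np

rng = np.random.default_rng(0)
true_level = 0.37                 # signal value, between quantiser steps
noise = rng.uniform(-0.5, 0.5, 10_000)

print(np.round(true_level))                  # 0.0   -- detail lost
print(np.round(true_level + noise).mean())   # ~0.37 -- recovered
```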
With digital photography we can use "positional" noise: we vary the alignment of the camera sensor to the background image, take multiple samples, and overlay those (aligned to subpixel accuracy) to generate images that have higher resolution than the camera sensor. In some ways, this is a little like the Nipkow disc approach used in early television systems, which often used a swept array of fewer than a hundred sensor elements to provide a passable image ... in this case, we're not sweeping a linear strip of sensors at right angles, but an entire grid of pixel elements, and using their random(-ish) offsets to extract real intermediate detail.
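To make that concrete, here's a minimal "shift-and-add" sketch in Python. Everything in it is a simplifying assumption – a made-up scene, a box-average sensor model, and (crucially) sub-pixel offsets that are known exactly, whereas in real life they'd have to be estimated from the images themselves:

```python
import numpy as np

rng = np.random.default_rng(1)
UP = 4                                  # aim for 4x the sensor resolution
scene = rng.random((64 * UP, 64 * UP))  # the "real world", at fine scale

def capture(scene, dy, dx):
    """Simulate one 64x64 frame: offset the scene, then box-average
    each UPxUP block onto the coarse sensor grid."""
    shifted = np.roll(scene, (dy, dx), axis=(0, 1))  # wraps at edges (toy)
    h, w = shifted.shape
    return shifted.reshape(h // UP, UP, w // UP, UP).mean(axis=(1, 3))

n_frames = 200
acc = np.zeros_like(scene)
for _ in range(n_frames):
    dy, dx = rng.integers(0, UP, size=2)   # random "camera shake" offset
    frame = capture(scene, dy, dx)
    # paste each coarse sample back at its true position on the fine grid
    expanded = np.repeat(np.repeat(frame, UP, axis=0), UP, axis=1)
    acc += np.roll(expanded, (-dy, -dx), axis=(0, 1))

estimate = acc / n_frames   # finer-grained than any single 64x64 frame
```

The stacked estimate is still blurred by the pixel aperture, but it's sampled on the finer grid – which is where the extra detail lives – and a drizzle-style weighting or deconvolution pass can sharpen it further.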
Instead of camera shake being a problem, it becomes Our Friend! The individual images will be noisier, but when you recombine a second's worth of images, the end result should have noise levels comparable to a single one-second exposure – and since you might not normally try to take a one-second exposure (because of camera stability issues), static scenes might sometimes end up with reduced noise as well as enhanced resolution.
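The arithmetic behind that claim is just the usual square-root-of-N averaging of independent noise – here's a quick numerical check in Python, with made-up numbers, ignoring per-frame read noise (which does count against you with real sensors):

```python
# Averaging N independently noisy frames cuts the random noise by
# roughly sqrt(N), comparable to one long exposure of the same total
# duration.
import numpy as np

rng = np.random.default_rng(2)
true_value = 100.0
frames = true_value + rng.normal(0.0, 10.0, size=(50, 100_000))

print(frames[0].std())            # ~10.0 -- noise in one short exposure
print(frames.mean(axis=0).std())  # ~1.41 -- 50 averaged: 10 / sqrt(50)
```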
So, if we have a programmable camera, in theory it's possible to design an "ultra-resolution" mode that fires off a series of short-exposure images while we hold the camera, and then makes us wait while its processor laboriously works out the best way to fit all the shots together ... or saves the individual shots to their own directory, to be assembled later by a piece of desktop software.
If we were able to design the camera from scratch, we'd probably also want to include a gadget to deliberately nudge the CCD sensor diagonally while the component shots were being taken. If the software's smart enough, the nudging doesn't have to be particularly accurate, it just has to give the sensor a decent spread of deliberate misalignments. A cheap little piezo device might be good enough.
The problem with this approach is getting hold of the software: in theory, you can try aligning images by hand, but in practice ... it doesn't really seem sensible.
People are already writing algorithms for this sort of stuff – it's what allows the Hubble Space Telescope to take those absurdly high-resolution images of distant galaxies, and presumably the military guys also use the technique to get extreme resolution enhancements from spy-satellite hardware. For analysing and aligning photos with "free-form" offsets, the necessary techniques already seem to be included in the Autostitch panoramic software, which even includes the ability to distort images to make them fit together better – it wouldn't seem to take a lot to turn Autostitch into an ultra-resolution compositor.
Amateur astronomers are now enthusiastically using the technique, and sharing resources (try using "drizzle" as a Google search keyword).
Suppose that you want to take an ultra-high-resolution photograph of the full Moon – you train your camera-equipped telescope on the Moon, lock it down, and set it to keep taking ten pictures per second for an hour while the Moon gradually arcs across the sky and its corresponding image crawls across your image sensor ... and then feed the resulting thirty-six-thousand-odd images into a sub-pixel alignment program, to chew over for a few weeks and pull out the underlying detail. As long as the matching algorithm knows that it's supposed to be lining up the part of the images that contains the big round yellow thing rather than the clouds or the treetops, there wouldn't seem to be any real limit to the achievable resolution. Okay, so you have different atmospheric distortions when the Moon is in different parts of the sky, and when the air temperature drifts, but with a sufficiently smart Autostitch-type warping, even that shouldn't be a problem. If you didn't have a "rewarping" feature, you'd probably just have to decide which part of the Moon you wanted the software to use as a master key when lining up the images.
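These days you don't even have to write the alignment step yourself: scikit-image's phase_cross_correlation function estimates sub-pixel offsets directly. The glue code below is my own illustrative sketch, registering a stack of frames against the first one and averaging them:

```python
import numpy as np
from scipy import ndimage
from skimage.registration import phase_cross_correlation

def align_and_stack(frames, upsample_factor=20):
    """Register every frame against frames[0] to sub-pixel accuracy,
    undo each measured offset, and average the results."""
    ref = np.asarray(frames[0], dtype=float)
    acc = np.zeros_like(ref)
    for frame in frames:
        # shift = offset needed to register this frame with the reference
        shift, _, _ = phase_cross_correlation(
            ref, frame, upsample_factor=upsample_factor)
        acc += ndimage.shift(np.asarray(frame, dtype=float), shift)
    return acc / len(frames)
```

For genuine resolution enhancement you'd deposit each registered frame onto a finer grid (as in the shift-and-add sketch earlier) rather than averaging at native resolution, but the offset-estimation step is the same.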
Techniques like this go beyond conventional photography and enter the territory of hyperphotography – we're capturing information beyond our camera's conventional imaging ability, doing things that, at first sight, would seem to be physically impossible with the available hardware. A passing familiarity with quantum mechanics principles is useful here: we're not actually breaking any laws of physics, but we're shunting information between different domains to obtain results that sometimes seem impossible.
There's a whole family of hyperphotographic techniques: I'll try to run through a few others in a future post.