Digital camera technology was first developed by Kodak in 1975. Since then, digital cameras have become so mainstream that, as of 2022, an estimated 45 billion individual cameras were in use across the world.
The core elements of a camera are the lens, which forms an image, and the electronic sensor, which captures that image. Lens designers and optical engineers pay close attention to both the lens specification and the image sensor specification to arrive at an optimal camera solution.
Why is it important to match the lens and sensor?
One good reason to match the lens and image sensor correctly is to ensure that an overly expensive lens is not selected, which would make the camera cost more than it should. An optimal match also increases the likelihood that the image is high quality and suitable for the application. The following are common issues that occur when proper matching is not achieved:
Pixelization, or screen door effect
Aliasing
Incorrect field-of-view
Too much or too little magnification
Vignetting
Color shading errors
Let's break these down one by one.
Pixelization is an artifact that appears when the optical resolution is much greater than the image sensor resolution. When this occurs, the image looks highly digitized and grid-like, as if it were taken through a screen door. When the lens and image sensor are correctly matched, the pixel size will be small enough compared to the lens resolution that no hard pixel edges will be visible in the image when viewed at 100% digital zoom. Any digital image can be digitally magnified to the point of seeing one pixel to the next. When the optical resolution is correctly matched, the smallest features within the object will be sampled by the sensor across two or more pixels.
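As a rough check, a short sketch can confirm whether the smallest feature at the image plane spans at least two pixels. The pixel pitch and feature size below are illustrative assumptions, not values from any particular sensor:

```python
# Sketch: check how many pixels sample the smallest feature at the image plane.
# Both values below are assumed for illustration.

pixel_pitch_um = 3.45        # assumed sensor pixel pitch (micrometers)
smallest_feature_um = 12.0   # assumed size of the smallest image-plane feature

pixels_per_feature = smallest_feature_um / pixel_pitch_um
print(f"Pixels across smallest feature: {pixels_per_feature:.1f}")

if pixels_per_feature >= 2.0:
    print("Feature spans two or more pixels: no pixelization expected.")
else:
    print("Feature is undersampled: expect pixelization or aliasing.")
```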
Aliasing occurs when the spacing of the camera pixels closely matches the spacing between edges of features in the optical image. For example, imagine taking a picture of a fence from a distance, and within the image formed by the lens, the spacing from one fence picket to the next is the same as the spacing from one pixel to the next on the image sensor. In this case, the period of the object and the pixel period are exactly matched and the image will just barely resolve the fence pickets. When the spacing is slightly less or slightly greater, an effect called aliasing occurs, resulting in banding or fringing in the image that shouldn't be there.
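The fence example can be sketched numerically: sampling a periodic pattern whose period is slightly longer than the pixel pitch produces a much lower-frequency band that is not in the original scene. The periods below are arbitrary assumptions chosen to make the effect obvious:

```python
import numpy as np

# Sketch: sample a fence-like periodic pattern near the pixel pitch.
pixel_pitch = 1.0        # one sample (pixel) per unit distance
pattern_period = 1.05    # picket spacing just above the pixel pitch (assumed)

x = np.arange(200) * pixel_pitch                  # pixel sample positions
sampled = np.sin(2 * np.pi * x / pattern_period)  # what the sensor records

# The sampled data beats at the difference of the two spatial frequencies,
# producing banding with a period much longer than either the pixels or
# the pickets.
alias_period = 1.0 / abs(1.0 / pattern_period - 1.0 / pixel_pitch)
print(f"Apparent banding period: {alias_period:.0f} pixels")  # ~21 pixels
```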
Incorrect field-of-view may occur when the image formed by the lens is too large or too small compared to the image sensor pixel array size. Choosing a different focal length lens or adjusting the magnification of the optical system will correct the field-of-view issue.
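For a distant object, the angular field-of-view follows directly from the focal length and the sensor dimension. A minimal sketch, with an assumed 16 mm lens and an assumed small-format sensor width:

```python
import math

# Sketch: horizontal field-of-view from focal length and sensor width
# (thin-lens approximation, object at a large distance). Values assumed.

focal_length_mm = 16.0    # assumed lens focal length
sensor_width_mm = 7.18    # assumed sensor horizontal dimension

fov_deg = 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_length_mm)))
print(f"Horizontal field-of-view: {fov_deg:.1f} degrees")  # ~25 degrees
```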
Too much or too little magnification. Magnification is the extent to which the lens makes the image of the object larger or smaller than the object itself. When using a lens in a microscope or macro focus (close-up) mode, with the sensor at a fixed distance behind the lens, a long focal length lens will produce too little magnification, while a short focal length may provide too much magnification.
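The thin-lens model illustrates this: with the sensor fixed at image distance v, the magnification is roughly (v - f) / f, so shorter focal lengths magnify more. The image distance and focal lengths below are assumptions for illustration:

```python
# Sketch: thin-lens magnification with the sensor at a fixed image distance,
# as in a microscope or macro setup: m = (v - f) / f. Values are assumed.

image_distance_mm = 100.0   # assumed fixed lens-to-sensor distance

for focal_length_mm in (10.0, 25.0, 50.0):
    m = (image_distance_mm - focal_length_mm) / focal_length_mm
    print(f"f = {focal_length_mm:4.0f} mm -> magnification {m:.1f}x")
```

As expected under these assumptions, the 10 mm lens produces 9x magnification while the 50 mm lens produces only 1x.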
Vignetting is partial or full blocking of rays of light by mechanical surfaces within the lens. This can impact the image by making the edges dim or even fully black. The image circle of a lens is the illuminated region at the image plane; outside the circle, the image is fully dark. When the focal length and image sensor are mismatched, producing an incorrect field-of-view, a dark ring can appear around the outside of the digital image. Sometimes this can't be avoided if the lens image circle is too small compared to the desired lens-to-sensor matching for resolution and pixel size.
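A quick geometric check is to compare the image circle diameter against the sensor diagonal computed from the pixel counts and pitch. All three values below are assumed for illustration:

```python
import math

# Sketch: does the lens image circle cover the sensor diagonal?
pixel_pitch_um = 3.45            # assumed pixel pitch
h_pixels, v_pixels = 1920, 1080  # assumed pixel counts
image_circle_mm = 8.0            # assumed lens image circle diameter

sensor_w_mm = h_pixels * pixel_pitch_um / 1000.0
sensor_h_mm = v_pixels * pixel_pitch_um / 1000.0
diagonal_mm = math.hypot(sensor_w_mm, sensor_h_mm)

print(f"Sensor diagonal: {diagonal_mm:.2f} mm")   # ~7.60 mm
if image_circle_mm >= diagonal_mm:
    print("Image circle covers the sensor: no corner darkening expected.")
else:
    print("Image circle too small: expect dark corners or a dark ring.")
```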
Color shading errors occur when the lens chief-ray-angle (CRA) is mismatched with the image sensor CRA requirement. When the chief rays miss their intended pixel, they can fall on an adjacent pixel, causing both an intensity (illumination) error and a color error. In a color RGB sensor, the red, green, and blue colors in the image are achieved through a pattern of filters applied on top of the image sensor. When light rays hit the adjacent pixel due to CRA error, the resulting image will contain a color banding effect called color shading.
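The landing error can be estimated geometrically as the height of the filter stack above the photodiode times the tangent of the CRA mismatch. The stack height, angles, and pixel pitch below are all illustrative assumptions:

```python
import math

# Sketch: lateral chief-ray landing error from a CRA mismatch,
# approximated as stack_height * tan(delta_CRA). Values assumed.

stack_height_um = 3.0   # assumed filter/microlens stack height above the photodiode
lens_cra_deg = 25.0     # assumed lens CRA at the sensor corner
sensor_cra_deg = 18.0   # assumed sensor CRA specification at the same point
pixel_pitch_um = 1.4    # assumed pixel pitch

delta_rad = math.radians(lens_cra_deg - sensor_cra_deg)
shift_um = stack_height_um * math.tan(delta_rad)
print(f"Chief-ray landing error: {shift_um:.2f} um "
      f"({shift_um / pixel_pitch_um:.0%} of a pixel)")
```

In this illustrative case, roughly a quarter of a pixel's worth of shift pushes a noticeable fraction of the light under the neighboring color filter, which is what produces the shading pattern.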
What are the key factors when matching an image sensor and lens?
To achieve optimal matching of the image sensor and lens, an understanding of the application and use case is needed. The following basic system specifications are helpful to identify when attempting to match a lens and image sensor:
Working distance
Resolution requirement
Depth-of-field
Field-of-view or object size
Operating wavelength spectrum
Once these basic specifications are known, the parameters of the image sensor and lens need to be evaluated and selected. Typically it is best to first choose an image sensor that meets the overall system requirements for pixel resolution, framerate, data transfer rate, and interface type, then choose a lens or design a custom lens to match it.
The following are the basic image sensor parameters to identify first:
Pixel size
Number of horizontal and vertical pixels (often called pixel resolution)
Sensor format, or aspect ratio (# of horizontal pixels / # of vertical pixels)
Along with the image sensor parameters, the following key lens parameters will help narrow the range of options:
Focal length (if known)
Image circle size
Working distance
Back focal distance
As with most engineering problems, obtaining the best solution is a trade-off in balancing multiple variables to achieve the requirements. Two of the most common cases for lens-to-image sensor matching are described below.
If field-of-view is critical
When achieving the right field-of-view is critical, understanding the working distance and object size (or angular field-of-view) is important. It is also important to specify which image dimension the field-of-view is defined along. Typically a diagonal or horizontal field-of-view is specified, with the image sensor oriented so that the greater number of pixels is aligned to the horizontal axis.
To achieve the right lens-to-sensor matching, the lens focal length is the most important parameter to choose correctly. The focal length should be specified so that the image of the desired field-of-view fills at most 100% of the image sensor dimension. For lenses working at distances of several meters to infinity, the focal length can be calculated through simple trigonometry, comparing the object angle and working distance to the image angle and lens focal length. For closer working distances or more complex lens setups, lens formulas and optical design software are needed.
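The trigonometry reduces to a short calculation: the half-angle subtended by the object at the working distance is matched to the half-angle subtended by the sensor at the focal length. The object size, working distance, and sensor width below are assumed values:

```python
import math

# Sketch: focal length for a desired field-of-view at a long working distance.
object_width_m = 4.0        # assumed width of the scene to capture
working_distance_m = 10.0   # assumed lens-to-object distance
sensor_width_mm = 7.18      # assumed sensor horizontal dimension

half_angle = math.atan(object_width_m / (2 * working_distance_m))
focal_length_mm = sensor_width_mm / (2 * math.tan(half_angle))
print(f"Required focal length: {focal_length_mm:.1f} mm")  # ~18 mm
```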
If the image resolution is critical
When image resolution is critical, the size of the smallest object that the camera needs to resolve will help establish the lens focal length requirement, magnification, and image sensor pixel resolution. Getting these right will also avoid pixelization and aliasing.
A widely adopted criterion for achieving suitable resolution is Nyquist sampling. The Nyquist criterion states that to fully resolve an object feature of a certain size, there must be two or more pixels across the width of the feature to be resolved. When there are fewer than two pixels, aliasing can occur and the object frequencies may not be resolved. When there are far fewer than two pixels, the image will have a pixelization or screen door effect.
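Applied in reverse, the criterion sets an upper bound on pixel size once the smallest feature and the optical magnification are known. The object size and magnification below are assumptions for illustration:

```python
# Sketch: maximum pixel size from the Nyquist criterion. Values assumed.

smallest_object_mm = 0.5   # smallest real-world feature to resolve (assumed)
magnification = 0.02       # assumed optical magnification (image / object)

feature_on_sensor_um = smallest_object_mm * magnification * 1000.0
max_pixel_size_um = feature_on_sensor_um / 2.0   # Nyquist: >= 2 pixels per feature
print(f"Feature size on the sensor: {feature_on_sensor_um:.0f} um")  # 10 um
print(f"Maximum allowable pixel size: {max_pixel_size_um:.0f} um")   # 5 um
```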
To avoid pixelization, the sharpness of the lens must be quantified or estimated. For many high-quality lenses, the image sharpness can be calculated from the lens f-number using a standard resolution formula. In some cases, low-quality lenses will not produce image sharpness matching the theoretical value, and further characterization may be needed.
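One common estimate for a diffraction-limited lens is the Airy disk diameter, d = 2.44 × λ × N, where λ is the wavelength and N is the f-number. The sketch below compares this spot size to an assumed pixel pitch; the wavelength, f-number, and pitch are all illustrative:

```python
# Sketch: diffraction-limited spot size versus pixel pitch. Values assumed.

wavelength_um = 0.55    # green light, middle of the visible spectrum
f_number = 4.0          # assumed lens f-number
pixel_pitch_um = 3.45   # assumed sensor pixel pitch

airy_diameter_um = 2.44 * wavelength_um * f_number
print(f"Airy disk diameter: {airy_diameter_um:.2f} um")                    # ~5.4 um
print(f"Pixels across the spot: {airy_diameter_um / pixel_pitch_um:.1f}")  # ~1.6
```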
Summary
Obtaining the best final image quality comes down to understanding the application and requirements, balancing several variables to achieve the best performance, and selecting the right lens to match with the image sensor chosen.