Cameras are part of everyday life, with applications ranging from video communication to crime-scene photography and holiday snapshots. All of these cameras use image sensors, and this article explains the main image sensor types.
Before we deal with the various image sensor types, let us first define the term image sensor.
What is an Image Sensor?
An image sensor is a sensor that detects and conveys the information used to create an image by converting the variable attenuation of waves, whether light or other electromagnetic radiation, into signals.
The two main types of electronic image sensors are the Charge Coupled Device (CCD) and the Complementary Metal Oxide Semiconductor (CMOS) sensor. Both are based on metal oxide semiconductor (MOS) technology: CMOS sensors are built around Metal Oxide Semiconductor Field Effect Transistor (MOSFET) amplifiers, while CCD sensors are built around MOS capacitors.
Similarities between CCD and CMOS Sensors
1. Light Detection
Both image sensor types detect light in the same way. An incoming photon strikes a silicon atom and boosts one of its electrons to a higher energy level, the conduction band. Once in the conduction band, the electron is free to move between adjacent atoms; these mobile electrons are referred to as photoelectrons.
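The physics above can be sketched with a quick calculation: a photon frees a silicon electron only if its energy exceeds the silicon band gap (about 1.12 eV). The constants are standard physical values; the function names are invented for illustration.

```python
# Constants: Planck's constant in eV*s, speed of light in m/s,
# and the band-gap energy of silicon in eV.
PLANCK_EV_S = 4.135667e-15
SPEED_OF_LIGHT = 2.998e8
SILICON_BAND_GAP_EV = 1.12

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a photon of the given wavelength, in electron volts."""
    return PLANCK_EV_S * SPEED_OF_LIGHT / (wavelength_nm * 1e-9)

def creates_photoelectron(wavelength_nm: float) -> bool:
    """True if the photon can boost a silicon electron into the conduction band."""
    return photon_energy_ev(wavelength_nm) >= SILICON_BAND_GAP_EV

print(creates_photoelectron(550))   # green light: energetic enough
print(creates_photoelectron(1500))  # far infrared: below the band gap
```

This is why silicon sensors stop responding somewhere past 1100 nm: longer wavelengths simply cannot lift an electron into the conduction band.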
2. Pixels
Both types of image sensors use pixels: tiny square regions of silicon that collect and hold the photoelectrons. A common analogy is a group of rain buckets in a field, each collecting rainwater. After a storm passes, to find out how much it rained in any part of the field, one just measures how full each bucket is.
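The rain-bucket analogy can be sketched as a toy simulation: each bucket (pixel) accumulates photoelectrons during an exposure, and reading the sensor means measuring how full each bucket is. The intensities and step counts below are made-up numbers, purely for illustration.

```python
import random

random.seed(0)  # deterministic run for the example

ROWS, COLS = 2, 3
EXPOSURE_STEPS = 100

# Assumed per-pixel light intensity: the chance a photon arrives each step.
intensity = [[0.2, 0.5, 0.9],
             [0.1, 0.7, 0.4]]

pixels = [[0] * COLS for _ in range(ROWS)]
for _ in range(EXPOSURE_STEPS):
    for r in range(ROWS):
        for c in range(COLS):
            if random.random() < intensity[r][c]:
                pixels[r][c] += 1  # one more photoelectron in this bucket

# After the "storm" (the exposure), read out how full each bucket is.
for row in pixels:
    print(row)
```

Brighter regions of the scene end up with fuller buckets, which is all an image readout fundamentally reports.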
Differences between CCD and CMOS Sensors
1. Pixel Operation
While both image sensor types use pixels, each type operates its pixels differently. In a CCD sensor, light enters the photoreceptor and is stored as an electrical charge within the sensor; the charge is then converted to a voltage, buffered, and sent out as an analogue signal once the shutter is closed. In a CMOS sensor, each pixel has a photoreceptor that performs its own charge-to-voltage conversion, and the sensor typically includes amplifiers, noise-correction circuits, and digitization circuits that allow it to output digital data directly. The pixels in this image sensor type do not usually store charge: they simply read how much light is hitting them at a given moment, and they are read out progressively, line by line, from top left to bottom right while the shutter is open.
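The readout difference described above can be sketched as a toy model. A CCD holds the whole frame's charge and shifts it out after the shutter closes, while a CMOS sensor samples each row at the moment that row is read. All names here are invented for illustration.

```python
HEIGHT = 3

def ccd_readout(frame):
    """Whole frame captured first, then shifted out row by row as one snapshot."""
    snapshot = [row[:] for row in frame]  # charge held in the sensor
    return [value for row in snapshot for value in row]

def cmos_readout(frame_at_row_time):
    """Each row is sampled at the moment it is read, top to bottom."""
    output = []
    for row_index in range(HEIGHT):
        # The scene may have changed by the time later rows are read.
        output.extend(frame_at_row_time(row_index))
    return output

# For a static scene, both readout schemes produce identical data.
scene = [[1, 2], [3, 4], [5, 6]]
print(ccd_readout(scene))
print(cmos_readout(lambda t: scene[t]))
```

The two schemes only diverge when the scene changes during readout, which is exactly the shutter issue discussed later in the article.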
2. Digital Output
A key advantage of the CMOS image sensor type is that it provides digital output and can be controlled at the pixel level. This level of control cannot be replicated with CCDs, which has made CMOS the image sensor of choice for specialized imaging, where it is common to apply partial scanning, or a particular control process, to only a segment of the sensor.
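Partial scanning can be sketched as reading only a region of interest (ROI) from the pixel array, leaving the rest untouched. This is a hypothetical illustration; the function and parameter names are invented, not a real sensor API.

```python
def read_roi(sensor, top, left, height, width):
    """Read only the requested window of the pixel array."""
    return [row[left:left + width] for row in sensor[top:top + height]]

# A fake 4x6 pixel array whose values encode their own position (row*10 + col).
sensor = [[r * 10 + c for c in range(6)] for r in range(4)]

roi = read_roi(sensor, top=1, left=2, height=2, width=3)
print(roi)  # [[12, 13, 14], [22, 23, 24]]
```

Reading fewer pixels per frame is also why partial scanning lets specialized cameras trade field of view for a much higher frame rate.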
3. Quantum Efficiency
The CCD image sensor type has a higher quantum efficiency than the CMOS sensor, and the proportion of each pixel dedicated to gathering light, rather than being masked off for other functions, is also comparatively high. In addition, the CCD image sensor type generally has lower noise than CMOS sensors.
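Quantum efficiency (QE) is simply the fraction of incoming photons that are converted into photoelectrons, which a short sketch makes concrete. The photon and electron counts below are invented for illustration, not measured sensor data.

```python
def quantum_efficiency(photons_in: int, photoelectrons_out: int) -> float:
    """Fraction of incident photons that produce a photoelectron."""
    return photoelectrons_out / photons_in

# Illustrative numbers only: 10,000 photons land on each sensor.
ccd_qe = quantum_efficiency(photons_in=10_000, photoelectrons_out=8_500)
cmos_qe = quantum_efficiency(photons_in=10_000, photoelectrons_out=6_000)

print(f"CCD QE:  {ccd_qe:.0%}")
print(f"CMOS QE: {cmos_qe:.0%}")
```

A higher QE means more of the available light contributes to the signal, which directly improves low-light performance.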
4. Power Consumption
The CCD image sensor type generally consumes more power than the CMOS image sensor type. This is a key consideration when choosing a sensor for a given application: battery-powered devices need an image sensor that consumes as little power as possible, and the CMOS sensor shines in this regard.
5. Shutter Type
The most significant issue in determining which sensor to choose for a particular application is the choice between a global and a rolling shutter. In global shutter sensors, all pixels begin and end exposure at the same time, although readout still happens line by line. These sensors are essential for imaging high-speed moving objects.
In rolling shutter sensors, the exposure timing differs line by line, with reset and readout happening at shifted times. This row-by-row exposure produces image distortion if either the camera or the target is in motion. These sensors provide excellent sensitivity for imaging slow-moving and static objects.
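The row-by-row distortion can be demonstrated with a toy simulation: a vertical bar moves right by one pixel per row-readout interval. Because each row is exposed at a shifted time, the bar comes out slanted, whereas a global shutter samples every row at the same instant. All values are invented for illustration.

```python
WIDTH, HEIGHT = 8, 4

def bar_position(time_step: int) -> int:
    """Column of the moving vertical bar at a given time step (1 px per step)."""
    return time_step

def global_shutter_image():
    t = 0  # every row sampled at the same instant: the bar stays vertical
    return [[1 if col == bar_position(t) else 0 for col in range(WIDTH)]
            for _ in range(HEIGHT)]

def rolling_shutter_image():
    # Row r is sampled at time r, so the bar has moved by the time lower rows are read.
    return [[1 if col == bar_position(r) else 0 for col in range(WIDTH)]
            for r in range(HEIGHT)]

for row in rolling_shutter_image():
    print(row)  # the bar appears diagonal: classic rolling-shutter skew
```

The same effect is what makes spinning propellers or fast-panning shots look bent in rolling-shutter footage.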
CMOS sensors generally use a rolling shutter, which is always active, rolling through the pixels line by line from top to bottom. CCDs, however, store their electrical charges and read them out when the shutter is closed, after which the pixels are reset for the next exposure. This allows the entire sensor area to be output simultaneously.
As a result, CCDs handle horizontal motion and rotational movement quite well, and they also cope well with environments illuminated by laser pulses or strobe lights. In addition, the CCD sensor can be more easily triggered, enabling the light or motion to be synchronized with the open-shutter phase.
CMOS sensors do not need the complex external clock-driver electronics that produce precise voltages and waveforms to move charge around the sensor, nor do they need external analogue-to-digital converters, complex external readout electronics, or correlated double samplers. All of the electronics needed for readout are built right into the sensor, and the single chip needs only clean power to provide a good image. This makes CMOS sensors much cheaper than CCDs and, among other factors, has led to their dominance over CCD sensors in video, smartphone and DSLR applications thanks to their low cost and fast readout speed.
This article gave an introduction to image sensors, presented the two main types, and compared them in detail.
We hope you enjoyed the article.