The Food and Agriculture Organization of the United Nations predicts that the world population will grow by more than 30% by 2050, reaching some 9 billion people. Farmers need to adopt new precision-farming processes that enable them to produce more crops, more efficiently, to provide enough food for everyone. The variability of environmental conditions requires tools that can manage all biophysical and productive factors in their complexity.
Remote sensing techniques have evolved rapidly, thanks to technological progress and the spread of multispectral cameras. Hyperspectral imaging (HSI) is the capture and processing of an image at a very high number of wavelengths. While multispectral imaging (MSI) evaluates a scene with three or four colors (red, green, blue, and near-infrared, or NIR), HSI splits the image into tens or hundreds of colors. By applying spectroscopy, which identifies materials based on how light behaves when it strikes them, HSI obtains a full spectrum of data for each pixel in the image of a scene.
Unlike radiography, HSI is a non-destructive, non-contact technology that can be used without damaging the object being analyzed. For example, a drone with a hyperspectral camera can detect plant diseases, weeds, and soil erosion problems, as well as estimate crop yields.
Spectroscopy

Remote sensing has driven manufacturers to market hyperspectral cameras for applications ranging from precision farming to water quality control. The underlying phenomenon is spectroscopy, a method of investigation that measures and reads the spectra resulting from the interaction of electromagnetic radiation with matter. It concerns the absorption, emission, or scattering of electromagnetic radiation by atoms or molecules. In practice, the fundamental principle consists of directing a beam of electromagnetic radiation at the desired sample and observing how it responds to that stimulus.
The electromagnetic spectrum can be divided into seven regions according to wavelength, and instruments are distinguished by the region in which they operate. When light hits a surface, it is reflected, absorbed, or transmitted to different degrees, depending on the characteristics of the medium. The total incident energy can be divided into three factors: 𝛼 (absorbance, the energy absorbed by the body per unit of incident energy); 𝜌 (reflectance, the energy reflected by the body per unit of incident energy); and 𝜏 (transmittance, the energy passing through the body per unit of incident energy), so that 𝛼 + 𝜌 + 𝜏 = 1 (Figure 2).
Light that is not reflected can be absorbed by the medium, in which case it is dissipated as heat, or refracted. Refraction arises because the speed of electromagnetic waves in any medium other than a vacuum is always less than the speed of light in vacuum (c) and changes from one medium to another (Figures 1 and 2).
Figure 1: Electromagnetic spectrum (Source: Britannica)
Figure 2: Incident energy components in a body
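Because the three factors partition the incident energy, they must sum to one, so any two of them determine the third. A minimal sketch (the function name and values are illustrative, not from the article):

```python
def transmittance(alpha: float, rho: float) -> float:
    """Return transmittance tau given absorbance alpha and reflectance rho,
    using the energy balance alpha + rho + tau = 1."""
    tau = 1.0 - alpha - rho
    if tau < 0:
        raise ValueError("alpha + rho cannot exceed 1")
    return tau

# Example: a body absorbing 70% and reflecting 20% of the incident energy
# transmits the remaining 10%.
tau = transmittance(alpha=0.70, rho=0.20)
print(f"tau = {tau:.2f}")  # tau = 0.10
```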
Each body emits energy in proportion to its surface temperature. Wien's displacement law locates the peak of this emission within the electromagnetic spectrum: 𝜆max · 𝑇sup = 2,898 𝜇m·K.
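Wien's law can be applied directly: dividing the constant b ≈ 2,898 µm·K by the surface temperature gives the peak emission wavelength. A minimal sketch (function name ours):

```python
WIEN_B_UM_K = 2898.0  # Wien's displacement constant, in um*K

def peak_wavelength_um(surface_temp_k: float) -> float:
    """Wavelength (um) of peak blackbody emission at the given temperature (K)."""
    return WIEN_B_UM_K / surface_temp_k

# An ambient surface at ~300 K peaks deep in the thermal infrared:
print(f"{peak_wavelength_um(300.0):.2f} um")  # 9.66 um
```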
Another factor to consider is radiance, which measures the amount of energy emitted per unit area and per unit solid angle. It is defined as:

L = P / (A · Ω · cos 𝜃)
where P is the power (W), A is the radiating surface (m²), Ω is the solid angle (sr), and 𝜃 is the angle between the surface normal and the direction of observation (°).
These parameters define the viewing geometry and, thus, the measurement: the angles of observation and incidence influence how much energy is reflected toward the sensor and therefore what is measured.
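The radiance definition above can be evaluated numerically from the power, area, solid angle, and viewing angle just introduced. A minimal sketch (function name and values are ours):

```python
import math

def radiance(power_w: float, area_m2: float, solid_angle_sr: float,
             theta_deg: float) -> float:
    """Radiance L = P / (A * Omega * cos(theta)), in W / (m^2 * sr)."""
    return power_w / (area_m2 * solid_angle_sr
                      * math.cos(math.radians(theta_deg)))

# 1 W radiated from a 0.01 m^2 surface into 0.5 sr, viewed along the normal:
print(radiance(1.0, 0.01, 0.5, 0.0))  # 200.0
```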
The sensor, whether multispectral or hyperspectral, does not directly measure radiances but voltages, which are converted into image units by the instrument's electronics. The process of transforming these values back into radiances in the different spectral bands is called "radiance calibration" or "radiometric calibration." The optical sensors used in remote sensing serve spatial and temporal analyses; a single acquisition usually consists of an image representing the distribution of radiance in space at a specific band.
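Radiometric calibration is commonly modeled as a per-band linear mapping from the sensor's raw digital numbers to radiance. The sketch below is a simplified illustration with invented gains and offsets, not the procedure of any specific instrument:

```python
import numpy as np

def calibrate(dn: np.ndarray, gain: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Linear radiometric calibration: radiance = gain * DN + offset, per band.
    dn has shape (rows, cols, bands); gain and offset have shape (bands,)."""
    return dn * gain + offset

dn = np.full((2, 2, 3), 100.0)        # toy 2x2 image with 3 spectral bands
gain = np.array([0.1, 0.2, 0.3])      # invented per-band gains
offset = np.array([1.0, 1.0, 1.0])    # invented per-band offsets
radiance_cube = calibrate(dn, gain, offset)
print(radiance_cube[0, 0])            # [11. 21. 31.]
```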
Hyperspectral system

The main unit of a hyperspectral camera is the optical sensor, which acquires a series of images across the electromagnetic spectrum following the physical processes of reflection and refraction described in the previous section. The images are processed by a digital signal processor to extract the required information. The resulting spectral signature acts as a recognition factor for physical, chemical, or biological processes, and an accurate interpretation of the signature and its variations over time leads to the identification of the biochemical elements under study.
Datasets of hyperspectral origin generally consist of more than 100 spectral bands of relatively narrow bandwidths (5 to 10 nm), while multispectral data sets usually consist of about five to 10 bands of relatively large bandwidths. The spectral regions commonly used are those of the visible, near-infrared, and short-wave infrared.
Hyperspectral sensors are passive elements; they collect information as a set of images representing different bands of the electromagnetic spectrum. These images are combined to form a cube of hyperspectral data, which can be processed and analyzed to read the spectral data for a wide range of applications.
A hyperspectral cube consists of a set of images layered one above the other. Each image represents a particular wavelength band, and in a hyperspectral image, each pixel consists of a spectrum over an appropriate spectral region. Each object has a unique characteristic between the different wavelength bands. This unique characteristic is its “spectral signature” (Figure 3).
Figure 3: Schematic showing hypercube structure (Source: “Potential applications of hyperspectral imaging for quality control in dairy foods,” Aoife A. Gowen)
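The hypercube structure can be illustrated with a small synthetic array: each (row, column) position holds a full spectrum, its spectral signature, and each band index yields a single-wavelength image. Dimensions here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
rows, cols, bands = 4, 4, 120
cube = rng.random((rows, cols, bands))   # synthetic hyperspectral cube

signature = cube[2, 3, :]    # spectral signature of the pixel at (row 2, col 3)
band_image = cube[:, :, 60]  # single-band image at band index 60

print(signature.shape)   # (120,)
print(band_image.shape)  # (4, 4)
```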
Most HSI systems consist of an imaging lens, a narrow slit, a diffraction grating, and a two-dimensional focal plane array detector (usually CCD or CMOS).
The image is projected through the slit onto the diffraction grating, where the light is split into its wavelengths before being projected onto the focal plane array. Each exposure therefore yields a two-dimensional frame, one spatial line by wavelength, in which each pixel is illuminated according to the light intensity at that position and wavelength; scanning the slit across the scene builds up the 3D data matrix.
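This slit-based acquisition can be sketched as stacking successive line frames into a cube. Data and dimensions below are synthetic and hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
cols, bands, n_lines = 8, 100, 5   # hypothetical sensor geometry

def scan_line() -> np.ndarray:
    """One slit exposure: a (spatial line x wavelength) frame (synthetic)."""
    return rng.random((cols, bands))

# Stacking successive exposures along the motion axis yields the full cube:
cube = np.stack([scan_line() for _ in range(n_lines)], axis=0)
print(cube.shape)  # (5, 8, 100)
```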
In its simplest form, CCD sensor technology is the full-frame CCD, in which the entire light-sensitive area is exposed. The data is read out by moving the whole pixel matrix one line at a time into the readout registers, which in turn transfer one pixel at a time to the converter.
The other main type of sensor, the CMOS or active pixel sensor (APS), is an image sensor that contains most of the necessary functionality in each individual pixel, using CMOS technology. Light arrives through the lens and passes through a color filter before reaching the pixel matrix. Each pixel then converts the filtered light into an amplified voltage, which is processed by the rest of the sensor. The main parts of a CMOS sensor are the color filter, the pixel matrix, the digital controller, and the analog-to-digital converter.
Although CCDs and APS are both digital image sensors that exploit the photoelectric effect, they differ in their data processing and construction techniques. The differences are already noticeable in the architecture of the two sensors.
The main advantage of hyperspectral instruments is undoubtedly the great accuracy that can be achieved in reconstructing a spectral signature that is continuous in the electromagnetic spectrum from measurements that are discrete in it. The main disadvantage of HSI is the large volume of data to be processed, although this has improved noticeably with the advent of embedded microelectronics (Figures 4 and 5).
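The data-volume disadvantage is easy to quantify. Assuming, purely for illustration, a 1,000 × 1,000-pixel scene, 200 bands, and 16-bit samples:

```python
rows, cols, bands = 1000, 1000, 200   # hypothetical scene size and band count
bytes_per_sample = 2                  # 16-bit samples

size_bytes = rows * cols * bands * bytes_per_sample
print(f"{size_bytes / 1e6:.0f} MB per cube")  # 400 MB per cube
```

A single acquisition under these assumptions already weighs 400 MB, which is why onboard processing power matters so much for HSI.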
By combining the spectral information provided by spectroscopy with the spatial information provided by imaging, HSI offers improved knowledge of the composition and distribution of components in a product (Table 1).
Table 1: Comparison of RGB imaging (RGB), near-infrared spectroscopy (NIRS), multispectral imaging (MSI), and hyperspectral imaging (HSI) techniques for dairy product applications (Source: “Potential applications of hyperspectral imaging for quality control in dairy foods,” Aoife A. Gowen)
Figure 4: Schematic of a CCD image sensor (Source: Hamamatsu)
Figure 5: Schematic of a CMOS image sensor (Source: Hamamatsu)
The realization of hyperspectral sensors involves different materials for the various fields of application. Silicon is used for acquisition in the ultraviolet, visible, and near-infrared regions; indium arsenide (InAs) and gallium arsenide (GaAs) have a spectral response between 900 and 1,700 nm; indium gallium arsenide (InGaAs) extends the previous range to 2,600 nm; and mercury cadmium telluride (HgCdTe) has a wide spectral range and high quantum efficiency, covering the mid-infrared region (about 2,500 to 25,000 nm) down to the NIR (about 800 nm).
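These material ranges lend themselves to a simple lookup when choosing a detector for a target wavelength. The ranges below are rough approximations of the figures cited in the text, not manufacturer specifications:

```python
# Rough usable spectral ranges in nm (approximations of the values in the text)
RANGES_NM = {
    "Si": (400, 1000),
    "InGaAs": (900, 2600),
    "HgCdTe": (800, 25000),
}

def candidate_materials(wavelength_nm: float) -> list:
    """Detector materials whose range covers the given wavelength."""
    return [m for m, (lo, hi) in RANGES_NM.items() if lo <= wavelength_nm <= hi]

print(candidate_materials(1550))  # ['InGaAs', 'HgCdTe']
print(candidate_materials(500))   # ['Si']
```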
Hyperspectral imaging for agriculture

HSI can be used for a variety of applications, including mineralogy, agriculture, astronomy, and surveillance, through UAV solutions. The technology has enabled many advances in precision agriculture, which requires more than basic RGB information. For example, ordinary video captured while flying over a field cannot reliably distinguish real plants from fake ones. By examining the spectral content of each pixel, hyperspectral solutions can detect chlorophyll or very small color changes in foliage.
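One standard way to quantify chlorophyll-related differences from red and NIR bands is the Normalized Difference Vegetation Index (NDVI). The index is not named in the article, so this is an illustrative example rather than the method of any product mentioned here; the reflectance values are invented:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - red) / (NIR + red); near 1 for healthy, dense foliage."""
    return (nir - red) / (nir + red)

nir = np.array([0.50, 0.40, 0.10])  # illustrative per-pixel NIR reflectance
red = np.array([0.05, 0.10, 0.08])  # illustrative per-pixel red reflectance
index = ndvi(nir, red)              # roughly [0.82, 0.60, 0.11]
print(np.round(index, 2))
```

Healthy leaves reflect strongly in NIR while chlorophyll absorbs red, so the first two pixels read as vegetation and the third as bare soil or stressed cover.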
An example is the inspection of orange groves for citrus blight, a disease that destroys the vitality of trees and can spread throughout a grove. One of the first signs of the disease is a by-product secreted on the surface of the leaves. With HSI mounted on a UAV flying 300 to 400 meters above the crops, this by-product can be detected over a large area quickly.
In precision farming, a distinction is made between ground sensing technologies, also known as proximal sensing, and remote sensing technologies. The former collects data on crop and process conditions through fixed or mobile ground sensors, while remote sensing technologies use optical satellite or drone-mounted sensors, which generate multispectral and hyperspectral images of crops.
The use of drones reduces the time needed to acquire images and data and allows great flexibility of intervention together with greater spatial resolution. Drones help optimize the use of pesticides, fertilizer, and seeds, with estimated water savings of up to 25% compared with more traditional technologies (Figure 6).
Figure 6: Example of a hyperspectral process in agriculture (Source: Gamaya)
Gamaya is an analysis platform that uses a combination of HSI data together with corresponding historical weather and climate records to provide farmers with pest and disease warnings, yield forecasts, and input application rate requirements.
FluroSat uses various sensing methods to capture and analyze hyperspectral images of cotton and wheat fields to predict disease and help farmers make crop health decisions. Multispectral cameras can measure whether a plant is healthy or not, but hyperspectral images can go further and diagnose the exact reason for that state. In addition, the FluroSat platform displays nitrogen maps and suggests to agronomists the exact locations from where to take tissue samples.
ImpactVision uses HSI to help companies in the food chain to determine the quality and maturity of food products. In meat, ImpactVision can determine tenderness, allowing meat producers to guarantee the quality of their meat for a premium price. Another example is avocado ripeness, which ImpactVision can measure through images.
Conclusion

Hyperspectral analysis has the potential to develop applications in any discipline, but data analysis is delicate and must be handled carefully and consciously to avoid producing incorrect information. In the control and protection of the environment, hyperspectral remote sensing proves useful both in the prevention phase and in the subsequent analysis of the consequences of an event. In the study of landslides, for example, during the prevention phase it is possible to monitor soil humidity and deformation, the arrangement of the lithological layers, the vegetation cover, and water losses. In the phase following a landslide, the damage can be estimated by delimiting the area affected by the phenomenon.
This article was originally published on sister site EE Times Europe.