Physics and Astronomy Instrumentation

Advanced Optical Sensing Technologies

Description

This cluster of papers covers a wide range of advances in time-of-flight imaging techniques, including laser ranging, single-photon detection, lidar systems, and non-line-of-sight imaging. The papers discuss applications such as 3D imaging, depth sensing, and photon counting in various scenarios.

Keywords

Time-of-Flight; Laser Ranging; Single-Photon Detection; 3D Imaging; Range Camera; Lidar; Photon Counting; CMOS Sensor; Depth Sensing; Non-Line-of-Sight Imaging

Avalanche photodiodes, which operate above the breakdown voltage in Geiger mode connected with avalanche-quenching circuits, can be used to detect single photons and are therefore called single-photon avalanche diodes (SPADs). Circuit configurations suitable for this operation mode are critically analyzed and their relative merits in photon counting and timing applications are assessed. Simple passive-quenching circuits (PQCs), which are useful for SPAD device testing and selection, have fairly limited application. Suitably designed active-quenching circuits (AQCs) make it possible to exploit the best performance of SPADs. Thick silicon SPADs that operate at high voltages (250-450 V) have photon detection efficiency higher than 50% from 540- to 850-nm wavelength and still ~3% at 1064 nm. Thin silicon SPADs that operate at low voltages (10-50 V) have 45% efficiency at 500 nm, declining to 10% at 830 nm and to as little as 0.1% at 1064 nm. The time resolution achieved in photon timing is 20 ps FWHM with thin SPADs; it ranges from 350 to 150 ps FWHM with thick SPADs. The achieved minimum counting dead time and maximum counting rate are 40 ns and 10 Mcps with thick silicon SPADs, and 10 ns and 40 Mcps with thin SPADs. Germanium and III-V compound semiconductor SPADs extend the range of photon-counting techniques in the near-infrared region to at least 1600-nm wavelength.
The design and characterization of an imaging system is presented for depth information capture of arbitrary three-dimensional (3-D) objects. The core of the system is an array of 32 × 32 rangefinding pixels that independently measure the time-of-flight of a ray of light as it is reflected back from the objects in a scene. A single cone of pulsed laser light illuminates the scene, thus no complex mechanical scanning or expensive optical equipment are needed. Millimetric depth accuracies can be reached thanks to the rangefinder's optical detectors that enable picosecond time discrimination. The detectors, based on a single photon avalanche diode operating in Geiger mode, utilize avalanche multiplication to enhance light detection. On-pixel high-speed electrical amplification can therefore be eliminated, thus greatly simplifying the array and potentially reducing its power dissipation. Optical power requirements on the light source can also be significantly relaxed, due to the array's sensitivity to single photon events. A number of standard performance measurements, conducted on the imager, are discussed in the paper. The 3-D imaging system was also tested on real 3-D subjects, including human facial models, demonstrating the suitability of the approach.
The first superlattice avalanche photodiode (APD) is reported. The high field region of this p-i-n structure consists of 50 alternating Al0.45Ga0.55As (550 Å) and GaAs (450 Å) layers. A large ionization rate ratio has been measured in the field range (2.1–2.7)×10^5 V/cm, with α/β ≃ 10 at a gain of 10, giving a McIntyre noise factor Fn = 3. The ionization rate ratio enhancement with respect to bulk GaAs and AlGaAs is attributed to the large difference in the band edge discontinuities for electrons and holes at the heterojunction interfaces. The superlattice APD is a new device concept which can be used to develop low noise APDs in a variety of III-V materials including long wavelength 1.3–1.6-μm semiconductors.
Photoetching of polymethyl methacrylate (PMMA) by pulsed high power UV light is demonstrated. As a high power UV light source, a KrF excimer laser was used. The etching depth obtained by deep UV light irradiation has not only an energy dependence but also a power dependence: it increased abruptly with increasing exposed power density for the same exposed energy density. These experimental results show that high power excimer lasers are an effective light source for UV photoetching of PMMA.
We review some common laser range finding techniques for industrial applications. After outlining the basic principles of triangulation and time of flight (pulsed, phase-shift and frequency modulated continuous wave (FMCW)), we discuss their respective fundamental limitations. Selected examples of traditional and new applications are also briefly presented.
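The pulsed and phase-shift principles mentioned above reduce to two short formulas. Below is a minimal sketch; the numbers are made-up examples rather than values from the reviewed systems.

```python
# Illustrative round-trip relations behind pulsed and phase-shift ranging.
# Values below are made-up examples, not numbers from the reviewed systems.
from math import pi

C = 299_792_458.0  # speed of light in vacuum, m/s

def pulsed_range(round_trip_time_s: float) -> float:
    """Pulsed time of flight: R = c * t / 2."""
    return C * round_trip_time_s / 2.0

def phase_shift_range(phase_rad: float, mod_freq_hz: float) -> float:
    """Phase-shift (AMCW) ranging: R = c * phi / (4 * pi * f_mod),
    unambiguous only within c / (2 * f_mod)."""
    return C * phase_rad / (4.0 * pi * mod_freq_hz)

if __name__ == "__main__":
    print(pulsed_range(66.7e-9))           # ~10 m for a 66.7 ns round trip
    print(phase_shift_range(3.14, 10e6))   # ~7.5 m at 10 MHz modulation
    print(C / (2 * 10e6))                  # ~15 m ambiguity interval at 10 MHz
```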
We review the current status of single-photon-source and single-photon-detector technologies operating at wavelengths from the ultraviolet to the infrared. We discuss applications of these technologies to quantum communication, a field currently driving much of the development of single-photon sources and detectors.
Expressions are derived for the probability P_{n,m} that a pulse initiated by n electrons (or holes) in a uniformly multiplying semiconductor diode will result in a total number of electrons (or holes) m, to give a gain m/n, and for the probability Q_{n,m} that the gain will be m/n or greater. It is shown that the distributions are far from Gaussian. The gain distribution P_{1,m} for a single photoelectron, for example, is shown to have a maximum value for m = 1 for any value of the average gain M = m̄/n. The derivations are valid for any electric field distribution and assume only that the hole ionization coefficient β(E) can be approximated by the relation β(E) = kα(E), where α(E) is the electron ionization coefficient and k is a constant. A method of determining an effective value of k, for cases where β = kα is not a good approximation, is presented. The results can be used to calculate the average gain and the mean square deviation from the average, giving results in agreement with previously published relations [1], [2]. The implications of this theory on the use of avalanche diodes for low-level photodetection are discussed. It is shown that in the near infrared, cooled avalanche photodiodes can compare favorably with the best available photomultiplier when used either in a photon-counting mode, or for the reliable detection of low-level laser pulses.
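The key qualitative claims, strongly non-Gaussian gain statistics under the β(E) = kα(E) assumption and a single-electron gain distribution that peaks at m = 1, can be illustrated with a toy Monte Carlo. This discrete-slab branching model is only a sketch and does not reproduce the paper's closed-form expressions; the slab count and ionization probabilities below are arbitrary choices.

```python
# Toy Monte Carlo sketch (not the paper's analytical derivation) of avalanche
# gain statistics under the beta(E) = k*alpha(E) assumption.  It illustrates
# the qualitative result that the single-electron gain distribution is highly
# skewed, with its most probable value at the smallest gains.
import random
from collections import Counter

def one_avalanche(pe: float, k: float, n_slabs: int, cap: int = 100_000) -> int:
    """Gain (electrons collected) of an avalanche started by one electron."""
    electrons = [0]          # slab index of each electron inside the region
    holes = []               # slab index of each hole inside the region
    collected = 0            # electrons that reached the far contact
    while (electrons or holes) and collected < cap:   # cap guards rare runaways
        next_e, next_h = [], []
        for x in electrons:
            if random.random() < pe:       # electron-initiated ionization
                next_e.append(x)
                next_h.append(x)
            if x + 1 < n_slabs:
                next_e.append(x + 1)       # drift one slab toward the contact
            else:
                collected += 1
        for x in holes:
            if random.random() < k * pe:   # hole-initiated ionization
                next_e.append(x)
                next_h.append(x)
            if x - 1 >= 0:
                next_h.append(x - 1)       # holes drift the opposite way
        electrons, holes = next_e, next_h
    return collected

if __name__ == "__main__":
    random.seed(1)
    gains = [one_avalanche(pe=0.02, k=0.02, n_slabs=80) for _ in range(2000)]
    print("mean gain ~", sum(gains) / len(gains))
    print("most frequent gains:", Counter(gains).most_common(5))
```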
An eye safe, compact, solid-state lidar for profiling atmospheric cloud and aerosol scattering is described. The transmitter of the micro pulse lidar is a diode pumped, microjoule pulse energy, high-repetition-rate Nd:YLF laser. Eye safety is obtained through beam expansion. The receiver uses a photon counting solid-state Geiger mode avalanche photodiode detector. Data acquisition is by a single card multichannel scaler. Daytime background induced quantum noise is controlled by a narrow receiver field-of-view (FOV) and a narrow bandwidth temperature controlled interference filter. Dynamic range of the signal is limited by optical geometric signal compression. Signal simulations and initial atmospheric measurements indicate that systems built on the micro pulse lidar concept are capable of detecting and profiling all significant cloud and aerosol scattering through the troposphere and into the stratosphere. The intended applications are scientific studies and environmental monitoring, which require full-time unattended measurements of the cloud and aerosol height structure.
Semiconductor photodiodes were developed in the early 1940s, at approximately the time when the photomultiplier tube became a commercial product (RCA, 1939). Only in recent years, with the invention of the Geiger-mode avalanche photodiode, have semiconductor photodetectors reached sensitivity comparable to that of photomultiplier tubes. The evolution started in the 1960s with the p-i-n (PIN) photodiode, a very successful device which is still used in many detectors for high energy physics and a large number of other applications such as radiation detection and medical imaging. The weakest light flashes that can be detected by the PIN diode need to contain several hundreds of photons. The next step was the development of the avalanche photodiode (APD), a device with internal gain, which improved the sensitivity by two orders of magnitude and substantially reduced the noise, but did not yet achieve single photon response. At the end of the millennium, semiconductor detectors evolved with the Geiger-mode avalanche photodiode into highly sensitive devices with an internal gain comparable to that of photomultiplier tubes and a response to single photons. A review of semiconductor photodetector design and development, their properties and problems, some applications, and a speculative outlook on the future evolution is presented.
The concept of a real-time range camera without moving parts is described, based on the time-of-flight (TOF) principle. It operates with modulated visible and near-infrared radiation, which is detected and demodulated simultaneously by a 2-D array of lock-in pixels employing the charge-coupled device principle. Each pixel individually measures the amplitude, offset and phase of the received radiation. The theoretical resolution limit of this TOF range camera is derived, which depends on the square root of the detected background radiation and the inverse of the modulation amplitude. Actual measurements of 3-D sequences acquired at 10 range images per second show excellent agreement between our theory and the observed results. A range resolution of a few centimeters over a range of 10 m, with an illumination power of a few hundreds of milliwatts, is obtained in laboratory scenes for noncooperative, diffusely reflecting objects.
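The per-pixel amplitude, offset and phase retrieval described above is often implemented with four samples per modulation period (a "four-bucket" scheme). The sketch below assumes that scheme and a particular sign convention, neither of which is stated in the excerpt.

```python
# Hedged sketch of four-bucket lock-in demodulation for one pixel; the exact
# sampling scheme and sign conventions of the paper may differ.
from math import atan2, sqrt, pi, cos

C = 299_792_458.0

def demodulate(a0, a1, a2, a3, f_mod_hz):
    """Return (offset, amplitude, phase, range) from four phase-stepped samples."""
    offset = (a0 + a1 + a2 + a3) / 4.0
    amplitude = sqrt((a1 - a3) ** 2 + (a0 - a2) ** 2) / 2.0
    phase = atan2(a1 - a3, a0 - a2) % (2 * pi)
    rng = C * phase / (4 * pi * f_mod_hz)   # unambiguous up to c / (2 * f_mod)
    return offset, amplitude, phase, rng

if __name__ == "__main__":
    # Synthetic pixel: background 100, modulation amplitude 20, phase 1.0 rad
    true_phase, f_mod = 1.0, 20e6
    samples = [100 + 20 * cos(true_phase - i * pi / 2) for i in range(4)]
    print(demodulate(*samples, f_mod))      # recovers (100, 20, 1.0, ~1.19 m)
```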
This tutorial reviews various noncontact optical sensing techniques that can be used to measure distances to objects, and related parameters such as displacements, surface profiles, velocities and vibrations. The techniques that are discussed and compared include intensity-based sensing, triangulation, time-of-flight sensing, confocal sensing, Doppler sensing, and various kinds of interferometric sensing with both high- and low-coherence sources.
We experimentally demonstrate a supercurrent-assisted, hotspot-formation mechanism for ultrafast detection and counting of visible and infrared photons. A photon-induced hotspot leads to a temporary formation of a resistive barrier across the superconducting sensor strip and results in an easily measurable voltage pulse. Subsequent hotspot healing in a ∼30 ps time frame restores the superconductivity (zero-voltage state), and the detector is ready to register another photon. Our device consists of an ultrathin, very narrow NbN strip, maintained at 4.2 K and current-biased close to the critical current. It exhibits an experimentally measured quantum efficiency of ∼20% for 0.81 μm wavelength photons and negligible dark counts.
A simple analytical method is presented that shows some potential for application to the problem of extracting attenuation and backscatter coefficients in an inhomogeneous atmosphere from the return signal of a monostatic single-wavelength lidar system. The method assumes the validity of the single-scattering lidar equation and a power law relationship between backscatter and attenuation. For optical depths greater than unity the inversion method can be applied in principle using only information contained in the signal itself. In contrast to a well-known related analytical inversion solution, the new solution form is shown to be stable with respect to perturbations in the signal, the postulated relationship between backscatter and attenuation, and the assumed or estimated boundary value of attenuation.
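A commonly quoted, hedged statement of the stable backward-integrated solution, with S(r) = ln(r²P(r)), a power-law exponent k, and a boundary value assumed at the far range gate, fits in a few lines of NumPy. The exact expression and notation used in the paper may differ from this sketch.

```python
# Hedged numerical sketch of a backward (far-end referenced) inversion of the
# single-scattering lidar equation, assuming backscatter ∝ attenuation**k and
# a boundary extinction sigma_m at the farthest range gate.  Consult the paper
# for the exact expression; this follows the commonly quoted stable form.
import numpy as np

def backward_inversion(r, power, sigma_m, k=1.0):
    """Extinction profile sigma(r) from the range-corrected signal.

    r       : 1-D array of range gates (m), increasing
    power   : received power P(r) at those gates
    sigma_m : assumed extinction at the farthest gate r[-1] (1/m)
    k       : exponent of the backscatter-attenuation power law
    """
    S = np.log(r ** 2 * power)                 # range-corrected log signal
    expo = np.exp((S - S[-1]) / k)
    dr = np.diff(r)
    # integral of expo from r to r_max, via a trapezoid accumulated backwards
    seg = 0.5 * (expo[1:] + expo[:-1]) * dr
    integral = np.concatenate((np.cumsum(seg[::-1])[::-1], [0.0]))
    return expo / (1.0 / sigma_m + (2.0 / k) * integral)

if __name__ == "__main__":
    # Synthetic check: homogeneous atmosphere with sigma = 1e-4 1/m is recovered
    r = np.linspace(100.0, 5000.0, 500)
    sigma_true = 1e-4
    power = np.exp(-2 * sigma_true * r) / r ** 2
    print(backward_inversion(r, power, sigma_m=sigma_true, k=1.0)[:3])
```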
The backscattered light amplitude is measured from one end of a fiber as a function of optical frequency by tuning a HeNe laser over its linewidth. Fourier transform by a low-frequency electronic spectrum analyzer then reveals the spatial distribution of scattering and of the fiber losses.
Imagers that use their own illumination can capture three-dimensional (3D) structure and reflectivity information. With photon-counting detectors, images can be acquired at extremely low photon fluxes. To suppress the Poisson noise inherent in low-flux operation, such imagers typically require hundreds of detected photons per pixel for accurate range and reflectivity determination. We introduce a low-flux imaging technique, called first-photon imaging, which is a computational imager that exploits spatial correlations found in real-world scenes and the physics of low-flux measurements. Our technique recovers 3D structure and reflectivity from the first detected photon at each pixel. We demonstrate simultaneous acquisition of sub-pulse-duration range and 4-bit reflectivity information in the presence of high background noise. First-photon imaging may be of considerable value to both microscopy and remote sensing.
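A toy per-pixel reading of the idea is sketched below: depth comes from the time stamp of the first detected photon, and reflectivity from how many pulses were needed before that detection (a geometric random variable). The paper's actual method additionally exploits spatial correlations and background censoring, and the detection-probability scale used here is invented.

```python
# Toy per-pixel model (not the paper's full algorithm): at low flux, the pulse
# count until the first detection is geometric with success probability
# proportional to reflectivity, while the detection time stamp carries depth.
import random

C = 299_792_458.0

def simulate_pixel(reflectivity, depth_m):
    """Return (n_pulses_to_first_photon, arrival_time_within_pulse)."""
    p_det = 0.02 * reflectivity          # detection prob. per pulse (toy scale)
    n = 1
    while random.random() > p_det:
        n += 1
    return n, 2 * depth_m / C            # ideal, noiseless time of flight

def estimate(n_pulses, arrival_time_s):
    """Point estimates from the first photon (p_hat = 1/n for a geometric RV)."""
    reflectivity_hat = 1.0 / (0.02 * n_pulses)
    depth_hat = C * arrival_time_s / 2.0
    return reflectivity_hat, depth_hat

if __name__ == "__main__":
    random.seed(0)
    n, t = simulate_pixel(reflectivity=0.5, depth_m=12.0)
    print(estimate(n, t))
```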
Laser scanning range sensors are widely used for high-precision, high-density three-dimensional (3D) reconstruction and inspection of the surface of physical objects. The process typically involves planning a set of views, physically altering the relative object-sensor pose, taking scans, registering the acquired geometric data in a common coordinate frame of reference, and finally integrating range images into a nonredundant model. Efficiencies could be achieved by automating or semiautomating this process. While challenges remain, there are adequate solutions to semiautomate the scan-register-integrate tasks. On the other hand, view planning remains an open problem, that is, the task of finding a suitably small set of sensor poses and configurations for specified reconstruction or inspection goals. This paper surveys and compares view planning techniques for automated 3D object reconstruction and inspection by means of active, triangulation-based range sensors.
An algorithm, which calculates the motions of the collimator jaws required to generate a given arbitrary intensity profile, is presented. The intensity profile is assumed to be piecewise linear, i.e., to consist of segments of straight lines. The jaws move unidirectionally and continuously with variable speed during radiation delivery. During each segment, at least one of the jaws is set to move at the maximum permissible speed. The algorithm is equally applicable for multileaf collimators (MLC), where the transmission through the collimator leaves is taken into account. Examples are presented for different intensity profiles with varying degrees of complexity. Typically, the calculation takes less than 10 ms on a VAX 8550 computer.
In range sensing with time-multiplexed structured light, there is a trade-off between accuracy, robustness and the acquisition period. In this paper a novel structured light method is described. Adaptation of the number and form of the projection patterns to the characteristics of the scene takes place as part of the acquisition process. Noise margins are matched to the actual noise level, thus reducing the number of projection patterns to the necessary minimum. Color is used for light plane labeling. The dimensions of the pattern space are thus increased without raising the number of projection patterns. It is shown that the color of an impinging light plane can be identified from the image of the illuminated scene, even with colorful scenes. Identification is local and does not rely on spatial color sequences. The suggested approach has been implemented and the theoretical results are supported by experiments.
This paper reviews the state of the art in the field of lock-in time-of-flight (ToF) cameras, their advantages, their limitations, the existing calibration methods, and the way they are being used, sometimes in combination with other sensors. Even though lock-in ToF cameras provide neither higher resolution nor larger ambiguity-free range compared to other range map estimation systems, advantages such as registered depth and intensity data at a high frame rate, compact design, low weight, and reduced power consumption have motivated their increasing usage in several research areas, such as computer graphics, machine vision, and robotics.
Time-of-flight three-dimensional imaging is an important tool for applications such as object recognition and remote sensing. Conventional time-of-flight three-dimensional imaging systems frequently use a raster scanned laser to measure the range of each pixel in the scene sequentially. Here we show a modified time-of-flight three-dimensional imaging system, which can use compressed sensing techniques to reduce acquisition times, whilst distributing the optical illumination over the full field of view. Our system is based on a single-pixel camera using short-pulsed structured illumination and a high-speed photodiode, and is capable of reconstructing 128 × 128-pixel resolution three-dimensional scenes to an accuracy of ∼3 mm at a range of ∼5 m. Furthermore, by using a compressive sampling strategy, we demonstrate continuous real-time three-dimensional video with a frame-rate up to 12 Hz. The simplicity of the system hardware could enable low-cost three-dimensional imaging devices for precision ranging at wavelengths beyond the visible spectrum.
Squeezed states of light belong to the most prominent nonclassical resources. They have compelling applications in metrology, which has been demonstrated by their routine exploitation for improving the sensitivity of a gravitational-wave detector since 2010. Here, we report on the direct measurement of 15 dB squeezed vacuum states of light and their application to calibrate the quantum efficiency of photoelectric detection. The object of calibration is a customized InGaAs positive intrinsic negative (p-i-n) photodiode optimized for high external quantum efficiency. The calibration yields a value of 99.5% with a 0.5% (k=2) uncertainty for a photon flux of the order of 10^17 s−1 at a wavelength of 1064 nm. The calibration neither requires any standard nor knowledge of the incident light power and thus represents a valuable application of squeezed states of light in quantum metrology.
Structured light refers to the generation and application of custom light fields. As the tools and technology to create and detect structured light have evolved, steadily the applications have begun to emerge. This roadmap touches on the key fields within structured light from the perspective of experts in those areas, providing insight into the current state and the challenges their respective fields face. Collectively the roadmap outlines the venerable nature of structured light research and the exciting prospects for the future that are yet to be realized.
Depth and intensity profiling of targets at a range of up to 10 km is demonstrated using a time-of-flight time-correlated single-photon counting technique. The system comprised a pulsed laser source at 1550 nm wavelength, a monostatic scanning transceiver and a single-element InGaAs/InP single-photon avalanche diode (SPAD) detector. High-resolution three-dimensional images of various targets acquired over ranges between 800 metres and 10.5 km demonstrate long-range depth and intensity profiling, feature extraction and the potential for target recognition. Using a total variation restoration optimization algorithm, the acquisition time necessary for each pixel could be reduced by at least a factor of ten compared to a pixel-wise image processing approach. Kilometer-range depth profiles are reconstructed with average signal returns of less than one photon per pixel.
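The basic per-pixel TCSPC step behind such measurements, histogramming photon time stamps over many pulses and converting the peak bin to a depth, can be sketched as follows. The numbers are synthetic, and the paper's cross-pixel total-variation restoration is not shown.

```python
# Minimal per-pixel TCSPC sketch: accumulate photon time stamps into time bins
# and convert the peak bin to depth.  The paper's total-variation restoration
# across pixels (enabling < 1 signal photon per pixel) is beyond this sketch.
import random

C = 299_792_458.0

def histogram_depth(timestamps_s, bin_s=100e-12):
    """Depth corresponding to the most populated time bin."""
    counts = {}
    for t in timestamps_s:
        b = int(t / bin_s)
        counts[b] = counts.get(b, 0) + 1
    peak_bin = max(counts, key=counts.get)
    return C * (peak_bin + 0.5) * bin_s / 2.0

if __name__ == "__main__":
    random.seed(2)
    true_depth = 1000.0                               # metres
    tof = 2 * true_depth / C
    # 50 signal photons with 100 ps jitter plus 200 uniform background counts
    stamps = [random.gauss(tof, 100e-12) for _ in range(50)]
    stamps += [random.uniform(0, 10e-6) for _ in range(200)]
    print(histogram_depth(stamps))                    # ~1000 m
```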
We present, to the best of our knowledge, the first demonstration of coherent solid-state light detection and ranging (LIDAR) using optical phased arrays in a silicon photonics platform. An integrated transmitting and receiving frequency-modulated continuous-wave circuit was initially developed and tested to confirm on-chip ranging. Simultaneous distance and velocity measurements were performed using triangular frequency modulation. Transmitting and receiving optical phased arrays were added to the system for on-chip beam collimation, and solid-state beam steering and ranging measurements using this system are shown. A cascaded optical phase shifter architecture with multiple groups was used to simplify system control and allow for a compact packaged device. This system was fabricated within a 300 mm wafer CMOS-compatible platform and paves the way for disruptive low-cost and compact LIDAR on-chip technology.
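For triangular frequency modulation, the standard way to separate range from velocity is to combine the up-chirp and down-chirp beat frequencies. The sketch below uses generic symbols and a sign convention that are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch of range/velocity extraction for triangular FMCW modulation.
# Sign conventions vary with geometry; here an approaching target raises the
# down-chirp beat frequency and lowers the up-chirp beat frequency.
C = 299_792_458.0

def fmcw_range_velocity(f_beat_up, f_beat_down, chirp_slope_hz_per_s, wavelength_m):
    """Return (range_m, velocity_m_s) from the two measured beat frequencies."""
    f_range = 0.5 * (f_beat_up + f_beat_down)      # range-induced beat
    f_doppler = 0.5 * (f_beat_down - f_beat_up)    # Doppler-induced beat
    rng = C * f_range / (2.0 * chirp_slope_hz_per_s)
    vel = wavelength_m * f_doppler / 2.0
    return rng, vel

if __name__ == "__main__":
    slope = 1e9 / 10e-6            # 1 GHz chirp over 10 us
    lam = 1.55e-6
    # Synthetic target: 20 m away, approaching at 1 m/s
    f_r = 2 * 20.0 * slope / C
    f_d = 2 * 1.0 / lam
    print(fmcw_range_velocity(f_r - f_d, f_r + f_d, slope, lam))   # (20.0, 1.0)
```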
3D imaging technologies are applied in numerous areas, including self-driving cars, drones, and robots, and in advanced industrial, medical, scientific, and consumer applications. 3D imaging is usually accomplished by finding the distance to multiple points on an object or in a scene, and then creating a point cloud of those range measurements. Different methods can be used for the ranging. Some of these methods, such as stereovision, rely on processing 2D images. Other techniques estimate the distance more directly by measuring the round-trip delay of an ultrasonic or electromagnetic wave to the object. Ultrasonic waves suffer large losses in air and cannot reach distances beyond a few meters. Radars and lidars use electromagnetic waves in radio and optical spectra, respectively. The shorter wavelengths of optical waves compared to radio frequency waves translate into better resolution and a more favorable choice for 3D imaging. The integration of lidars on electronic and photonic chips can lower their cost, size, and power consumption, making them affordable and accessible to all the abovementioned applications. This review article explains different lidar aspects and design choices, such as optical modulation and detection techniques, and point cloud generation by means of beam-steering or flashing an entire scene. Popular lidar architectures and circuits are presented, and the superiority of the FMCW lidar is discussed in terms of range resolution, receiver sensitivity, and compatibility with emerging technologies. At the end, an electronic-photonic integrated circuit for a micro-imaging FMCW lidar is presented as an example.
Single-photon avalanche diode (SPAD) arrays are solid-state detectors that offer imaging capabilities at the level of individual photons, with unparalleled photon counting and time-resolved performance. This fascinating technology has progressed at a very fast pace in the past 15 years, since its inception in standard CMOS technology in 2003. A host of architectures have been investigated, ranging from simpler implementations, based solely on off-chip data processing, to progressively "smarter" sensors including on-chip, or even pixel-level, time-stamping and processing capabilities. As the technology has matured, a range of biophotonics applications have been explored, including (endoscopic) FLIM, (multibeam multiphoton) FLIM-FRET, SPIM-FCS, super-resolution microscopy, time-resolved Raman spectroscopy, NIROT and PET. We will review some representative sensors and their corresponding applications, including the most relevant challenges faced by chip designers and end-users. Finally, we will provide an outlook on the future of this fascinating technology.
Lidar imaging systems are one of the hottest topics in the optronics industry. The need to sense the surroundings of every autonomous vehicle has pushed forward a race dedicated to deciding the final solution to be implemented. However, the diversity of state-of-the-art approaches to the solution brings a large uncertainty on the decision of the dominant final solution. Furthermore, the performance data of each approach often arise from different manufacturers and developers, which usually have some interest in the dispute. Within this paper, we intend to overcome the situation by providing an introductory, neutral overview of the technology linked to lidar imaging systems for autonomous vehicles, and its current state of development. We start with the main single-point measurement principles utilized, which then are combined with different imaging strategies, also described in the paper. An overview of the features of the light sources and photodetectors specific to lidar imaging systems most frequently used in practice is also presented. Finally, a brief section on pending issues for lidar development in autonomous vehicles has been included, in order to present some of the problems which still need to be solved before implementation may be considered as final. The reader is provided with a detailed bibliography containing both relevant books and state-of-the-art papers for further progress in the subject.
In recent years, Light Detection and Ranging (LiDAR) has been drawing extensive attention both in academia and industry because of the increasing demand for autonomous vehicles. LiDAR is believed to be the crucial sensor for autonomous driving and flying, as it can provide high-density point clouds with accurate three-dimensional information. This review presents an extensive overview of Microelectromechanical Systems (MEMS) scanning mirrors specifically for applications in LiDAR systems. MEMS mirror-based laser scanners have unrivalled advantages in terms of size, speed and cost over other types of laser scanners, making them ideal for LiDAR in a wide range of applications. A figure of merit (FoM) is defined for MEMS mirrors in LiDAR scanners in terms of aperture size, field of view (FoV) and resonant frequency. Various MEMS mirrors based on different actuation mechanisms are compared using the FoM. Finally, a preliminary assessment of off-the-shelf MEMS scanned LiDAR systems is given.
Single-photon detectors (SPDs) are the most sensitive instruments for light detection. In the near-infrared range, SPDs based on III–V compound semiconductor avalanche photodiodes have been extensively used during the past two decades for diverse applications due to their advantages in practicality, including small size, low cost and easy operation. In the past decade, the rapid developments and increasing demands in quantum information science have served as key drivers to improve the device performance of single-photon avalanche diodes and to invent new avalanche quenching techniques. This Review aims to introduce the technology advances of InGaAs/InP single-photon detector systems in the telecom wavelengths and the relevant quantum communication applications, and particularly to highlight recent emerging techniques such as high-frequency gating at GHz rates and free-running operation using negative-feedback avalanche diodes. Future perspectives of both the devices and quenching techniques are summarized.
Autonomous vehicles rely on their perception systems to acquire information about their immediate surroundings. It is necessary to detect the presence of other vehicles, pedestrians and other relevant entities. Safety concerns and the need for accurate estimations have led to the introduction of Light Detection and Ranging (LiDAR) systems in complement to the camera or radar-based perception systems. This article presents a review of state-of-the-art automotive LiDAR technologies and the perception algorithms used with those technologies. LiDAR systems are introduced first by analyzing the main components, from laser transmitter to its beam scanning mechanism. Advantages/disadvantages and the current status of various solutions are introduced and compared. Then, the specific perception pipeline for LiDAR data processing, from an autonomous vehicle perspective, is detailed. The model-driven approaches and the emerging deep learning solutions are reviewed. Finally, we provide an overview of the limitations, challenges and trends for automotive LiDARs and perception systems.
By using an optical time domain reflectometer, a new measurement technique has been developed which allows displaying the length dependence of the fiber attenuation by analyzing backscattered light. This paper compares the backscatter and insertion-loss techniques. In addition, results of several experiments which illustrate the versatility of an optical time domain reflectometer are described.
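The time-to-distance mapping underlying OTDR traces is simply z = c·t / (2·n_g), with n_g the group index of the fibre. A minimal, illustrative sketch (the group index value is a typical assumption, not from the paper):

```python
# OTDR time-to-distance mapping: an event seen at round-trip time t lies at
# z = c * t / (2 * n_group) along the fibre.  Values are illustrative.
C = 299_792_458.0

def event_position_m(round_trip_time_s: float, n_group: float = 1.468) -> float:
    return C * round_trip_time_s / (2.0 * n_group)

if __name__ == "__main__":
    print(event_position_m(98e-6))   # ~10 km for a 98 microsecond echo
```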
A novel chaotic lidar (CLIDAR) system that utilizes a chaotic laser as the light source is proposed and studied. In CLIDAR, the detection and ranging are realized by correlating the signal waveform reflected back from the target with a delayed reference waveform. Benefiting from the very broad bandwidth of the chaotic waveform that can be easily generated by a semiconductor laser, a centimeter-range resolution is readily achieved. The correlation performance of CLIDAR is studied both numerically and experimentally. The power spectra, phase portraits, time series, and correlation traces of the chaotic waveforms obtained at different operating conditions are compared. The relation between the complexity of the attractor and the correlation property is examined. The correlation dimension and the largest positive Lyapunov exponent of each waveform are calculated. To compare the correlation performance of the waveforms quantitatively, peak sidelobe levels of the correlation traces with different correlation lengths and relative noise levels are investigated. Preliminary experiments show a subcentimeter accuracy in ranging with a 3-cm-range resolution, which currently is limited by the bandwidth of the oscilloscope used.
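The correlation step can be sketched with a pseudo-random sequence standing in for the chaotic waveform: cross-correlate the echo with the reference and read the peak lag as the round-trip delay. The sample rate, delay and noise levels below are invented for illustration.

```python
# Minimal correlation-ranging sketch: the echo is a delayed, attenuated, noisy
# copy of the reference; the cross-correlation peak lag gives the round trip.
import numpy as np

C = 299_792_458.0

def correlation_range(reference, echo, sample_rate_hz):
    """Range from the lag of the peak of the cross-correlation."""
    xc = np.correlate(echo, reference, mode="full")
    lag = int(np.argmax(xc)) - (len(reference) - 1)
    return C * lag / sample_rate_hz / 2.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs = 10e9                                    # 10 GS/s reference waveform
    ref = rng.standard_normal(8192)
    delay = 667                                   # samples -> ~10 m round trip
    echo = 0.1 * np.roll(ref, delay) + 0.05 * rng.standard_normal(ref.size)
    print(correlation_range(ref, echo, fs))       # ~10 m
```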
The Ice, Cloud, and land Elevation Satellite-2 has provided unprecedented global surface elevation measurements through photon-counting Lidar (Light detection and ranging), yet its low signal-to-noise ratio (SNR) poses significant challenges for denoising algorithms. Existing methods, relying on fixed parameters, struggle to adapt to dynamic noise distribution in rugged mountain regions where signal and noise change rapidly. This study proposes an adaptive Bayesian denoising algorithm integrating minimum spanning tree (MST)-based slope estimation and probabilistic parameter optimization. First, a simulation framework based on ATL03 data generates point clouds with ground truth labels under varying SNRs, achieving correlation coefficients > 0.9 between simulated and measured distributions. The algorithm then extracts surface profiles via MST and coarse filtering, fits slopes with > 0.9 correlation to reference data, and derives the probability distribution function (PDF) of neighborhood photon counts. Bayesian estimation dynamically selects optimal clustering parameters (search radius and threshold), achieving F-scores > 0.9 even at extremely low SNR (1 photon/10 MHz noise). Validation against three benchmark algorithms (OPTICS, quadtree, DRAGANN) on simulated and ATL03 datasets demonstrates superior performance in mountainous terrain, with precision and recall improvements of 10–20% under high noise conditions. This work provides a robust framework for adaptive parameter selection in low-SNR photon-counting Lidar applications.
This paper introduces a single-shot ultrafast imaging technique termed wavelength and polarization time-encoded ultrafast raster imaging (WP-URI). By integrating raster imaging principles with wavelength- and polarization-based temporal encoding, the system uses a spatial raster mask and time–space mapping to aggregate multiple two-dimensional temporal raster images onto a single detector plane, thereby enabling the effective spatial separation and extraction of target information. Finally, the target dynamics are recovered using a reconstruction algorithm based on the Nyquist–Shannon sampling theorem. Numerical simulations demonstrate the single-shot acquisition of four dynamic frames at 25 trillion frames per second (Tfps) with an intrinsic spatial resolution of 50 line pairs per millimeter (lp/mm) and a wide field of view. The WP-URI technique achieves unparalleled spatio-temporal resolution and frame rates, offering significant potential for investigating ultrafast phenomena such as matter interactions, carrier dynamics in semiconductor devices, and femtosecond laser–matter processes.
During high-speed maneuvers, aircraft experience rapid distance changes, necessitating high-frame-rate ranging for accurate characterization. However, existing optical ranging technologies often lack simplicity, affordability, and sufficient frame rates. While dual-station triangulation enables high-frame-rate distance calculation via geometry, it suffers from complex and costly deployment. Conventional laser rangefinders are limited by low repetition rates. Single-photon ranging, using high-frequency low-energy pulses and detecting single reflected photons, offers a promising alternative. This study presents a kilohertz-level single-photon ranging system validated through civil aviation field tests. At 1000 Hz, relative distance, velocity, and acceleration were successfully captured. Simulating lower frame rates (100 Hz, 50 Hz, 10 Hz) via misalignment merging revealed standard deviations of 0.1661 m, 0.2361 m, and 0.2683 m, respectively, indicating that higher frame rates enhance distance measurement reproducibility. Error analysis against the 1000 Hz baseline further confirms that high-frame-rate ranging improves precision when monitoring high-speed maneuvers.
Recently, light detecting and ranging (LiDAR) technology has gained significant attention due to its wide-ranging applications, particularly in 3D terrain mapping, atmospheric measurement, and autonomous driving. Most commercially available LiDAR systems employ mechanical beam steering, which presents limitations such as slower scanning speeds, lower reliability, and larger device size. In contrast, solid-state LiDAR is emerging as a viable alternative, offering enhanced performance and integration potential. Among the various techniques, frequency-modulated continuous wave (FMCW) LiDAR stands out, especially for its suitability in velocity measurements and its compatibility with silicon-based integration. This work introduces a new bridge-balanced photodetector (Bridge-BPD) for enhanced performance in FMCW LiDAR systems. By integrating optical couplers and Germanium-Silicon photodetectors (Ge/Si PDs), this BPD improves the common-mode rejection ratio (CMRR) by up to 12.8 dB compared to traditional methods, achieving 45.8 dB at a wavelength of 1550 nm. With FMCW LiDAR systems, high detection sensitivity is demonstrated, with a detection probability of 90% at −98 dBm. This novel BPD offers results comparable to commercial InP-based detectors, paving the way for further optoelectronic integration in LiDAR applications.
To enhance the bandwidth characteristics, the size of the active region of avalanche photodiodes (APDs) has generally been reduced to less than 10 μm, and the optical coupling efficiency and tolerance have been drastically reduced, which seriously affects the practicality of the devices. To solve the above problems, this paper presents a flip-chip bonded high-speed InGaAs/InAlAs APD integrated with a micro-lens. The fabricated APD with the integrated micro-lens exhibits a maximum responsivity of 0.7 A/W at unit gain and a coupling efficiency of 96%, which are much higher than the 0.5 A/W and 68% of the APD without the micro-lens. The maximum bandwidth is 27 GHz, the ultra-low dark current is 5.7 nA at 0.9 times the breakdown voltage, and the gain-bandwidth product is 210 GHz. With the micro-lens fabricated on the device's backside, a Gaussian beam with a diameter of 10 μm can be effectively focused onto an area of 7 μm width. The optical coupling tolerance (the radial distance from the active region center to the point where the responsivity drops to 90% of its maximum value) of the device is extended from 1.8 µm to 7 µm, larger than the active region radius of 3 μm. These features will significantly enhance the practicality of high-speed APDs.
High-resolution 3D visualization of dynamic environments is critical for applications such as remote sensing. Traditional 3D imaging systems, such as lidar, rely on avalanche photodiode (APD) arrays to determine the flight time of light for each scene pixel. In this context, we introduce and demonstrate a high-resolution 3D imaging approach leveraging an Electron Multiplying Charge Coupled Device (EMCCD). This sensor's low bandwidth properties allow for the use of electro-optic modulators to achieve both temporal resolution and rapid shuttering at sub-nanosecond speeds. This enables range-gated 3D imaging, which significantly enhances the signal-to-noise ratio (SNR) within our proposed framework. By employing a dual EMCCD setup, it is possible to reconstruct both a depth image and a grayscale image from a single raw data frame, thereby improving dynamic imaging capabilities, irrespective of object or platform movement. Additionally, the adaptive gate-opening range technology can further refine the range resolution of specific scene objects to as low as 10 cm.
A conventional microchannel plate framing camera is typically utilized for inertial confinement fusion diagnosis. However, as a vacuum electronic device, it has inherent limitations, such as a complex structure and the inability to achieve single-line-of-sight imaging. To address these challenges, a CMOS image sensor that can be seamlessly integrated with an electronic pulse broadening system can provide a viable alternative to the microchannel plate detector. This paper introduces the design of an 8 × 8 pixel-array ultrashort shutter-time single-framing CMOS image sensor, which leverages silicon epitaxial processing and a 0.18 μm standard CMOS process. The focus of this study is on the photodiode and the readout pixel-array circuit. The photodiode, designed using the silicon epitaxial process, achieves a quantum efficiency exceeding 30% in the visible light band at a bias voltage of 1.8 V, with a temporal resolution greater than 200 ps for visible light. The readout pixel-array circuit, which is based on the 0.18 μm standard CMOS process, incorporates 5T structure pixel units, voltage-controlled delayers, clock trees, and row-column decoding and scanning circuits. Simulations of the pixel circuit demonstrate an optimal temporal resolution of 60 ps. Under the shutter condition with the best temporal resolution, the maximum output swing of the pixel circuit is 448 mV and the output noise is 77.47 μV, resulting in a dynamic range of 75.2 dB for the pixel circuit; the small-signal responsivity is 1.93 × 10^−7 V/e−, and the full-well capacity is 2.3 Me−. The maximum power consumption of the 8 × 8 pixel-array and its control circuits is 0.35 mW. Considering both the photodiode and the pixel circuit, the proposed CMOS image sensor achieves a temporal resolution better than 209 ps.
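As a quick sanity check of the quoted dynamic range, 20·log10(448 mV / 77.47 µV) indeed comes out at about 75.2 dB:

```python
# Quick check of the quoted dynamic range: 20*log10(448 mV / 77.47 uV) ≈ 75.2 dB.
from math import log10
print(20 * log10(448e-3 / 77.47e-6))   # ≈ 75.2
```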
Efficient detection of ultra-weak optical signals, particularly at the single-photon level, is critical for the advancement of technologies such as LiDAR and quantum communication. Conventional linear optical detectors exhibit insufficient sensitivity to meet the rigorous demands of these applications. Single-photon detectors, with their unparalleled sensitivity and ultrafast response, offer substantial promise. However, their performance is limited by factors including material properties, device architecture, and environmental noise. Current research efforts are focused on optimizing materials, refining device designs, and enhancing cooling technologies, yet a systematic theoretical framework remains lacking. This review addresses these challenges by exploring the fundamental principles, material innovations, and cooling strategies essential to overcoming existing limitations. It emphasizes the inherent trade-off in achieving high detection efficiency, low dark count rates, and minimal afterpulse probability. An integrated optimization approach is proposed, aligning front-end device design with back-end application needs, balancing detection efficiency, dark count rates, and temporal resolution. This strategy aims to facilitate the practical deployment of high-performance single-photon detectors.
This paper addresses how to utilize multiple spinning lidar sensors for real-time applications, in particular how to reduce the problem back to a single-lidar input, for which there are countless available algorithms solving odometry, mapping, object detection and tracking, and many other tasks. We provide a strategy that can be implemented for most, if not all, spinning lidars on the market. Instead of traditional data batching that accumulates data packets based on the spinning angle, we propose batching based on the sampling time, which also enables us to ensure strict time alignment between the multiple lidar sources. In order to demonstrate our batching strategy, we provide a case study in which we evaluated a SLAM algorithm with a single- and a dual-lidar setup. Our batching algorithm enabled us to use the SLAM algorithm, previously designed for a single spinning lidar, without any additional change, while it showcased benefits, especially in stability, due to the larger field of view and reduced occlusion.
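A hedged illustration of the batching idea follows: packets from all sensors are grouped by a shared wall-clock window rather than by per-sensor revolution angle, so a downstream single-lidar pipeline receives one time-aligned cloud per window. The packet layout and names are invented for this sketch, not taken from the paper.

```python
# Hedged illustration of time-based batching for multiple spinning lidars.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Packet:
    sensor_id: str
    stamp_s: float          # acquisition time of the packet
    points: list            # list of (x, y, z) tuples

def batch_by_time(packets: List[Packet], window_s: float) -> Dict[int, list]:
    """Group points from all sensors into consecutive fixed time windows."""
    batches: Dict[int, list] = {}
    for pkt in packets:
        idx = int(pkt.stamp_s // window_s)
        batches.setdefault(idx, []).extend(pkt.points)
    return batches

if __name__ == "__main__":
    stream = [
        Packet("lidar_front", 0.013, [(1.0, 0.0, 0.2)]),
        Packet("lidar_rear", 0.021, [(-1.2, 0.1, 0.3)]),
        Packet("lidar_front", 0.062, [(1.1, 0.0, 0.2)]),
    ]
    for win, pts in sorted(batch_by_time(stream, window_s=0.05).items()):
        print(win, len(pts))
```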
Fiber photometry is a neuroscience technique that can continuously monitor in vivo fluorescence to assess population neural activity or neuropeptide/transmitter release in freely behaving animals. Despite the widespread adoption of this technique, methods to statistically analyse data in an unbiased, objective, and easily adopted manner are lacking. Various pipelines for data analysis exist, but they are often system-specific, only for pre-processing data, and/or lack usability. Current post hoc statistical approaches involve inadvertently biased user-defined time-binned averages or area-under-the-curve analysis. To date, no post hoc, user-friendly, and assumption-free tool for a standardised unbiased analysis exists, yet such a tool would improve reproducibility and statistical reliability for all users. Hence, we have developed a user-friendly post hoc statistical analysis package in Python that is easily downloaded and applied to data from any fiber photometry system. This Fibre Photometry Post Hoc Analysis (FiPhoPHA) package incorporates a variety of tools: a downsampler, bootstrapped confidence intervals (CIs) for analyzing peri-event signals between groups and compared to baseline, and permutation tests for comparing peri-event signals across comparison periods. We also include the ability to quickly and efficiently sort the data into mean time bins, if desired. This provides an open-source, user-friendly Python package for unbiased and standardised post hoc statistical analysis to improve reproducibility using data from any fiber photometry system.
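Two of the statistical ingredients named above, bootstrapped confidence intervals around a peri-event mean trace and a permutation test between comparison periods, can be sketched generically in NumPy. This is not the FiPhoPHA API, only the underlying idea.

```python
# Generic sketch (not the FiPhoPHA API) of a bootstrapped CI for a peri-event
# mean trace and a permutation test comparing two time windows.
import numpy as np

def bootstrap_ci(trials, n_boot=2000, alpha=0.05, seed=0):
    """trials: (n_trials, n_samples) traces -> (mean, lo, hi) per sample."""
    rng = np.random.default_rng(seed)
    n = trials.shape[0]
    boot_means = np.empty((n_boot, trials.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample trials with replacement
        boot_means[b] = trials[idx].mean(axis=0)
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return trials.mean(axis=0), lo, hi

def permutation_test(a, b, n_perm=2000, seed=0):
    """Two-sided p-value for a difference in means between samples a and b."""
    rng = np.random.default_rng(seed)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # relabel under the null
        diff = abs(pooled[:a.size].mean() - pooled[a.size:].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    trials = rng.normal(0, 1, size=(30, 200))
    trials[:, 100:] += 0.8                      # synthetic post-event shift
    m, lo, hi = bootstrap_ci(trials)
    print(m[150], lo[150], hi[150])
    print(permutation_test(trials[:, :100].ravel(), trials[:, 100:].ravel()))
```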
3D scanning, measurement, and reconstruction have wide-ranging applications in industrial fields, medical devices, and cultural heritage preservation. This study focuses on the field of 3D measurement and reconstruction, proposing a dual-degree-of-freedom layer-by-layer 3D scanner based on the principle of optical reflection. The scanner employs a laser range sensor to perform layered scanning through a linear motion guide and a translational-rotary platform driven by stepper motors. To improve the computational efficiency of the ICP algorithm, the Newton iteration method is employed to apply global rotation and translation to the sampled point cloud, optimizing the accuracy of point cloud registration. The scanned objects are categorized into three types based on whether the surfaces of their spatial geometries are second-order differentiable. The final matching results show an RMSE range of 0.2–0.7 and a residual variance range of 0.02–0.31, indicating that the RMSE meets the requirements of 3D reconstruction according to international standards. Experimental results demonstrate that the proposed 3D scanner achieves high-precision scanning and reconstruction capabilities, and that the surface characteristics of the scanned objects significantly influence the measurement results. The analysis of error distribution provides theoretical guidance for further optimizing device performance and algorithm accuracy.
A superconducting transition-edge sensor (TES) used as a microcalorimeter detects incoming photons by measuring the heat converted from photon energy. With high resolving power and low noise levels, a TES is sensitive to single photons and able to count photons within a wide spectral band from X-ray to near-infrared. We have developed a TES detector aimed at soft X-ray spectroscopy applications. In this work, the performance of this detector is characterized. It is shown that the energy resolution of this detector is about 1.8 eV for 1.5 keV photons. The good resolution is also maintained in the visible range, enabling photon-number resolving for 405 nm photons.