ISSCC 2021 / SESSION 7 / IMAGERS AND RANGE SENSORS / 7.3
7.3 A 189×600 Back-Illuminated Stacked SPAD Direct Time-of-Flight Depth Sensor for Automotive LiDAR Systems

Oichi Kumagai1, Junichi Ohmachi1, Masao Matsumura1, Shinichiro Yagi1, Kenichi Tayu1, Keitaro Amagawa2, Tomohiro Matsukawa1, Osamu Ozawa1, Daisuke Hirono1, Yasuhiro Shinozuka1, Ryutaro Homma1, Kumiko Mahara2, Toshio Ohyama1, Yousuke Morita1, Shohei Shimada1, Takahisa Ueno3, Akira Matsumoto1, Yusuke Otake1, Toshifumi Wakano1, Takashi Izawa1

1Sony Semiconductor Solutions, Atsugi, Japan
2Sony LSI Design, Atsugi, Japan
3Sony Depthsensing Solutions, Brussels, Belgium

2021 IEEE International Solid-State Circuits Conference (ISSCC) | 978-1-7281-9549-0/20/$31.00 ©2021 IEEE | DOI: 10.1109/ISSCC42613.2021.9365961

There have been many developments in Light Detection And Ranging (LiDAR) sensors used in Autonomous Driving (AD) and Advanced Driver Assistance Systems (ADAS) to measure the precise distance to an object, recognize the shape of an intersection, and classify road types. These LiDAR sensors must operate day and night without any loss of performance. In the past, Time-Correlated Single Photon Counting (TCSPC) and complete digital signal processing (DSP) have been used to achieve a 100m-range Time-of-Flight (ToF) sensor [1]. Background (BG) noise-rejection techniques [2] have been used to improve the signal-to-noise ratio (SNR), leading to detection of objects at a 6km range. Single Photon Avalanche Diode (SPAD)-based architectures implement per-pixel histogramming, Time-to-Digital Conversion (TDC) and signal processing [3,4]. Another ToF sensor has been shown that enables significantly higher resolution, 1200×900 pixels [5]. With the emerging need for a high-resolution solid-state LiDAR using a scanning 2D-SPAD array [6], we report a SPAD direct Time-of-Flight (dToF) depth sensor [1-5] to realize a long-distance 300m range and high resolution over an automotive-grade temperature range of -40 to 125°C. This microelectromechanical systems (MEMS)-based SPAD LiDAR can measure ranges up to 150m with 0.1% accuracy for a 10%-reflectivity target and 200m with 0.1% accuracy for a 95%-reflectivity target. This paper presents a back-illuminated stacked SPAD dToF depth sensor deployed with passive quenching and recharge (PQR) front-end circuitry, TCSPC, and on-chip DSP. Under 117klux sunlight conditions, the MEMS-based SPAD LiDAR measures distances up to 200m with 168×63 resolution at 20 frames/s.

This sensor is assembled in a proof-of-concept (PoC) MEMS-based SPAD LiDAR system. Details of the PoC are shown in Figure 7.3.1. This LiDAR system comprises three major components: (1) a pulsed laser diode (PLD), (2) a MEMS mirror unit, and (3) a SPAD dToF depth sensor. The PLD provides a 905nm, 4.5ns laser pulse at a high peak optical output power of 45W. In this 1D-scanning LiDAR approach, the MEMS mirror is used to steer the laser beam. Combining MEMS scanning with the SPAD dToF depth sensor, the LiDAR can measure distances up to 300m with a 25.2°×9.45° field of view (FoV) at 0.15° angular resolution. In this approach, the MEMS mirror oscillates horizontally, while the laser scans vertically. The SPAD array of vertical ToF macro-pixels (MPs) acquires all 63 vertical ToF MPs in parallel from the same laser pulse, while a portion of the 192 ToF MPs is used for the 168 horizontal active ToF MPs. The scanning speed is high enough to deliver a full scan of 168×63 MPs at 20 frames/s.

The overall architecture of the chip containing a 189×600 SPAD array is shown in Figure 7.3.2, comprising a column and a row driver, coincidence detection circuits (CDCs), TDCs, and the DSP block. The sensor consists of a light-receptive area of 189 SPADs in the vertical direction and 600 SPADs in the horizontal direction. A selectable configuration of 3×3 SPADs or 6×6 SPADs makes up one macro-pixel, which is the minimum unit of resolution. The signal-processing flow, from CDCs to histogram acquisition (ACQ) to echo analysis (EA) and peak detection (PD), involves two sets of circuits, each operating on opposite phases of a 500MHz timing signal, thereby achieving a 1GHz effective sampling rate. After ACQ, the data for the two phases are integrated alternately, then processing runs at 250MHz in a single phase. There are three output modes for this sensor: (1) histogram mode, histogram data over the set range; (2) echo mode, histogram data for up to 5 echoes; and (3) ranging mode, multi-echo analysis information for up to 5 echoes. Upsampling is carried out using an FIR filter, improving accuracy to 7.5cm. This sensor has two operating modes: (1) line type, where the active area is fixed, and (2) array type, where readout lines are switched for each slot. The synchronous timing of this sensor is controlled by the synchronization signals F_SYNC, S_SYNC, PRE_TRG and TRG_I. A diagram of the synchronization-signal timing is shown in Figure 7.3.2. The F_SYNC 50ms period is divided into 63 S_SYNC time slots of 761.60μs each. The 63 S_SYNC time slots correspond to the period for generating a vertical scan of 63 ToF images. When BG light is present, the ACQ timing signal PRE_TRG is input to the sensor, and ACQ with BG light can be carried out. The average and variance of the acquired BG light are output for each pixel as common-mode information in the output format. This common-mode information is used to subtract BG-light components in the histogram ACQ.

A block diagram of the high-resolution SPAD dToF depth sensor is shown in Figure 7.3.3. The PQR front-end circuitry is controlled and actuated by dedicated column and row driving circuitry. Laser pulses reflected off objects are focused by the optics and detected by the array of SPADs. The received photon signals are amplified via the cathode and undergo high-speed analog-to-digital conversion in the PQR front-end circuit, after which time-to-digital conversion is carried out. During line readout, the signals from the column SPADs are assembled into 81b in the horizontal direction. Finally, the column-shifter block selects the appropriate portion of the 200 ToF MPs into the signal-processing section of 192 ToF MPs. The dead time (DT) can be as short as 6ns at room temperature (RT) using the PQR readout circuit.

The chip characteristics are summarized in Figure 7.3.4. The top chip uses a 90nm SPAD CMOS process with specialized add-on steps to enable back-illumination. The bottom chip uses a 40nm 1Al-10Cu logic process. The total number of pixels is 113,400 SPADs (189 (H) × 600 (V)), including unused SPADs. The SPAD pixel pitch is 10μm, and each pixel incorporates an on-chip micro-lens. A peak photon detection efficiency (PDE) of 22% at 905nm is achieved; PDE is reduced to 14% at -40°C, as shown in the graph in Figure 7.3.4. Measurements show a DT of 6ns, a dark count rate (DCR) of 2kcps at 60°C and 600kcps at 125°C, and afterpulsing (AP) of 0.1%, rising to 4.9% at -40°C; saturation is reached at 60Mcps at 200μW/cm2. At an extended range of up to 200m, the chip demonstrates a distance accuracy of 30cm, allowing it to detect objects with 95% reflectivity under 117klux sunlight conditions.

Figure 7.3.5 shows MEMS-based SPAD LiDAR PoC measurements: a 3D point cloud, top-view projection, intensity image, BG-light passive image, and 2D depth image. Various objects such as pedestrians, cars, road curbs, trees, and buildings are detected by the SPAD dToF depth sensor. This sensor provides effective detection distances from 0 to 300m. Figure 7.3.5 shows measurements of the LiDAR system at 0 to 150m for a 10%-reflectivity target under 117klux sunlight conditions, and at 150 to 200m for a 95%-reflectivity target under the same conditions. Distance measurement error is 0.1% at both 150m (<15cm accuracy) and 200m (<30cm accuracy).

A performance summary and comparison with recently published state-of-the-art devices and LiDAR systems is shown in Figure 7.3.6. Passive background-light images (9b background-light count width, 189×600 SPADs) are successfully captured at high resolution using the fabricated SPAD dToF depth sensor. The chip micrograph and a cross-sectional view of the stacked SPAD with Cu-Cu connections are shown in Figure 7.3.7. The chip size is 6.9mm (H) × 7.8mm (V).

Acknowledgement:
The authors received generous support from Shunpei Suzuki and Takahiro Kado for the optical design of the LiDAR systems, and from Toru Takashimizu for the mechanical design. We would also like to thank Sony Depthsensing Solutions for their technical support on the simulation of the 3D ToF systems.

References:
[1] C. Niclass et al., “A 0.18μm CMOS SoC for a 100m-Range 10-Frame/s 200×96-Pixel Time-of-Flight Depth Sensor,” ISSCC, pp. 488-489, Feb. 2013.
[2] M. Perenzoni et al., “A 64×64-Pixel Digital Silicon Photomultiplier Direct ToF Sensor with 100Mphotons/s/pixel Background Rejection and Imaging/Altimeter Mode with 0.14% Precision up to 6km for Spacecraft Navigation and Landing,” ISSCC, pp. 118-119, Feb. 2016.
[3] A. R. Ximenes et al., “A 256×256 45/65nm 3D-Stacked SPAD-Based Direct TOF Image Sensor for LiDAR Applications with Optical Polar Modulation for up to 18.6dB Interference Suppression,” ISSCC, pp. 96-97, Feb. 2018.
[4] R. K. Henderson et al., “A 256×256 40nm/90nm CMOS 3D-Stacked 120dB Dynamic-Range Reconfigurable Time-Resolved SPAD Imager,” ISSCC, pp. 106-107, Feb. 2019.
[5] T. Okino et al., “A 1200×900 6μm 450fps Geiger-Mode Vertical Avalanche Photodiodes CMOS Image Sensor for a 250m Time-of-Flight Ranging System Using Direct-Indirect-Mixed Frame Synthesis with Configurable-Depth-Resolution Down to 10cm,” ISSCC, pp. 96-97, Feb. 2020.
[6] T. T. Ta et al., “A 2D-SPAD Array and Read-Out AFE for Next-Generation Solid-State LiDAR,” IEEE Symp. VLSI Circuits, 2 pages, June 2020.
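As a sanity check on the macro-pixel arithmetic in the text, the following short sketch (ours, not the authors' code) derives the macro-pixel grid from the 189×600 SPAD array with the 3×3 configuration, reproducing the 63 vertical MPs, 200 column MPs, 192 DSP MPs, and 168 active MPs quoted above:

```python
# Sketch: how the SPAD array maps onto ToF macro-pixels (MPs)
# when 3x3 SPADs form one MP, per the text's selectable configuration.
SPADS_V, SPADS_H = 189, 600   # SPAD array: 189 vertical x 600 horizontal
MP_SIZE = 3                   # selectable 3x3 (or 6x6) SPADs per macro-pixel

mps_v = SPADS_V // MP_SIZE    # vertical ToF MPs, read out in parallel -> 63
mps_h = SPADS_H // MP_SIZE    # column ToF MPs per line -> 200

# The column shifter selects 192 of the 200 column MPs for the DSP section;
# 168 of those are active during MEMS scanning (168x63 frame at 20 frames/s).
DSP_MPS, ACTIVE_MPS = 192, 168
assert ACTIVE_MPS <= DSP_MPS <= mps_h
print(mps_v, mps_h)           # 63 200
```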
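The histogram ACQ with common-mode BG subtraction can be illustrated with a toy model. This is purely illustrative: the bin count, bin width, and photon rates below are our assumptions, and only the idea of accumulating timestamps into a histogram and subtracting a per-pixel common-mode BG level comes from the text.

```python
# Toy TCSPC model: uniform background photons plus a signal echo,
# histogrammed and cleaned by subtracting the common-mode BG level.
import random

N_BINS = 128     # hypothetical number of histogram bins
BIN_NS = 1.0     # hypothetical bin width in ns

def build_histogram(photon_times_ns):
    """Accumulate photon timestamps into a TCSPC histogram."""
    hist = [0] * N_BINS
    for t in photon_times_ns:
        b = int(t / BIN_NS)
        if 0 <= b < N_BINS:
            hist[b] += 1
    return hist

def subtract_background(hist, bg_mean):
    """Subtract the common-mode BG level from every bin, clamped at zero."""
    return [max(0, c - bg_mean) for c in hist]

random.seed(0)
# Uniform BG photons plus a cluster of signal photons arriving near 40 ns:
times = [random.uniform(0, N_BINS) for _ in range(2000)]
times += [random.gauss(40.0, 0.5) for _ in range(500)]

hist = build_histogram(times)
bg_mean = 2000 / N_BINS                  # known BG level in this toy example
clean = subtract_background(hist, bg_mean)
peak_bin = max(range(N_BINS), key=clean.__getitem__)
print(peak_bin)                          # peak lands near bin 40 (the echo)
```

In the sensor the BG mean and variance are measured per pixel during the PRE_TRG window rather than assumed known as in this toy.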
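The 7.5cm accuracy figure is consistent with simple range-bin arithmetic: at the 1GHz effective sampling rate, one histogram bin spans about 15cm of two-way range, and a 2× FIR-filter upsampling (the factor is our assumption; the text states only that FIR upsampling is used) halves that to about 7.5cm.

```python
# Back-of-envelope range-bin arithmetic for the dToF histogram.
C = 299_792_458.0   # speed of light, m/s

def range_bin_m(sample_rate_hz, upsample=1):
    """Range spanned by one time bin, accounting for two-way travel."""
    return C / (2.0 * sample_rate_hz * upsample)

print(round(range_bin_m(1e9), 3))              # ~0.15 m per raw bin
print(round(range_bin_m(1e9, upsample=2), 3))  # ~0.075 m after 2x upsampling
```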
ISSCC 2021 / February 16, 2021 / 8:46 AM
Figure 7.3.1: Specification of MEMS SPAD LiDAR prototype (top-left) and assembled proof-of-concept LiDAR Evaluation Kit (top-right). 2D view of the depth sensor and 3D view of the depth image frame (bottom).

Figure 7.3.2: Readout architecture of the depth sensor (top). One-frame timing diagram of sensor operation (bottom).

Figure 7.3.3: Block diagram of the depth sensor. Schematic of passive quenching and recharge readout circuitry and example of the circuit operation waveform during photon detection.

Figure 7.3.4: Chip characteristics (top). PDE as a function of temperature showing 14% at -40°C (bottom-left) and wavelength dependence (bottom-right).

Figure 7.3.5: The evaluation kit is a complete LiDAR system that can measure distances up to 200m with 95% reflectivity under 117klux sunlight conditions. 3D point cloud and its orthogonal projection, intensity image, and background light passive image, along with 2D depth map with our LiDAR mounted on a vehicle (top). Measurement distances as a function of actual target distance and measurement errors as a function of target distance at 20 frames/s (bottom).

Figure 7.3.6: Comparison of performance to recently published state-of-the-art devices and LiDAR systems (top), and captured background light passive image using 10μm 189×600 SPADs (bottom).
ISSCC 2021 PAPER CONTINUATIONS
Figure 7.3.7: Die micrograph (top) and cross-sectional view of stacked SPAD with
Cu-Cu connections (bottom).