Publications by authors named "Zhenya Zang"

This study presents a framework for classifying a wooden mannequin's poses using a single-photon avalanche diode (SPAD) array under dynamic and heterogeneous fog conditions. The target and fog generator are placed inside an enclosed fog chamber. Training datasets are collected continuously by configuring the temporal and spatial resolutions in the sensor's firmware, using a low-cost SPAD sensor, priced below 5, that integrates an embedded SPAD array and a diffused VCSEL laser.

  • This study introduces a compact deep learning architecture and specialized hardware to enhance the reconstruction of blood flow index (BFi) in diffuse correlation spectroscopy (DCS) using autocorrelation functions (ACFs) for training.
  • The proposed lightweight deep learning model achieves significantly lower mean squared error (MSE) than traditional convolutional neural networks (CNNs), while also simplifying computation through a feature-extraction stage optimized for hardware implementation.
  • The developed system allows for real-time, parallel processing of autocorrelation functions on FPGA, offering an integrated on-chip solution for converting photon data into BFi and coherence factor β, surpassing conventional post-processing methods.
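The first stage of such a pipeline is computing the normalized intensity autocorrelation function from a photon-count trace. A software sketch of that computation is below; the AR(1) intensity trace, rates, and lag range are illustrative assumptions, not the paper's hardware design.

```python
import numpy as np

def g2_acf(counts, max_lag):
    """Normalized intensity autocorrelation: g2(tau) = <I(t)I(t+tau)> / <I>^2."""
    counts = np.asarray(counts, dtype=float)
    mean_sq = counts.mean() ** 2
    g2 = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        g2[lag - 1] = np.mean(counts[:-lag] * counts[lag:]) / mean_sq
    return g2

# Example: an exponentially correlated intensity trace decorrelates toward g2 = 1
rng = np.random.default_rng(0)
n = 200_000
noise = rng.normal(size=n)
corr = np.empty(n)
corr[0] = noise[0]
for i in range(1, n):                 # AR(1) process, correlation time ~20 samples
    corr[i] = 0.95 * corr[i - 1] + noise[i]
intensity = 100.0 + 10.0 * corr       # positive-mean "photon rate"
g2 = g2_acf(intensity, 200)           # decays from >1 toward 1 at long lags
```

An FPGA implementation would compute the same quantity with multi-tau accumulators rather than this batch formula.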

Significance: Diffuse correlation spectroscopy (DCS) is a powerful, noninvasive optical technique for measuring blood flow. Traditionally, the blood flow index (BFi) is derived through nonlinear least-squares fitting of the measured intensity autocorrelation function (ACF). However, the fitting process is computationally intensive, susceptible to measurement noise, and easily influenced by the optical properties (absorption coefficient μa and reduced scattering coefficient μs′) and by scalp and skull thicknesses.
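The traditional fitting step can be sketched with a simplified single-exponential g2 model (the true semi-infinite correlation-diffusion model is more involved); beta, gamma, and the lag grid below are illustrative values, and gamma stands in for the quantity proportional to BFi.

```python
import numpy as np

def g2_model(tau, beta, gamma):
    """Simplified DCS model: g2(tau) = 1 + beta * exp(-2*gamma*tau);
    the decay rate gamma is taken as proportional to BFi."""
    return 1.0 + beta * np.exp(-2.0 * gamma * tau)

# Synthetic "measured" ACF (beta = 0.5, gamma = 1e4 s^-1 are assumed values)
tau = np.logspace(-6, -4, 64)                 # lag times: 1 us .. 100 us
rng = np.random.default_rng(1)
measured = g2_model(tau, 0.5, 1e4) + rng.normal(0.0, 1e-4, tau.size)

# Linearize: log(g2 - 1) = log(beta) - 2*gamma*tau, then least-squares fit
slope, intercept = np.polyfit(tau, np.log(measured - 1.0), 1)
beta_fit = np.exp(intercept)
gamma_fit = -slope / 2.0
```

Even this toy fit illustrates the sensitivity the abstract mentions: noise in the ACF tail, where g2 approaches 1, is amplified by the logarithm.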

This paper reports a bespoke adder-based deep learning network for time-domain fluorescence lifetime imaging (FLIM). By leveraging the l1-norm extraction method, we propose a 1D Fluorescence Lifetime AdderNet (FLAN) without multiplication-based convolutions to reduce the computational complexity. Further, we compressed fluorescence decays in the temporal dimension using a log-scale merging technique to discard redundant temporal information, yielding the log-scale FLAN (FLAN+LS).
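A toy sketch of log-scale merging, assuming simple summation of a linearly sampled decay into log-spaced bins (the paper's exact merging scheme may differ):

```python
import numpy as np

def log_rebin(decay, n_out):
    """Merge a linearly sampled fluorescence decay into log-spaced time bins,
    keeping fine resolution early and coarse resolution in the sparse tail."""
    n = len(decay)
    # log-spaced bin edges over [1, n], collapsed to strictly increasing integers
    edges = np.unique(np.round(np.logspace(0, np.log10(n), n_out + 1)).astype(int))
    edges[0] = 0                                   # first bin starts at t = 0
    return np.array([decay[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])

decay = np.exp(-np.arange(256) / 40.0) * 1000      # synthetic mono-exponential decay
compressed = log_rebin(decay, 32)                  # <= 32 bins, all photons kept
```

Because the bins partition the full time axis, total photon counts are preserved while the input length shrinks by roughly an order of magnitude.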

Fluorescence lifetime imaging (FLIM) is a powerful tool that provides unique quantitative information for biomedical research. In this study, we propose a multi-layer-perceptron-based mixer (MLP-Mixer) deep learning (DL) algorithm named FLIM-MLP-Mixer for fast and robust FLIM analysis. The FLIM-MLP-Mixer has a simple network architecture yet a powerful ability to learn from data.
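The core of a generic MLP-Mixer block (token mixing followed by channel mixing, each with a residual connection) can be sketched in NumPy; the sizes, random weights, and tanh nonlinearity are illustrative, not FLIM-MLP-Mixer's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(2)
tokens, channels, hidden = 16, 8, 32          # illustrative sizes only

def mlp(x, w1, w2):
    """Two-layer perceptron with a tanh nonlinearity (stand-in for GELU)."""
    return np.tanh(x @ w1) @ w2

def mixer_block(x, wt1, wt2, wc1, wc2):
    """One Mixer block: mix across tokens, then mix across channels."""
    x = x + mlp(x.T, wt1, wt2).T              # token-mixing MLP (across the sequence)
    x = x + mlp(x, wc1, wc2)                  # channel-mixing MLP (per token)
    return x

x = rng.normal(size=(tokens, channels))
wt1 = rng.normal(size=(tokens, hidden)) * 0.1
wt2 = rng.normal(size=(hidden, tokens)) * 0.1
wc1 = rng.normal(size=(channels, hidden)) * 0.1
wc2 = rng.normal(size=(hidden, channels)) * 0.1
out = mixer_block(x, wt1, wt2, wc1, wc2)      # same shape as the input
```

The appeal for FLIM analysis is that both mixing steps are plain matrix multiplications, with no convolutions or attention.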

Wide-field fluorescence lifetime imaging (FLIM) is a promising technique for biomedical and clinical applications. Integrating it with CMOS single-photon avalanche diode (SPAD) sensor arrays can lead to cheaper, portable real-time FLIM systems. However, the FLIM data obtained by such sensor systems often exhibit complex noise characteristics.

Convolutional neural networks (CNNs) have shown exceptional performance for fluorescence lifetime imaging (FLIM). However, redundant parameters and complicated topologies make it challenging to implement such networks on embedded hardware for real-time processing. We report a lightweight, quantized neural architecture that offers fast FLIM analysis.
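One common building block of such quantized architectures is symmetric int8 weight quantization; the sketch below shows the general idea, not the paper's specific scheme:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w ~= scale * q, q in [-127, 127]."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(3)
weights = rng.normal(0.0, 0.1, size=(64, 64)).astype(np.float32)

q, scale = quantize_int8(weights)
dequant = q.astype(np.float32) * scale     # reconstruction error <= scale / 2
```

Storing q instead of float weights cuts memory 4x and lets embedded hardware use integer multiply-accumulate units.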

We present a fast and accurate analysis method for fluorescence lifetime imaging microscopy (FLIM) using the extreme learning machine (ELM). We evaluated ELM and existing algorithms with extensive metrics. First, we compared these algorithms using synthetic datasets.
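The ELM principle, a fixed random hidden layer with output weights solved in closed form, can be sketched on a toy regression; the sizes and activation below are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy training data: noisy samples of a smooth 1-D function
x = np.linspace(-1.0, 1.0, 400).reshape(-1, 1)
y = np.sin(3.0 * x) + 0.01 * rng.normal(size=x.shape)

# ELM: hidden weights are random and never trained
n_hidden = 100
w_in = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
h = np.tanh(x @ w_in + b)                       # fixed random features

# Only the output weights are learned, via a single least-squares solve
w_out, *_ = np.linalg.lstsq(h, y, rcond=None)
pred = h @ w_out
```

Because training is one linear solve rather than iterative backpropagation, ELM fits in milliseconds, which is the speed argument the abstract makes for FLIM analysis.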

We present a deep learning approach to obtain high-resolution (HR) fluorescence lifetime images from low-resolution (LR) images acquired by fluorescence lifetime imaging (FLIM) systems. We first propose a theoretical method to generate massive semi-synthetic FLIM training data with various cellular morphologies, a sizeable dynamic lifetime range, and complex decay components. We then develop a degrading model to obtain LR-HR pairs and create a hybrid neural network, the spatial-resolution-improved FLIM net (SRI-FLIMnet), to simultaneously estimate fluorescence lifetimes and realize the nonlinear transformation from LR to HR images.
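A minimal stand-in for a degrading model that creates LR-HR training pairs is block averaging; the factor and image content below are illustrative, not the paper's degradation pipeline:

```python
import numpy as np

def degrade(hr, factor=4):
    """Block-average an HR lifetime image to produce its paired LR image
    (a simple surrogate for a sensor's lower spatial sampling)."""
    h, w = hr.shape
    hr = hr[: h - h % factor, : w - w % factor]        # crop to a multiple of factor
    return hr.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Synthetic HR "lifetime image" with values in a plausible nanosecond range
hr = np.random.default_rng(5).uniform(1.0, 4.0, size=(64, 64))
lr = degrade(hr, factor=4)                             # 16 x 16 LR counterpart
```

Block averaging preserves the mean lifetime exactly, so the network only has to learn to restore spatial detail, not to correct a global bias.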

Single-photon avalanche diodes (SPADs) are powerful sensors for 3D light detection and ranging (LiDAR) in low-light scenarios due to their single-photon sensitivity. However, accurately retrieving ranging information from noisy time-of-arrival (ToA) point clouds remains a challenge. This paper proposes a photon-efficient, non-fusion neural network architecture that directly reconstructs high-fidelity depth images from ToA data without relying on other guiding images.
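For contrast with the learned approach, the naive non-learning baseline simply picks the peak bin of each pixel's ToA histogram and converts time of flight to depth; the bin width and counts here are assumed values:

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s
BIN_WIDTH = 100e-12        # 100 ps timing bins (an assumed resolution)

def depth_from_histogram(hist):
    """Peak-bin baseline: time of flight of the strongest return, halved
    for the round trip, gives the depth estimate."""
    t = np.argmax(hist) * BIN_WIDTH
    return C * t / 2.0

hist = np.full(1024, 2.0)             # flat ambient/background counts
hist[50] = 120.0                      # signal return in bin 50 (5 ns round trip)
depth = depth_from_histogram(hist)    # ~0.75 m
```

This baseline fails exactly where the abstract says learning helps: under strong noise, the argmax can land on a background bin, producing gross depth errors.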

Measuring the fluorescence lifetimes of fast-moving cells or particles has broad applications in the biomedical sciences. This paper presents a dynamic fluorescence lifetime sensing (DFLS) system based on the time-correlated single-photon counting (TCSPC) principle. It integrates a CMOS 192 × 128 single-photon avalanche diode (SPAD) array, offering enormous photon-counting throughput without pile-up effects.
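A common fast estimator for TCSPC decay histograms is the center-of-mass method (CMM); this sketch assumes an ideal, background-free mono-exponential decay and is not necessarily the estimator the paper uses:

```python
import numpy as np

def cmm_lifetime(decay, bin_width):
    """Center-of-mass lifetime estimate for a background-free mono-exponential
    TCSPC decay: tau ~= sum(t * counts) / sum(counts)."""
    t = (np.arange(len(decay)) + 0.5) * bin_width     # bin-center times
    return np.sum(t * decay) / np.sum(decay)

bin_width = 50e-12                       # 50 ps bins (assumed)
tau_true = 2e-9                          # 2 ns lifetime
t = (np.arange(4096) + 0.5) * bin_width
decay = 1e4 * np.exp(-t / tau_true)      # ideal noise-free decay histogram

tau_est = cmm_lifetime(decay, bin_width)
```

CMM needs only one pass of multiply-accumulate per histogram, which is why variants of it suit high-throughput, real-time lifetime sensing.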
