We demonstrate compressive-sensing (CS) spectroscopy in a planar-waveguide Fourier-transform spectrometer (FTS) device. The spectrometer is implemented as an array of Mach-Zehnder interferometers (MZIs) integrated on a photonic chip. The signals from a set of MZIs form an undersampled discrete Fourier interferogram, which we invert using l1-norm minimization to retrieve a sparse input spectrum. To implement this technique, we use a subwavelength-engineered spatial heterodyne FTS on a chip composed of 32 independent MZIs. We demonstrate the retrieval of three sparse input signals by collecting data from restricted sets of MZIs (8 and 14) and applying common CS reconstruction techniques to these data. We show that this retrieval maintains the full resolution and bandwidth of the original device, despite a sampling factor as low as one-fourth of that of a conventional (non-compressive) design.
DOI: http://dx.doi.org/10.1364/OL.42.001440
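To make the recovery step concrete, the sketch below simulates the kind of l1-norm retrieval described in the abstract: a sparse spectrum is measured through a small set of cosine fringes (standing in for the undersampled MZI interferogram) and reconstructed with iterative soft thresholding. The grid size, delays, line positions, and regularization weight are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch of compressive-sensing spectrum retrieval, assuming an
# idealized cosine-transform measurement model; real device calibration data
# would replace the matrix A. All sizes and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 64          # spectral channels on the full-resolution grid
n_mzi = 14               # undersampled set of MZIs actually read out

# Each MZI samples the spectrum through a cosine fringe set by its optical
# path delay; a conventional (non-compressive) design would use ~n_channels.
delays = rng.choice(np.arange(1, n_channels // 2), size=n_mzi, replace=False)
sigma = np.arange(n_channels) / n_channels             # normalized wavenumber
A = np.cos(2 * np.pi * np.outer(delays, sigma))        # (n_mzi, n_channels)

# Sparse test spectrum: a few narrow lines.
x_true = np.zeros(n_channels)
x_true[[9, 27, 44]] = [1.0, 0.6, 0.8]
y = A @ x_true                                         # interferogram samples

# ISTA (iterative soft thresholding) for the l1-regularized least-squares
# problem min_x 0.5*||A x - y||^2 + lam*||x||_1.
lam = 0.05
L = np.linalg.norm(A, 2) ** 2                          # Lipschitz constant
x = np.zeros(n_channels)
for _ in range(2000):
    z = x - A.T @ (A @ x - y) / L                      # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

print("recovered line positions:", np.nonzero(np.abs(x) > 0.1)[0])
```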
Sensors (Basel)
January 2025
School of Artificial Intelligence, Beijing University of Posts and Telecommunications (BUPT), Beijing 100876, China.
The advent of millimeter-wave (mmWave) massive multiple-input multiple-output (MIMO) systems, coupled with reconfigurable intelligent surfaces (RISs), presents a significant opportunity for advancing wireless communication technologies. This integration enhances data transmission rates and broadens coverage areas, but challenges in channel estimation (CE) remain due to the limitations of the signal processing capabilities of RIS. To address this, we propose an adaptive channel estimation framework comprising two algorithms: log-sum normalized least mean squares (Log-Sum NLMS) and hybrid normalized least mean squares-normalized least mean fourth (Hybrid NLMS-NLMF).
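For context, a plain NLMS update of the kind these algorithms build on looks roughly like the sketch below; the paper's Log-Sum NLMS and Hybrid NLMS-NLMF variants add sparsity-aware and error-adaptive terms that are not reproduced here, and all names and parameters are illustrative assumptions.

```python
# Illustrative sketch of NLMS-style adaptive channel estimation; not the
# paper's Log-Sum or Hybrid variants, just the baseline normalized update.
import numpy as np

rng = np.random.default_rng(1)

n_taps = 64                       # length of the (sparse) channel estimate
h_true = np.zeros(n_taps)         # sparse mmWave-like channel
h_true[[3, 17, 40]] = [0.9, -0.5, 0.3]

mu, eps = 0.5, 1e-6               # step size and normalization regularizer
h_hat = np.zeros(n_taps)
x_buf = np.zeros(n_taps)          # sliding window of pilot samples

for _ in range(5000):
    x_buf = np.roll(x_buf, 1)                           # new pilot enters
    x_buf[0] = rng.standard_normal()
    d = h_true @ x_buf + 0.01 * rng.standard_normal()   # noisy observation
    e = d - h_hat @ x_buf                               # a-priori error
    # NLMS update: gradient step normalized by the regressor energy
    h_hat += mu * e * x_buf / (x_buf @ x_buf + eps)

print("estimation MSE:", np.mean((h_hat - h_true) ** 2))
```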
Micromachines (Basel)
January 2025
Research Center for Novel Computing Sensing and Intelligent Processing, Zhejiang Lab, Hangzhou 311100, China.
General matrix multiplication (GEMM) in machine learning involves massive computation and data movement, which restricts its deployment on resource-constrained devices. Although data reuse can reduce data movement during GEMM processing, current approaches fail to fully exploit its potential. This work introduces a sparse GEMM accelerator with a weight-and-output stationary (WOS) dataflow and a distributed buffer architecture.
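The weight-and-output-stationary idea can be illustrated with a toy software model: a weight tile and its output accumulators stay resident while only the nonzero activations stream through. The sketch below is a loop-ordering illustration under assumed tile sizes, not the accelerator's actual microarchitecture. Keeping the accumulators resident until the tile is finished is what lets each output be written back exactly once, which is the data-movement saving the WOS dataflow targets.

```python
# Toy software model of a weight-and-output-stationary (WOS) dataflow for
# sparse GEMM, meant only to illustrate the loop ordering and data reuse;
# tile size, buffer layout, and the zero-skipping logic are assumptions.
import numpy as np

rng = np.random.default_rng(2)
M, K, N = 32, 64, 16
W = rng.standard_normal((M, K))                               # dense weights
A = rng.standard_normal((K, N)) * (rng.random((K, N)) < 0.1)  # ~90% zeros

TILE_M = 8
C = np.zeros((M, N))

for m0 in range(0, M, TILE_M):
    # A weight tile and its output accumulators stay "on chip" for the whole
    # inner loop; only nonzero activations are streamed in from the buffer.
    W_tile = W[m0:m0 + TILE_M, :]            # stationary weights
    acc = np.zeros((TILE_M, N))              # stationary output partial sums
    for n in range(N):
        for k in np.nonzero(A[:, n])[0]:     # skip zero activations entirely
            acc[:, n] += W_tile[:, k] * A[k, n]
    C[m0:m0 + TILE_M, :] = acc               # single write-back per tile

assert np.allclose(C, W @ A)
```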
Sci Rep
January 2025
Institute of Oceanography, Center for Earth System Sustainability, Universität Hamburg, Hamburg, Germany.
Oceanic subsurface observations are sparse and lead to large uncertainties in any model-based estimate. We investigate the applicability of transfer-learning-based neural networks for reconstructing North Atlantic temperatures during periods with sparse observations. Our network is trained on a time period with abundant observations to learn realistic physical behavior.
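A minimal version of this pretrain-then-fine-tune workflow is sketched below: a small regressor is trained where observations are abundant, its feature layers are frozen, and only the final layer is re-fit on the sparse target period. The network shape, synthetic data, and training settings are placeholders, not the study's configuration.

```python
# Transfer-learning sketch: pretrain on a data-rich period, then fine-tune
# only the output layer on sparse target data. All data here is synthetic.
import torch
from torch import nn

def make_net():
    return nn.Sequential(nn.Linear(16, 64), nn.ReLU(),
                         nn.Linear(64, 64), nn.ReLU(),
                         nn.Linear(64, 1))

def train(net, x, y, params, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.mse_loss(net(x), y).backward()
        opt.step()

# Source domain: abundant (predictor -> subsurface temperature) pairs.
x_src = torch.randn(5000, 16)
y_src = x_src[:, :1] * 2.0 + 0.1 * torch.randn(5000, 1)
# Target domain: only a handful of observed profiles.
x_tgt = torch.randn(50, 16)
y_tgt = x_tgt[:, :1] * 2.0 + 0.1 * torch.randn(50, 1)

net = make_net()
train(net, x_src, y_src, net.parameters())       # pretrain on the data-rich era

for p in net[:-1].parameters():                  # freeze the feature layers
    p.requires_grad_(False)
train(net, x_tgt, y_tgt, net[-1].parameters())   # fine-tune only the head
```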
ACS Environ Au
January 2025
Biological and Agricultural Engineering, North Carolina State University, Raleigh, North Carolina 27695, United States.
The U.S. Clean Water Act is believed to have driven widespread decreases in pollutants from point sources and developed areas, but has not substantially affected nutrient pollution from agriculture.
bioRxiv
January 2025
Center for Theoretical Neuroscience, Zuckerman Mind Brain Behavior Institute, Columbia University, New York, NY.
Storing complex correlated memories is significantly more efficient when memories are recoded to obtain compressed representations. Previous work has shown that compression can be implemented in a simple neural circuit, which can be described as a sparse autoencoder. The activity of the encoding units in these models recapitulates the activity of hippocampal neurons recorded in multiple experiments.
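A bare-bones sparse autoencoder of the kind alluded to here can be written in a few lines: an encoder maps inputs to a wide, mostly-zero code and a decoder reconstructs them, with an L1 penalty enforcing sparsity. Layer sizes, the penalty weight, and the random stand-in data below are assumptions, not the paper's model.

```python
# Minimal sparse autoencoder: reconstruction loss plus an L1 penalty on the
# code units. Sizes and data are illustrative placeholders only.
import torch
from torch import nn

class SparseAutoencoder(nn.Module):
    def __init__(self, n_in=128, n_code=512):
        super().__init__()
        self.encoder = nn.Linear(n_in, n_code)
        self.decoder = nn.Linear(n_code, n_in)

    def forward(self, x):
        code = torch.relu(self.encoder(x))   # non-negative, sparsifiable code
        return self.decoder(code), code

model = SparseAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 128)                    # stand-in for correlated memories

for _ in range(500):
    recon, code = model(x)
    # reconstruction error + L1 sparsity penalty on the code activations
    loss = nn.functional.mse_loss(recon, x) + 1e-3 * code.abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print("fraction of active code units:", (code > 0).float().mean().item())
```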