Vector-valued neural learning has recently emerged as a promising direction in deep learning. Traditionally, the training data for a neural network (NN) are formulated as a vector of scalars; however, the resulting performance may be suboptimal because associations among adjacent scalars are not modeled. In this article, we propose a new vector neural architecture called the Arbitrary BIlinear Product NN (ABIPNN), which processes information as vectors in each neuron and defines the feedforward projections using arbitrary bilinear products. Such bilinear products include circular convolution, the seven-dimensional vector product, skew circular convolution, reversed-time circular convolution, and other new products not seen in previous work. As a proof of concept, we apply the proposed network to multispectral image denoising and singing voice separation. Experimental results show that ABIPNN obtains substantial improvements over conventional NNs, suggesting that such associations are learned during training.
DOI: http://dx.doi.org/10.1109/TNNLS.2019.2933882
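For intuition only, the sketch below (not the authors' implementation; class and function names are illustrative) replaces the scalar multiply of an ordinary dense layer with circular convolution between d-dimensional input and weight vectors, one choice of bilinear product mentioned in the abstract.

```python
import numpy as np

def circ_conv(a, b):
    """Circular convolution along the last axis, computed via the FFT."""
    return np.fft.ifft(np.fft.fft(a, axis=-1) * np.fft.fft(b, axis=-1), axis=-1).real

class VectorNeuronLayer:
    """Dense layer whose neurons carry d-dimensional vectors instead of scalars.

    The scalar weight-times-input product of an ordinary layer is replaced by a
    bilinear product between vectors; circular convolution is used here, but any
    other bilinear product could be swapped in by changing circ_conv.
    """

    def __init__(self, n_in, n_out, d, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=1.0 / np.sqrt(n_in * d), size=(n_out, n_in, d))
        self.b = np.zeros((n_out, d))

    def forward(self, x):                                  # x: (batch, n_in, d)
        # out[k, o] = sum_i circ_conv(x[k, i], W[o, i]) + b[o]
        out = circ_conv(x[:, None, :, :], self.W[None]).sum(axis=2) + self.b
        return np.tanh(out)                                # elementwise nonlinearity

# Toy usage: 8 input neurons, 4 output neurons, 3-dimensional vector signals.
layer = VectorNeuronLayer(n_in=8, n_out=4, d=3)
y = layer.forward(np.random.default_rng(1).normal(size=(2, 8, 3)))
print(y.shape)                                             # (2, 4, 3)
```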
Phys Chem Chem Phys
November 2024
Interdisciplinary Nanoscience Center (iNANO) and Department of Chemistry, Aarhus University, Gustav Wieds Vej 14, DK-8000 Aarhus C, Denmark.
Dynamic nuclear polarization (DNP) has proven to be a powerful technique for enhancing nuclear spin polarization by transferring the much higher electron spin polarization to nuclear spins prior to detection. While most attention has been devoted to high-field applications with continuous-wave microwave irradiation, the introduction of fast arbitrary waveform generators is steadily expanding the opportunities for pulsed DNP. Here, we describe how static-powder DNP pulse sequences may be systematically designed using single-spin vector effective Hamiltonian theory.
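As a toy illustration of the single-spin vector picture (a sketch under idealized assumptions only, with delta pulses and no offset or coupling terms, not the sequences designed in this paper), the net effect of a pulse train on a single spin-1/2 is one rotation whose angle and axis, and hence the direction of the effective Hamiltonian, can be read off from the total propagator. Function names are illustrative.

```python
import numpy as np

# Spin-1/2 operators (Pauli matrices / 2).
SX = np.array([[0, 1], [1, 0]], dtype=complex) / 2
SY = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
SZ = np.array([[1, 0], [0, -1]], dtype=complex) / 2

def pulse(flip, phase):
    """Propagator of an ideal pulse: exp(-i * flip * (cos(phase)*Sx + sin(phase)*Sy))."""
    h = np.cos(phase) * SX + np.sin(phase) * SY
    return np.cos(flip / 2) * np.eye(2) - 2j * np.sin(flip / 2) * h

def effective_rotation(pulses):
    """Net rotation angle and axis of a pulse train acting on a single spin-1/2.

    Over the sequence duration tau the effective Hamiltonian is
    (angle / tau) * (axis . S), i.e. the 'effective field' vector.
    """
    u = np.eye(2, dtype=complex)
    for flip, phase in pulses:
        u = pulse(flip, phase) @ u                        # later pulses act from the left
    angle = 2 * np.arccos(np.clip(np.trace(u).real / 2, -1.0, 1.0))
    axis = np.array([-np.trace(u @ s).imag for s in (SX, SY, SZ)])
    norm = np.linalg.norm(axis)
    return angle, (axis / norm if norm > 1e-12 else axis)

# Composite pulse 90x - 180y - 90x: equivalent to a single 180-degree rotation about y.
angle, axis = effective_rotation([(np.pi / 2, 0), (np.pi, np.pi / 2), (np.pi / 2, 0)])
print(np.degrees(angle), np.round(axis, 3))               # ~180.0, [0. 1. 0.]
```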
Sensors (Basel)
September 2024
School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430073, China.
To accurately estimate the 6D pose of objects, most methods employ a two-stage algorithm. While such two-stage algorithms achieve high accuracy, they are often slow. Additionally, many approaches use an encoder-decoder structure to obtain the 6D pose, often relying on bilinear sampling in the decoding stage.
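For context, bilinear sampling reads a feature map at continuous coordinates by interpolating the four nearest grid values. A minimal NumPy version is sketched below as an illustration of the operation, not of this paper's method; the function name is illustrative.

```python
import numpy as np

def bilinear_sample(feat, x, y):
    """Sample a (H, W, C) feature map at continuous pixel coordinates (x, y)."""
    h, w = feat.shape[:2]
    x = float(np.clip(x, 0, w - 1))
    y = float(np.clip(y, 0, h - 1))
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    wx, wy = x - x0, y - y0                              # fractional offsets in [0, 1)
    top = (1 - wx) * feat[y0, x0] + wx * feat[y0, x1]
    bottom = (1 - wx) * feat[y1, x0] + wx * feat[y1, x1]
    return (1 - wy) * top + wy * bottom

feat = np.arange(16, dtype=float).reshape(4, 4, 1)       # feat[y, x] = 4*y + x
print(bilinear_sample(feat, 1.5, 2.25))                  # -> [10.5], exact for a linear map
```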
Sci Rep
May 2024
College of Computer, Jiangxi University of Chinese Medicine, Jiangxi, 330004, China.
(3 + 1)-dimensional Painlevé-integrable equations are a class of nonlinear differential equations with special properties that play an important role in nonlinear science and underlie many important models in fields such as quantum mechanics, statistical physics, nonlinear optics, and celestial mechanics. In this work, we use the Hirota bilinear form and Mathematica to formally obtain interaction solutions among lump, solitary, and periodic waves, which have not yet appeared elsewhere in the literature. Additionally, using the -expansion method, we provide a rich set of exact solutions for the (3 + 1)-dimensional Painlevé integrable equation, which includes two functions with arbitrary values.
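For reference, the Hirota bilinear derivative used to construct such bilinear forms has the standard definition below; the specific bilinear form of the (3 + 1)-dimensional equation studied here is not given in the abstract, so only the general operator and a textbook example are shown.

```latex
\[
  D_x^m D_t^n\, f\cdot g
  = (\partial_x - \partial_{x'})^{m}\,(\partial_t - \partial_{t'})^{n}\,
    f(x,t)\, g(x',t') \big|_{x'=x,\; t'=t}
\]
% Standard example: the KdV equation $u_t + 6uu_x + u_{xxx} = 0$ with
% $u = 2(\ln f)_{xx}$ takes the bilinear form $(D_x D_t + D_x^4)\, f\cdot f = 0$.
```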
Sci Rep
September 2023
Faculty of Sciences, UNESP-São Paulo State University, 17033-360, Bauru, SP, Brazil.
Machine learning has transformed science and technology. In this article, we present a model-independent classifier that uses the k-nearest neighbors algorithm to classify the phases of a model on which it has never been trained. This is done by studying three different spin-1 chains that share some common phases: the XXZ chain with uniaxial single-ion-type anisotropy, the bond-alternating XXZ chain, and the bilinear-biquadratic chain.
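A minimal sketch of this kind of transfer setup, using scikit-learn's k-nearest-neighbors classifier; the features and labels are placeholders, since the abstract does not specify the input representation used for the spin-1 chains.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Placeholder data standing in for features computed from one spin-1 chain
# ("model A") with known phase labels, and for a second chain ("model B")
# on which the classifier was never trained.
X_train = rng.normal(size=(200, 10))       # hypothetical feature vectors, model A
y_train = rng.integers(0, 3, size=200)     # hypothetical phase labels (3 phases)
X_other = rng.normal(size=(50, 10))        # feature vectors from model B

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
print(clf.predict(X_other)[:10])           # phases assigned to the unseen model
```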
Characterizing the asymptotic distributions of eigenvectors for large random matrices poses important challenges yet can provide useful insights into a range of statistical applications. To this end, in this paper we introduce a general framework of asymptotic theory of eigenvectors (ATE) for large spiked random matrices with diverging spikes and heterogeneous variances, and establish the asymptotic properties of the spiked eigenvectors and eigenvalues for the scenario of generalized Wigner matrix noise. Under some mild regularity conditions, we provide asymptotic expansions for the spiked eigenvalues and show that they are asymptotically normal after some normalization.
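As a numerical illustration of the spiked-Wigner picture (under simplified assumptions: a single rank-one spike with homogeneous N(0, 1/n) noise, not the heterogeneous-variance setting of this paper), the sketch below compares the top eigenvalue and eigenvector overlap of a deformed Wigner matrix with the classical first-order predictions theta + 1/theta and 1 - 1/theta^2.

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta = 2000, 3.0

# Rank-one spiked Wigner matrix: theta * v v^T plus symmetric Gaussian noise
# whose off-diagonal entries have variance 1/n (bulk spectrum roughly in [-2, 2]).
v = rng.normal(size=n)
v /= np.linalg.norm(v)
G = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
W = (G + G.T) / np.sqrt(2)
M = theta * np.outer(v, v) + W

vals, vecs = np.linalg.eigh(M)
print(vals[-1], theta + 1 / theta)                        # spiked eigenvalue vs. prediction
print(np.dot(vecs[:, -1], v) ** 2, 1 - 1 / theta ** 2)    # eigenvector overlap vs. prediction
```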