Multiple Discrimination and Pairwise CNN for view-based 3D object retrieval.

Neural Networks

School of Information and Safety Engineering, Zhongnan University of Economics and Law, Wuhan 430073, PR China.

Published: May 2020

With the rapid development and wide application of computers, camera devices, networks, and hardware technology, 3D object (or model) retrieval has attracted widespread attention and has become a hot research topic in computer vision. Deep learning features have been shown to outperform hand-crafted features in 3D object retrieval. However, most existing networks do not take into account the impact of multi-view image selection on network training, and using contrastive loss alone only forces same-class samples to be as close as possible. In this work, a novel solution named Multi-view Discrimination and Pairwise CNN (MDPCNN) for 3D object retrieval is proposed to tackle these issues. It can simultaneously input multiple batches and multiple views by adding Slice and Concat layers. Furthermore, a highly discriminative network is obtained by training on samples that are hard to classify, selected via clustering. Lastly, we deploy the contrastive-center loss together with the contrastive loss as the optimization objective, yielding better intra-class compactness and inter-class separability. Large-scale experiments show that the proposed MDPCNN achieves significant performance gains over state-of-the-art algorithms in 3D object retrieval.
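The abstract does not spell out the loss formulations, but both losses are well documented in the literature. Below is a minimal PyTorch sketch, assuming the standard contrastive loss of Hadsell et al. (2006) and the contrastive-center loss of Qi and Su (2017); the class names, the margin, and the delta smoothing term are illustrative assumptions, not the authors' code.

```python
# Sketch of the two losses named in the abstract, under the standard
# definitions from the literature; margin/delta values are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveLoss(nn.Module):
    """Pulls same-class pairs together; pushes different-class pairs
    apart until they are at least `margin` away."""
    def __init__(self, margin=1.0):
        super().__init__()
        self.margin = margin

    def forward(self, f1, f2, same):
        # same: float tensor of 1s (same-class pair) and 0s (different class)
        d = F.pairwise_distance(f1, f2)
        pos = same * d.pow(2)
        neg = (1 - same) * F.relu(self.margin - d).pow(2)
        return (pos + neg).mean()

class ContrastiveCenterLoss(nn.Module):
    """Ratio of the distance to a sample's own class center over the
    summed distances to all other centers, which encourages both
    intra-class compactness and inter-class separability."""
    def __init__(self, num_classes, feat_dim, delta=1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.delta = delta  # prevents division by zero

    def forward(self, feats, labels):
        d2 = torch.cdist(feats, self.centers).pow(2)        # (B, C)
        own = d2.gather(1, labels.unsqueeze(1)).squeeze(1)  # (B,)
        others = d2.sum(dim=1) - own
        return 0.5 * (own / (others + self.delta)).mean()
```

In training, the two terms would be combined into a single objective, e.g. `loss = contrastive + lam * contrastive_center`; the weighting `lam` is a tuning assumption here, as the abstract does not state how the paper balances the two losses.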

DOI: http://dx.doi.org/10.1016/j.neunet.2020.02.017

Publication Analysis

Top Keywords (frequency)

object retrieval: 16
discrimination pairwise: 8
pairwise cnn: 8
network training: 8
contrastive loss: 8
retrieval: 6
object: 5
multiple discrimination: 4
cnn view-based: 4
view-based object: 4

