This paper presents a method for selecting salient 2D views to describe 3D objects for the purpose of retrieval. The views are obtained by first identifying salient points via a learning approach that uses shape characteristics of the 3D points (Atmosukarto and Shapiro in International workshop on structural, syntactic, and statistical pattern recognition, 2008; Atmosukarto and Shapiro in ACM multimedia information retrieval, 2008). The salient views are then selected by choosing views with multiple salient points on the silhouette of the object. Silhouette-based similarity measures from Chen et al. (Comput Graph Forum 22(3):223-232, 2003) are used to calculate the similarity between two 3D objects. Retrieval experiments were performed on three datasets: the Heads dataset, the SHREC2008 dataset, and the Princeton dataset. Experimental results show that retrieval using the salient views is comparable to the existing light field descriptor method (Chen et al. in Comput Graph Forum 22(3):223-232, 2003), while achieving a 15-fold speedup in feature extraction computation time.
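Purely as an illustration of the view-selection idea sketched in the abstract, the following minimal Python example ranks candidate viewing directions by how many salient points would land on the object's silhouette and keeps the top few. The function name, the parameters, and the normal-based silhouette test (treating a point as a silhouette point when its surface normal is nearly perpendicular to the view direction) are assumptions of this sketch, not the authors' implementation, which learns salient points from 3D shape characteristics and compares rendered silhouettes with the measures of Chen et al. (2003).

```python
import numpy as np

def select_salient_views(normals, salient_mask, view_dirs, k=5, tol=0.05):
    """Score each candidate view by the number of salient points that lie on the
    silhouette from that view, and return the k best-scoring views.

    NOTE: hypothetical sketch, not the method's actual implementation.

    normals      : (N, 3) unit surface normals of the 3D points
    salient_mask : (N,) boolean mask of points previously classified as salient
    view_dirs    : (V, 3) unit candidate viewing directions
    k            : number of salient views to keep
    tol          : a point counts as a silhouette point when |normal . view| < tol
    """
    # |n . v| close to 0 means the surface is seen edge-on from v, i.e. the point
    # projects onto (or very near) the object's silhouette in that view.
    edge_on = np.abs(normals @ view_dirs.T) < tol            # (N, V) boolean
    scores = (edge_on & salient_mask[:, None]).sum(axis=0)   # salient silhouette points per view
    best = np.argsort(scores)[::-1][:k]
    return view_dirs[best], scores[best]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-ins: random unit normals, ~10% of points marked salient,
    # and 60 candidate view directions sampled on the unit sphere.
    normals = rng.normal(size=(2000, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    salient = rng.random(2000) < 0.1
    views = rng.normal(size=(60, 3))
    views /= np.linalg.norm(views, axis=1, keepdims=True)
    top_views, top_scores = select_salient_views(normals, salient, views, k=5)
    print(top_views, top_scores)
```

In the paper itself, saliency comes from a learned classifier over shape characteristics of the 3D points, and object similarity is computed between the selected salient views using the silhouette-based light field measures of Chen et al. (2003); the sketch above only shows one plausible way candidate views could be ranked once salient points are known.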

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3702181
DOI: http://dx.doi.org/10.1007/s13735-012-0015-3

Similar Publications

Background: This study investigated the factor structure of the Parenting Sense of Competence (PSoC), a measure of parenting self-efficacy, in a sample of parents recruited when their infants were under 2 months old. Due to the lack of longitudinal analysis of the PSoC's factor structure over time, the study sought to establish whether the published two-factor structure was consistent over an 18-month period.

Methods: Data collected from 536 parents who had participated in a randomised controlled trial of universal proportionate parenting support, delivered in five sites in England, were subjected to confirmatory factor analysis (CFA).

Background: Self-management support is widely considered a critical aspect of nursing. Still, many studies indicate that nurses frequently experience difficulties in daily practice.

Objective: To gain a deeper understanding of the factors perceived by nurses to impede or promote their support of patients' self-management within the dynamic environment of the in-patient hospital setting.

Background: Widespread antibiotic prescribing contributes to globally emerging antimicrobial resistance (AMR). Despite stewardship recommendations by the Infectious Diseases Society of America, there is a lack of literature identifying barriers and facilitators to antimicrobial stewardship programs (ASPs) in United States (U.S.

Salient emotional visual cues receive prioritized processing in human visual cortex. To what extent emotional facilitation relies on preattentional stimulus processing preceding semantic analysis remains controversial. Making use of steady-state visual evoked potentials frequency-tagged to meaningful complex emotional scenes and their scrambled versions, presented in a 4-Hz rapid serial visual presentation fashion, the current study tested the temporal dynamics of semantic and emotional cue processing.

Excessive salt or sodium intake is strongly linked to increased blood pressure, which is a major risk factor for cardiovascular diseases. This study aimed to qualitatively explore the views of key stakeholders on salt intake reduction and barriers and facilitators to reducing salt intake in Malaysian schools. The stakeholders in this study were school administrators, food operators, and consumers.
