Multimodal analytics in Big Data architectures implies complex configurations of data processing tasks: each data modality requires specific analytics, which in turn triggers specific processing tasks. Scalability can be achieved only through careful calibration of the resources shared by the different tasks, seeking a trade-off among the multiple requirements they impose. We propose a methodology that addresses multimodal analytics within a single data processing approach, yielding a simplified architecture that fully exploits the parallel processing capabilities of Big Data infrastructures. Multiple data sources are first integrated into a unified knowledge graph (KG). The different data modalities are then handled by specifying views on the KG and rewriting the graph so that it contains only the data to be processed, which accelerates graph traversal and rule extraction. Using graph embedding methods, each view is transformed into a low-dimensional representation that follows the same data format. A single machine learning procedure can thus address all modalities, simplifying the architecture of our system. Our experiments show that this approach reduces execution cost and improves the accuracy of the analytics.
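The pipeline described above (integrate sources into a KG, derive per-modality views, embed each view into a common vector format) can be sketched as follows. This is a minimal illustration with hypothetical data and a deliberately trivial degree-based embedding; the paper's actual KG model and graph embedding method are not specified here.

```python
# Sketch of the KG -> view -> embedding pipeline (hypothetical example data;
# the embedding below is a toy degree-count vector, not the paper's method).
from collections import defaultdict

# 1) Integrate multiple sources into one knowledge graph, stored as triples.
triples = [
    ("patient_1", "hasImage", "scan_42"),   # image modality
    ("patient_1", "hasNote", "note_7"),     # text modality
    ("scan_42", "shows", "lesion"),
    ("note_7", "mentions", "lesion"),
]

# 2) A "view" rewrites the graph, keeping only triples relevant to one modality.
def view(triples, predicates):
    return [t for t in triples if t[1] in predicates]

image_view = view(triples, {"hasImage", "shows"})

# 3) Toy embedding: map each node to a vector of per-predicate
#    outgoing/incoming edge counts, so every modality ends up in the
#    same low-dimensional format, ready for a single ML procedure.
def embed(triples, predicates):
    preds = sorted(predicates)
    vecs = defaultdict(lambda: [0.0] * (2 * len(preds)))
    for s, p, o in triples:
        i = preds.index(p)
        vecs[s][i] += 1.0                # outgoing edges labeled p
        vecs[o][len(preds) + i] += 1.0   # incoming edges labeled p
    return dict(vecs)

emb = embed(image_view, {"hasImage", "shows"})
# emb["patient_1"] -> [1.0, 0.0, 0.0, 0.0]
```

Because every view produces vectors with the same layout, downstream models need no modality-specific handling, which is the architectural simplification the abstract claims.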

Source DOI: http://dx.doi.org/10.1089/big.2021.0326

