Recent advances in mobile devices, data analysis, and wearable sensors have made in-place health monitoring feasible. Supervised machine learning algorithms, the core intelligence of these systems, learn from labeled training data. However, labeling vast amounts of data is time-consuming and expensive. Moreover, sensor data often contain personal information that a user may not be comfortable sharing. There is therefore a strong need for methods that generate realistic labeled sensor data. In this paper, we propose a supervised generative adversarial network architecture whose generator learns from the feedback of both a discriminator and a classifier in order to create synthetic sensor data. We demonstrate the effectiveness of the architecture on a publicly available human activity dataset and show that the generator learns to output diverse samples that are similar, but not identical, to the training data.
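The abstract describes a generator that receives two feedback signals: an adversarial signal from a discriminator (real vs. synthetic) and a label signal from a classifier. The PyTorch sketch below is a rough illustration only, assuming windowed multi-channel sensor data; the layer shapes, equal loss weighting, and hyperparameters are assumptions for illustration and are not the authors' implementation. Only the generator update is shown; in a full training loop the discriminator and classifier would be updated on real labeled windows in alternating steps.

# Hypothetical sketch of a label-conditioned GAN whose generator receives
# feedback from both a discriminator and an activity classifier.
# Shapes, losses, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

NOISE_DIM, SEQ_LEN, N_CHANNELS, N_CLASSES = 64, 128, 3, 6  # assumed sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + N_CLASSES, 256), nn.ReLU(),
            nn.Linear(256, SEQ_LEN * N_CHANNELS), nn.Tanh(),
        )
    def forward(self, z, y_onehot):
        x = self.net(torch.cat([z, y_onehot], dim=1))
        return x.view(-1, N_CHANNELS, SEQ_LEN)  # synthetic sensor window

class Discriminator(nn.Module):  # scores a window as real vs. synthetic
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(SEQ_LEN * N_CHANNELS, 256),
            nn.LeakyReLU(0.2), nn.Linear(256, 1),
        )
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):  # predicts the activity label of a window
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(SEQ_LEN * N_CHANNELS, 256),
            nn.ReLU(), nn.Linear(256, N_CLASSES),
        )
    def forward(self, x):
        return self.net(x)

def generator_step(G, D, C, opt_g, batch_size, device="cpu"):
    """One generator update combining adversarial and classification feedback."""
    bce = nn.BCEWithLogitsLoss()
    ce = nn.CrossEntropyLoss()
    z = torch.randn(batch_size, NOISE_DIM, device=device)
    y = torch.randint(0, N_CLASSES, (batch_size,), device=device)
    y_onehot = nn.functional.one_hot(y, N_CLASSES).float()
    fake = G(z, y_onehot)
    # Adversarial feedback: push the discriminator to score fakes as real.
    adv_loss = bce(D(fake), torch.ones(batch_size, 1, device=device))
    # Label feedback: push the classifier to recover the intended activity label.
    cls_loss = ce(C(fake), y)
    loss = adv_loss + cls_loss  # equal weighting is an assumption
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()
    return loss.item()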


Source: http://dx.doi.org/10.1109/EMBC.2018.8512470

Publication Analysis

Top Keywords (count)
sensor data (16), synthetic sensor (8), data (8), training data (8), data generation (4), generation health (4), health applications (4), applications supervised (4), supervised deep (4), deep learning (4)

