Low-Pass Image Filtering to Achieve Adversarial Robustness.

Sensors (Basel)

Science and Research Department, Moscow Technical University of Communications and Informatics, 111024 Moscow, Russia.

Published: November 2023

In this paper, we continue our research on the properties of convolutional neural network (CNN)-based image recognition systems and on ways to improve their noise immunity and robustness. Adversarial attacks are currently a popular research area related to artificial neural networks. Adversarial perturbations of an image are barely perceptible to the human eye, yet they drastically reduce a neural network's accuracy. Image perception by a machine depends heavily on the propagation of high-frequency distortions through the network, whereas a human efficiently ignores high-frequency distortions and perceives the shapes of objects as a whole. We propose a technique that reduces the influence of high-frequency noise on CNNs. We show that low-pass image filtering can improve image recognition accuracy in the presence of high-frequency distortions, in particular those caused by adversarial attacks. The technique is resource-efficient and easy to implement. It also brings the logic of an artificial neural network closer to that of a human, for whom high-frequency distortions are not decisive in object recognition.
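For illustration, a minimal sketch of the general idea in Python: suppress high-frequency content in the input image before classification. The abstract does not specify an implementation, so the filter choice (a Gaussian blur via scipy.ndimage), the cutoff parameter sigma, and the classifier `model` are assumptions, not the authors' exact method.

```python
# Minimal sketch: low-pass pre-filtering as an inference-time defense.
# Assumptions (not from the paper): Gaussian blur as the low-pass filter,
# sigma as the cutoff parameter, HxWxC float image arrays.
import numpy as np
from scipy.ndimage import gaussian_filter

def low_pass(image: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Attenuate high spatial frequencies; leave the channel axis intact."""
    # One sigma per axis: blur height and width, no blur across channels.
    return gaussian_filter(image, sigma=(sigma, sigma, 0))

# Hypothetical usage with any classifier `model` taking HxWxC inputs:
# logits = model(low_pass(adversarial_image, sigma=1.5))
```

Larger sigma values remove more of the high-frequency perturbation but also blur legitimate detail, so the cutoff trades robustness against clean-image accuracy.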

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10675189
DOI: http://dx.doi.org/10.3390/s23229032

Publication Analysis

Top Keywords

adversarial attacks: 12
high-frequency distortions: 12
low-pass image: 8
image filtering: 8
image recognition: 8
artificial neural: 8
image: 5
filtering achieve: 4
adversarial: 4
achieve adversarial: 4
