Unpaired mesh-to-image translation for 3D fluorescent microscopy images of neurons.

Med Image Anal

Department of Engineering Science, University of Oxford, Oxford OX3 7DQ, UK.

Published: May 2023

While Generative Adversarial Networks (GANs) can now reliably produce realistic images across a multitude of imaging domains, they are ill-equipped to model the thin, stochastic textures present in many large 3D fluorescent microscopy (FM) images acquired in biological research. This is especially problematic in neuroscience, where the lack of ground truth data impedes the development of automated image analysis algorithms for neurons and neural populations. We therefore propose an unpaired mesh-to-image translation methodology for generating volumetric FM images of neurons with paired ground truths. We start by learning distinct FM styles efficiently through a Gramian-based discriminator. We then stylize 3D voxelized meshes of previously reconstructed neurons by successively generating slices. As a result, we effectively create a synthetic microscope that can acquire realistic FM images of neurons with control over the image content and imaging configuration. We demonstrate the feasibility of our architecture and its superior performance compared with state-of-the-art image translation architectures through a variety of texture-based metrics, unsupervised segmentation accuracy, and an expert opinion test. In this study, we use two synthetic FM datasets and two newly acquired FM datasets of retinal neurons.
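The Gramian-based discriminator mentioned in the abstract builds on the Gram-matrix style statistic familiar from the neural style transfer literature: channel-wise feature correlations that capture texture independently of spatial layout. The sketch below is a minimal PyTorch illustration of that statistic, assuming feature maps from some 3D convolutional backbone; the function names and the way a discriminator would consume the Gram matrices here are assumptions for illustration, not the published architecture.

```python
# Minimal sketch of a Gram-matrix ("Gramian") style statistic, as used in
# style-transfer work; the paper's discriminator details may differ.
import torch

def gram_matrix(feats: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a batch of feature maps.

    feats: (B, C, D, H, W) activations from a 3D conv backbone (assumed).
    Returns: (B, C, C) Gram matrices, normalized by the number of spatial
    positions so the statistic is resolution-independent.
    """
    b, c = feats.shape[:2]
    f = feats.reshape(b, c, -1)              # flatten spatial dims -> (B, C, N)
    gram = torch.bmm(f, f.transpose(1, 2))   # (B, C, C) channel correlations
    return gram / f.shape[-1]

def style_distance(real_feats: torch.Tensor, fake_feats: torch.Tensor) -> torch.Tensor:
    """Frobenius-style distance between Gram matrices; a discriminator built
    on this statistic compares texture rather than spatial content."""
    return (gram_matrix(real_feats) - gram_matrix(fake_feats)).pow(2).mean()
```

Because the Gram matrix discards spatial arrangement, a discriminator scoring this statistic penalizes mismatched texture distributions rather than mismatched content, which suits the thin, stochastic structures of fluorescent neurites.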
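The slice-wise stylization step can likewise be sketched at a high level: walk through the voxelized mesh one axial slice at a time and let a learned generator texture each slice, conditioning on the previously generated slice so textures remain coherent along the imaging axis. The stylize_slice callable and the previous-slice conditioning below are assumed interfaces for illustration, not the paper's exact generator.

```python
# Sketch of slice-by-slice stylization of a voxelized mesh into an FM-style
# volume. The generator interface (stylize_slice) is a hypothetical stand-in.
import numpy as np

def stylize_volume(voxels: np.ndarray, stylize_slice) -> np.ndarray:
    """Translate a binary voxel grid (D, H, W) into a textured volume by
    generating axial slices in sequence, each conditioned on the previously
    generated slice to keep texture coherent along the axis."""
    depth = voxels.shape[0]
    out = np.zeros_like(voxels, dtype=np.float32)
    prev = np.zeros(voxels.shape[1:], dtype=np.float32)  # blank context for slice 0
    for z in range(depth):
        # stylize_slice: (mesh_slice, previous_output_slice) -> textured slice
        out[z] = stylize_slice(voxels[z].astype(np.float32), prev)
        prev = out[z]
    return out
```

In this framing the voxel grid plays the role of the sample and the generator plays the role of the microscope's optics and noise, which is what makes each output image and its source mesh a usable image/ground-truth pair.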

Source
http://dx.doi.org/10.1016/j.media.2023.102768

Publication Analysis

Top Keywords

images neurons: 12
unpaired mesh-to-image: 8
mesh-to-image translation: 8
fluorescent microscopy: 8
microscopy images: 8
realistic images: 8
neurons: 6
images: 5
translation fluorescent: 4
neurons generative: 4
