Tracking many cells in time-lapse 3D image sequences is an important and challenging task in bioimage informatics. Motivated by a study of brain-wide 4D imaging of neural activity in C. elegans, we present a new method for multi-cell tracking. The data to which the method applies are characterized as follows: (i) cells are imaged as globular-like objects, (ii) cells are difficult to distinguish on the basis of shape and size alone, (iii) the number of imaged cells is in the several-hundred range, (iv) the movements of nearby cells are strongly correlated, and (v) cells do not divide. We developed a tracking software suite that we call SPF-CellTracker. Incorporating the dependency among cells' movements into the prediction model is the key to reducing tracking errors, namely the switching of cells and the coalescence of tracked positions. We model the target cells' correlated movements as a Markov random field and derive a fast computation algorithm, which we call the spatial particle filter. On live-imaging data in which approximately 120 nuclei of C. elegans neurons were imaged, the proposed method demonstrated improved accuracy compared with the standard particle filter and the method developed by Tokunaga et al. (2014).
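To make the comparison concrete, the following is a minimal sketch of the standard (bootstrap) particle filter baseline that the abstract mentions, applied to a single globular object in a sequence of 3D frames. It assumes a Gaussian random-walk motion model and a crude intensity-based likelihood; the function name, parameters, and likelihood are illustrative assumptions and are not taken from SPF-CellTracker. The spatial particle filter described in the paper additionally couples the motion of neighboring cells through the Markov random field, which this sketch does not attempt to reproduce.

```python
# Minimal sketch of a standard (bootstrap) particle filter for tracking one
# globular object through 3D frames -- the baseline mentioned in the abstract,
# NOT the spatial particle filter itself. Names and parameters are assumptions.
import numpy as np

def track_cell(frames, init_pos, n_particles=1000, motion_sigma=2.0):
    """Track one cell through a sequence of 3D frames (z, y, x arrays)."""
    rng = np.random.default_rng(0)
    # Initialize particles around the user-supplied starting position.
    particles = np.asarray(init_pos, dtype=float) + rng.normal(
        0.0, motion_sigma, (n_particles, 3))
    track = [np.asarray(init_pos, dtype=float)]
    for frame in frames:
        # Predict: Gaussian random-walk motion model.
        particles += rng.normal(0.0, motion_sigma, particles.shape)
        # Clip to the image volume so voxel indexing stays valid.
        upper = np.array(frame.shape) - 1
        particles = np.clip(particles, 0, upper)
        # Weight: voxel intensity at each particle as a crude likelihood
        # (bright nucleus against a dark background).
        idx = np.round(particles).astype(int)
        weights = frame[idx[:, 0], idx[:, 1], idx[:, 2]].astype(float) + 1e-12
        weights /= weights.sum()
        # Estimate: weighted mean of the particle cloud.
        track.append(weights @ particles)
        # Resample (multinomial) to concentrate particles on likely positions.
        keep = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[keep]
    return np.array(track)
```

In this baseline each cell would be tracked independently, which is exactly where the switching and coalescence errors arise; conditioning each cell's predicted motion on its neighbors, as the paper's Markov random field model does, is what the spatial particle filter adds.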
DOI: http://dx.doi.org/10.1109/TCBB.2017.2782255