Enhanced Cell Tracking Using a GAN-Based Super-Resolution Video-to-Video Time-Lapse Microscopy Generative Model
Journal
bioRxiv : the preprint server for biology
Short title: bioRxiv
Country: United States
NLM ID: 101680187
Publication information
Publication date:
14 Jun 2024
History:
medline: 25 Jun 2024
pubmed: 25 Jun 2024
entrez: 25 Jun 2024
Status:
epublish
Abstract
Cells are among the most dynamic biological entities, constantly undergoing processes such as growth, division, movement, and interaction with other cells and their environment. Time-lapse microscopy is central to capturing these dynamic behaviors, providing the detailed temporal and spatial information that allows biologists to observe and analyze cellular activities in real time. The analysis of time-lapse microscopy data relies on two fundamental tasks: cell segmentation and cell tracking. Integrating deep learning into bioimage analysis has revolutionized cell segmentation, producing models with high precision across a wide range of biological images. However, developing generalizable deep-learning models for tracking cells over time remains challenging due to the scarcity of large, diverse, annotated datasets of time-lapse movies of cells. To address this bottleneck, we propose a GAN-based time-lapse microscopy generator, termed tGAN, designed to significantly enhance the quality and diversity of synthetic annotated time-lapse microscopy data. Our model features a dual-resolution architecture that synthesizes both low- and high-resolution images, capturing the intricate dynamics of cellular processes essential for accurate tracking. We demonstrate the performance of tGAN in generating high-quality, realistic, annotated time-lapse videos. Our findings indicate that tGAN decreases the dependency on extensive manual annotation needed to enhance the precision of cell-tracking models for time-lapse microscopy.
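The abstract does not disclose tGAN's implementation details, so the following is a minimal, hypothetical PyTorch sketch of how a dual-resolution time-lapse generator of this general kind could be organized: a coarse branch synthesizes low-resolution frames autoregressively from a latent code and the previous frame, and a super-resolution head refines each frame to high resolution. All module names (CoarseGenerator, RefineHead, tGANGenerator) and hyperparameters are illustrative assumptions, not the authors' code.

# Illustrative sketch only; not the tGAN implementation from the paper.
import torch
import torch.nn as nn

class CoarseGenerator(nn.Module):
    """Maps (previous frame, latent code) to a low-resolution next frame."""
    def __init__(self, in_ch=1, z_dim=64, base=32):
        super().__init__()
        self.z_dim = z_dim
        self.net = nn.Sequential(
            nn.Conv2d(in_ch + z_dim, base, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, base, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, in_ch, 3, padding=1),
            nn.Tanh(),
        )

    def forward(self, prev_frame, z):
        # Broadcast the latent code across spatial dimensions, then concatenate.
        b, _, h, w = prev_frame.shape
        z_map = z.view(b, self.z_dim, 1, 1).expand(b, self.z_dim, h, w)
        return self.net(torch.cat([prev_frame, z_map], dim=1))

class RefineHead(nn.Module):
    """Upsamples a low-resolution frame to high resolution (2x here)."""
    def __init__(self, in_ch=1, base=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, base * 4, 3, padding=1),
            nn.PixelShuffle(2),              # 2x spatial upsampling
            nn.ReLU(inplace=True),
            nn.Conv2d(base, in_ch, 3, padding=1),
            nn.Tanh(),
        )

    def forward(self, lowres):
        return self.net(lowres)

class tGANGenerator(nn.Module):
    """Rolls out a synthetic time-lapse clip frame by frame."""
    def __init__(self):
        super().__init__()
        self.coarse = CoarseGenerator()
        self.refine = RefineHead()

    def forward(self, first_frame, num_frames=8, z_dim=64):
        frames_lo, frames_hi = [], []
        prev = first_frame
        for _ in range(num_frames):
            z = torch.randn(first_frame.size(0), z_dim, device=first_frame.device)
            lo = self.coarse(prev, z)        # low-resolution dynamics
            hi = self.refine(lo)             # high-resolution detail
            frames_lo.append(lo)
            frames_hi.append(hi)
            prev = lo                        # condition the next step on this frame
        return torch.stack(frames_lo, 1), torch.stack(frames_hi, 1)

# Usage: generate an 8-frame clip from a 64x64 seed frame.
gen = tGANGenerator()
seed = torch.randn(1, 1, 64, 64)
lo_clip, hi_clip = gen(seed)                 # shapes: (1, 8, 1, 64, 64) / (1, 8, 1, 128, 128)

In a sketch like this, conditioning each step on the previous low-resolution frame keeps the rollout temporally coherent, while the refinement head supplies the high-frequency detail that downstream cell-tracking models rely on.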
Identifiers
pubmed: 38915545
doi: 10.1101/2024.06.11.598572
pmc: PMC11195160
Publication types
Journal Article
Preprint
Languages
eng