
SpatioPath Project Overview

Welcome to the SpatioPath project, a comprehensive suite of tools designed for spatial statistical analysis and cell detection in biomedical images. This project consists of two main subprojects:

  1. SpatioPath
  2. CellDetection

Below you will find an overview of each subproject, along with instructions for installation, usage, and contributing.

Table of Contents

  • Installation
  • SpatioPath
  • CellDetection
  • Project Structure
  • Acknowledgments

Installation

To install the necessary dependencies, create a separate Python environment for each subproject and install from the requirements.txt file provided in that subproject's directory.

SpatioPath Installation

cd SpatioPath
python -m venv spatiopath-env
source spatiopath-env/bin/activate  # On Windows use `spatiopath-env\Scripts\activate`
pip install -r requirements.txt

CellDetection Installation

cd CellDetection
python -m venv celldetection-env
source celldetection-env/bin/activate  # On Windows use `celldetection-env\Scripts\activate`
pip install -r requirements.txt

SpatioPath

SpatioPath is designed for spatial statistical analysis of cell-to-cell and region-to-cell interactions.

Simulation Data Generation

To generate simulation configurations, run the SimulationGeneration.ipynb notebook. Set parameters such as the number of experiments, the zone widths, the analysis neighborhood, and the alpha, mu, and sigma values.
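
For orientation, a hypothetical configuration block is sketched below; the parameter names and values are placeholders only, and the actual variables are defined inside SimulationGeneration.ipynb:

# Hypothetical configuration illustrating the parameters mentioned above.
# Names and values are placeholders; use the variables defined in
# SimulationGeneration.ipynb.
simulation_config = {
    "n_experiments": 100,           # number of simulated experiments
    "zone_widths": [50, 100, 200],  # candidate zone widths (pixels)
    "neighborhood_radius": 25,      # analysis neighborhood around each cell (pixels)
    "alpha": 0.05,                  # alpha value (see the notebook for its exact role)
    "mu": 10.0,                     # mu value of the simulated distribution
    "sigma": 2.0,                   # sigma value of the simulated distribution
}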

Visualizing Simulation Results

Run the SimulationFigures.ipynb notebook to visualize the results of the simulation, helping you understand spatial patterns and interactions.

Analyzing True Images

For true image analysis, run the SpatioPathCreateh5.ipynb notebook. This will generate analysis files for each patient based on configurations you set within the notebook. The output will be stored in the paperResults folder.
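
The sketch below shows one way to inspect a generated file with h5py; the file name is hypothetical, and the internal group/dataset layout is whatever SpatioPathCreateh5.ipynb writes:

# Minimal sketch for inspecting one of the generated analysis files.
# The file name is hypothetical; the per-patient file names and layout
# are defined in SpatioPathCreateh5.ipynb.
import h5py

with h5py.File("paperResults/patient_01.h5", "r") as f:
    def show(name, obj):
        # Print every group and dataset stored in the file.
        kind = "dataset" if isinstance(obj, h5py.Dataset) else "group"
        shape = getattr(obj, "shape", "")
        print(f"{kind}: {name} {shape}")
    f.visititems(show)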

Running SpatioPath on a Single Image

Use the SpatiopathInference.ipynb notebook to run SpatioPath on a single image. This will provide detailed spatial statistical insights for the specified image.

CellDetection

CellDetection is a framework for detecting cells in images and generating corresponding masks.

Creating Masks

Run the createMasks.ipynb notebook to generate masks from images. Provide the path to your images, and the notebook will handle the rest.
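
As a rough illustration of the expected input/output layout, the sketch below thresholds every image in a folder and writes one mask per image; the folder names are hypothetical, and the Otsu threshold is only a stand-in for the notebook's actual mask-generation logic:

# Generic batch mask generation sketch. The Otsu threshold is a stand-in
# for whatever createMasks.ipynb actually does; folder names are hypothetical.
from pathlib import Path
import numpy as np
from skimage import io, filters

input_dir = Path("images")
output_dir = Path("masks")
output_dir.mkdir(exist_ok=True)

for image_path in sorted(input_dir.glob("*.tif")):
    image = io.imread(image_path, as_gray=True)
    threshold = filters.threshold_otsu(image)        # global threshold
    mask = (image > threshold).astype(np.uint8) * 255
    io.imsave(output_dir / image_path.name, mask)    # one mask per input image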

Training the Model

Use the CellDetection.ipynb notebook to train a cell detection model. Configure parameters such as learning rate, number of epochs, and the train-test-validation split. Logs can be visualized with TensorBoard by running:

tensorboard --logdir=logs
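
The sketch below illustrates the kind of hyperparameters involved and how metrics end up under logs/ where TensorBoard can find them; the names and values are placeholders, and the real training loop lives in CellDetection.ipynb:

# Illustrative hyperparameters and TensorBoard logging. Names and values are
# placeholders; the actual training loop is in CellDetection.ipynb.
from torch.utils.tensorboard import SummaryWriter

hparams = {
    "learning_rate": 1e-4,        # optimizer step size
    "epochs": 5,                  # passes over the training set
    "split": (0.7, 0.15, 0.15),   # train / validation / test fractions
}

writer = SummaryWriter(log_dir="logs")   # matches tensorboard --logdir=logs
for epoch in range(hparams["epochs"]):
    train_loss = 1.0 / (epoch + 1)       # placeholder standing in for the real loss
    writer.add_scalar("loss/train", train_loss, epoch)
writer.close()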

Evaluating the Model

Run the Evaluate.ipynb notebook to evaluate the model. This notebook generates figures displaying the F1 score to assess model performance.
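
For context, the sketch below shows a common way to compute a detection F1 score by matching predicted centroids to ground-truth centroids within a distance threshold; the matching rule and threshold are illustrative and not necessarily those used in Evaluate.ipynb:

# Illustrative detection F1: greedy one-to-one matching of predicted and
# ground-truth centroids within a distance threshold.
import numpy as np
from scipy.spatial.distance import cdist

def detection_f1(pred, gt, max_dist=10.0):
    """pred, gt: (N, 2) arrays of cell centroids in pixels."""
    if len(pred) == 0 or len(gt) == 0:
        return 0.0
    d = cdist(pred, gt)
    matched_gt = set()
    tp = 0
    for i in np.argsort(d.min(axis=1)):      # closest predictions claim first
        j = int(np.argmin(d[i]))
        if d[i, j] <= max_dist and j not in matched_gt:
            matched_gt.add(j)
            tp += 1
    fp = len(pred) - tp
    fn = len(gt) - tp
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

# Example: two detections, two ground-truth cells, one correct match -> F1 = 0.5
print(detection_f1(np.array([[5, 5], [40, 40]]), np.array([[6, 5], [100, 100]])))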

Running Inference

Use the Inference.ipynb notebook to perform inference on new images. Provide a folder of images, and the notebook will generate detections and save them in a format compatible with ICY software.
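
The sketch below illustrates the folder-level workflow only; detect_cells is a hypothetical stand-in for the trained model, the folder names are placeholders, and the actual ICY-compatible export is handled inside Inference.ipynb:

# Folder-level inference sketch. detect_cells is hypothetical; the real model
# call and ICY-compatible export are implemented in Inference.ipynb.
import csv
from pathlib import Path
from skimage import io

def detect_cells(image):
    # Hypothetical stand-in for the trained model's prediction step.
    return [(10.0, 20.0), (30.5, 42.0)]

image_dir = Path("new_images")     # hypothetical input folder
out_dir = Path("detections")
out_dir.mkdir(exist_ok=True)

for image_path in sorted(image_dir.glob("*.tif")):
    image = io.imread(image_path)
    centroids = detect_cells(image)
    with open(out_dir / f"{image_path.stem}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y"])
        writer.writerows(centroids)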

Project Structure

The project is organized into two main subprojects:

  • SpatioPath/: Contains notebooks and scripts for spatial statistical analysis.
    • PaperImages/: True images for analysis.
    • paperResults/: Results of the analysis.
    • SimulationGeneration.ipynb: Notebook for generating simulation configurations.
    • SimulationFigures.ipynb: Notebook for visualizing simulation results.
    • SpatioPathCreateh5.ipynb: Notebook for generating analysis files from true images.
    • SpatiopathInference.ipynb: Notebook for running SpatioPath on a single image.
    • requirements.txt: Dependencies for SpatioPath.
  • CellDetection/: Contains notebooks and scripts for cell detection and mask generation.
    • models/: Trained model weights.
    • createMasks.ipynb: Notebook for generating masks from images.
    • CellDetection.ipynb: Notebook for training the cell detection model.
    • Evaluate.ipynb: Notebook for evaluating the model.
    • Inference.ipynb: Notebook for running inference on new images.
    • requirements.txt: Dependencies for CellDetection.
    • logs/: Logs for TensorBoard visualization.

Acknowledgments

We acknowledge the authors of this work for their contributions:

  • Mohamed M. Benimam, Vannary Meas-Yedid, and Suvadip Mukherjee, who contributed equally.
  • Astri Frafjord and Alexandre Corthay, for their expertise and collaboration.
  • Thibault Lagache and Jean-Christophe Olivo-Marin, for their supervision and guidance.

This work was made possible by contributions from:

  • Institut Pasteur, Université de Paris Cité, CNRS UMR 3691 (BioImage Analysis Unit, Paris, France).
  • Oslo University Hospital and the University of Oslo (Tumor Immunology Lab and Hybrid Technology Hub, Oslo, Norway).

For inquiries, contact the corresponding authors: thibault.lagache@pasteur.fr, jcolivo@pasteur.fr.