CCnet

ContrastControlNet: Official repository of "Towards Learning Contrast Kinetics with Multi-Condition Latent Diffusion Models", MICCAI 2024.

Figure: Method overview.

Getting Started

The Duke Dataset used in this study is available on The Cancer Imaging Archive (TCIA).

  • Scripts: Scripts to launch training and inference runs of the AEKL, LDM, and ControlNet models.
  • Code: Code for training, inference, and evaluation of the models.
  • Config: Configuration files for training, inference, and evaluation of the models.
  • Data: LDM_metadata.csv contains the extracted metadata (e.g., scanner and contrast information) that can be provided as text conditioning to the LDM and ControlNet; one way of turning these rows into prompts is sketched below.
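
As an illustration, here is a minimal sketch of how rows from the metadata CSV could be turned into text prompts for conditioning. The file path and column names (manufacturer, field_strength, contrast_phase) are hypothetical placeholders; check LDM_metadata.csv for the actual columns.

# Minimal sketch: build text conditioning prompts from the metadata CSV.
# NOTE: the path and column names below are hypothetical placeholders;
# consult LDM_metadata.csv for the actual schema.
import csv

def build_prompts(csv_path="data/LDM_metadata.csv"):
    prompts = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Concatenate selected metadata fields into a single text condition.
            prompt = (
                f"scanner: {row.get('manufacturer', 'unknown')}, "
                f"field strength: {row.get('field_strength', 'unknown')}, "
                f"contrast phase: {row.get('contrast_phase', 'unknown')}"
            )
            prompts.append(prompt)
    return prompts

if __name__ == "__main__":
    for p in build_prompts()[:5]:
        print(p)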

Fréchet Radiomics Distance (FRD)

You can find the FRD repository here.

To get started, install the package and compute the FRD between two datasets:

pip install frd-score

python -m frd_score path/to/dataset_A path/to/dataset_B
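
If you prefer to drive the computation from Python, the minimal sketch below simply invokes the documented frd_score command line via a subprocess; the two dataset paths are placeholders.

# Minimal sketch: compute the FRD between two image folders by calling the
# frd_score CLI (installed via `pip install frd-score`) as a subprocess.
# The dataset paths are placeholders.
import subprocess
import sys

def compute_frd(path_a, path_b):
    result = subprocess.run(
        [sys.executable, "-m", "frd_score", path_a, path_b],
        capture_output=True,
        text=True,
        check=True,
    )
    # The CLI prints the score to stdout; return the raw output.
    return result.stdout.strip()

if __name__ == "__main__":
    print(compute_frd("path/to/dataset_A", "path/to/dataset_B"))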

Summary

Figure: Poster presentation.

Reference

Please consider citing our work if you found it useful:

@inproceedings{osuala2024towards,
  title={Towards learning contrast kinetics with multi-condition latent diffusion models},
  author={Osuala, Richard and Lang, Daniel M and Verma, Preeti and Joshi, Smriti and Tsirikoglou, Apostolia and Skorupko, Grzegorz and Kushibar, Kaisar and Garrucho, Lidia and Pinaya, Walter HL and Diaz, Oliver and others},
  booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention},
  pages={713--723},
  year={2024},
  organization={Springer}
}

Acknowledgements

This repository borrows and extends the code from the generative_brain_controlnet repo.