Data and model exchange across different sources
Kaapana tutorial for the 38th NA-MIC project week:
https://drive.google.com/file/d/1A7-8Ru0uTJHFFa17rZtkBpvNhJao_F7x/view?usp=share_link
Key Investigators
- Benjamin Hamm (German Cancer Research Center, Germany)
- Ünal Akünal (German Cancer Research Center, Germany)
- Markus Bujotzek (German Cancer Research Center, Germany)
- Klaus Kades (German Cancer Research Center, Germany)
- Andrey Fedorov (Brigham and Women’s Hospital, USA)
Project Description
Implementation and discussion of a standardized data and model exchange between platforms such as Kaapana and MONAI, and work on integrating Kaapana with other toolkits.
- Motivation: Kaapana platforms run in multiple national and international projects (RACOON, DART, …)
- Goal: Standardized and federated data analysis / federated learning require standardized model exchange formats
Objective
Support standardized data and AI model I/O interfaces in Kaapana.
- Support of various AI model sources
  - Integration of the MONAI Model Zoo into Kaapana (see the sketch after this list)
    - Inference pipeline as a Kaapana workflow / Kaapana extension
    - Training pipeline
    - Generic support of MONAI Bundles (MONAI Label / MONAI Deploy / MONAI FL)
  - Standardized remote model execution: execution of models from modelhub.ai within Kaapana
- Integration/support of data sources:
  - TCIA download/(upload) into Kaapana
  - Integration with IDC: download of data via the Google Cloud SDK
- Integration of new analysis tools into Kaapana
- JavaScript/Python client library to communicate with Kaapana
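As a first building block of the Model Zoo integration, a MONAI bundle can be fetched and run programmatically. The following is a minimal sketch, assuming a recent MONAI release with network access to the Model Zoo; the local `models` directory and the CLI invocation in the comment are illustrative, not the final Kaapana integration.

```python
# Minimal sketch: fetch a MONAI Model Zoo bundle via the monai.bundle API.
# Assumes a recent MONAI release; the local "models" directory is arbitrary.
from pathlib import Path

from monai.bundle import download

bundle_dir = Path("models")
bundle_dir.mkdir(exist_ok=True)

# Download the spleen CT segmentation bundle from the MONAI Model Zoo.
download(name="spleen_ct_segmentation", bundle_dir=str(bundle_dir))

# The bundle ships its own inference config, which can then be executed,
# e.g. with something like:
#   python -m monai.bundle run \
#       --config_file models/spleen_ct_segmentation/configs/inference.json \
#       --bundle_root models/spleen_ct_segmentation
```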
Approach and Plan
- Support of various AI model sources
  - Integration of the MONAI Model Zoo into Kaapana
    - Inference pipeline as a Kaapana workflow / Kaapana extension
    - Training pipeline
    - Generic support of MONAI Bundles (MONAI Label / MONAI Deploy / MONAI FL)
  - Standardized remote model execution: execution of models from modelhub.ai within Kaapana (see the container-wrapper sketch after this list)
- Current progress:
  - Integration/support of data sources:
    - TCIA download/(upload) into Kaapana
      - Kaapana workflow to download specific TCIA datasets
      - Select the to-be-downloaded dataset via the UI
      - Send the downloaded dataset to Kaapana's PACS
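Since MHub distributes its models as self-contained Docker images, executing a modelhub.ai/MHub model inside Kaapana essentially amounts to starting the model's container on a mounted input folder. The snippet below is only an illustrative sketch, not the Kaapana operator itself; the image name and the container mount paths are assumptions and may differ from the actual MHub conventions.

```python
# Illustrative sketch: run an MHub model container on a local DICOM folder.
# The image name and the container mount paths are assumptions.
import subprocess
from pathlib import Path


def run_mhub_model(image: str, input_dir: Path, output_dir: Path) -> None:
    """Start the model container with the input mounted read-only and collect its output."""
    output_dir.mkdir(parents=True, exist_ok=True)
    cmd = [
        "docker", "run", "--rm",
        "-v", f"{input_dir.resolve()}:/app/data/input_data:ro",
        "-v", f"{output_dir.resolve()}:/app/data/output_data",
        image,
    ]
    subprocess.run(cmd, check=True)


# Hypothetical usage; the image tag is an example, not a verified MHub image:
# run_mhub_model("mhubai/totalsegmentator:latest", Path("dicom_in"), Path("seg_out"))
```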
Progress and Next Steps
- Support of various AI model sources
  - Integration of the MONAI Model Zoo into Kaapana
    - Proof of concept: integration of the MONAI Model Zoo's spleen CT segmentation bundle works
    - tbd: finalize the integration in Kaapana
    - tbd: add more MONAI bundles
  - Completed the implementation of a Kaapana workflow for modelhub.ai
    - Supports every model already available in MHub
    - Implemented as a wrapper around the Dockerfile of the MHub models
    - Segmentations can be visualized with Slicer, MITK, or OHIF in a web browser
- Integration/support of data sources:
  - TCIA download/(upload) into Kaapana
    - Implemented `service-tcia-download`: a .tcia manifest file can now be dragged and dropped into Kaapana (into MinIO), which starts a workflow that downloads the data from TCIA via its REST API. The number of workers can be set in the operator (see the sketch below).
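For context, the core of such a manifest-driven download can be expressed in a few lines of Python. This is a simplified sketch rather than the actual `service-tcia-download` operator; the NBIA REST endpoint URL and the manifest layout are assumptions based on TCIA's public documentation, and the worker count mirrors the operator's configurable setting.

```python
# Sketch: parse a .tcia manifest and download each listed series as a zip archive
# via TCIA's public NBIA REST API, using a configurable number of parallel workers.
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

import requests

# Public NBIA REST endpoint (assumed here; see the TCIA API documentation).
NBIA_GET_IMAGE = "https://services.cancerimagingarchive.net/nbia-api/services/v1/getImage"


def parse_manifest(manifest_path: Path) -> list[str]:
    """Return the SeriesInstanceUIDs listed after 'ListOfSeriesToDownload=' in a .tcia file."""
    lines = manifest_path.read_text().splitlines()
    start = next(i for i, line in enumerate(lines) if line.startswith("ListOfSeriesToDownload")) + 1
    return [uid.strip() for uid in lines[start:] if uid.strip()]


def download_series(series_uid: str, target_dir: Path) -> Path:
    """Download a single DICOM series as a zip archive."""
    target_dir.mkdir(parents=True, exist_ok=True)
    zip_path = target_dir / f"{series_uid}.zip"
    with requests.get(NBIA_GET_IMAGE, params={"SeriesInstanceUID": series_uid}, stream=True) as resp:
        resp.raise_for_status()
        with open(zip_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    return zip_path


def download_manifest(manifest: Path, target_dir: Path, workers: int = 4) -> None:
    """Download all series from a manifest with a configurable number of workers."""
    uids = parse_manifest(manifest)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda uid: download_series(uid, target_dir), uids))
```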
Illustrations
tbd
Background and References
- https://www.kaapana.ai/
- http://app.modelhub.ai/
- https://www.cancerimagingarchive.net/
- https://monai.io/