LNCD

nnUNet

Notes on https://github.com/MIC-DKFZ/nnUNet, set up initially (2026-03) for CSpine.

nnU-Net is a semantic segmentation method that automatically adapts to a given dataset.

Use

https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/dataset_format.md

[Input] Datasets consist of three components: raw images, corresponding segmentation maps, and a dataset.json file specifying metadata.
Remember: for each training case, all images must have the same geometry so that their pixel arrays are aligned. Also make sure all of your data is co-registered!
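As a sketch of what that layout looks like on disk, following the linked dataset_format.md. The dataset name, case names, and label names below are hypothetical; the 4-digit `_0000` suffix is nnU-Net's channel index, which label files do not carry:

```shell
#!/bin/sh
# Sketch of an nnU-Net v2 dataset skeleton (names are illustrative).
base=nnUNet_raw/Dataset501_CSpine
mkdir -p "$base/imagesTr" "$base/labelsTr"

# One training case: image channel 0 plus its segmentation map.
# Images carry a 4-digit channel suffix; labels do not.
touch "$base/imagesTr/cspine_001_0000.nii.gz"
touch "$base/labelsTr/cspine_001.nii.gz"

# Minimal dataset.json metadata (fields from dataset_format.md).
cat > "$base/dataset.json" <<'EOF'
{
    "channel_names": {"0": "T2"},
    "labels": {"background": 0, "cord": 1},
    "numTraining": 1,
    "file_ending": ".nii.gz"
}
EOF
```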

Install

On Rhea (Linux server), using uv with UV_TOOL_BIN_DIR set to /opt/ni_tools:

uv tool install \
  git+https://github.com/MIC-DKFZ/nnUNet.git \
  --with git+https://github.com/FabianIsensee/hiddenlayer.git

Yields

Installed 21 executables:
nnUNetv2_accumulate_crossval_results, nnUNetv2_apply_postprocessing,
nnUNetv2_convert_MSD_dataset, nnUNetv2_convert_old_nnUNet_dataset,
nnUNetv2_determine_postprocessing, nnUNetv2_download_pretrained_model_by_url,
nnUNetv2_ensemble, nnUNetv2_evaluate_folder, nnUNetv2_evaluate_simple,
nnUNetv2_export_model_to_zip, nnUNetv2_extract_fingerprint,
nnUNetv2_find_best_configuration, nnUNetv2_install_pretrained_model_from_zip,
nnUNetv2_move_plans_between_datasets, nnUNetv2_plan_and_preprocess,
nnUNetv2_plan_experiment, nnUNetv2_plot_overlay_pngs, nnUNetv2_predict,
nnUNetv2_predict_from_modelfolder, nnUNetv2_preprocess, nnUNetv2_train
The installed entry points are symlinks into uv's tool directory:

stat -c "%N" $(which nnUNetv2_train)
#'/opt/ni_tools/python/uv/bin/nnUNetv2_train' -> '/opt/ni_tools/python/uv/tool/nnunetv2/bin/nnUNetv2_t
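Before the nnUNetv2_* commands will run, nnU-Net expects three environment variables pointing at its data folders (documented in the repo's set_environment_variables.md). The paths below are placeholders, not where data actually lives on Rhea, and dataset id 501 is hypothetical:

```shell
#!/bin/sh
# nnU-Net reads these three variables to locate its data folders
# (real variable names; the paths here are illustrative).
export nnUNet_raw=/data/nnunet/nnUNet_raw
export nnUNet_preprocessed=/data/nnunet/nnUNet_preprocessed
export nnUNet_results=/data/nnunet/nnUNet_results

# Typical sequence once a dataset is in $nnUNet_raw:
#   nnUNetv2_plan_and_preprocess -d 501 --verify_dataset_integrity
#   nnUNetv2_train 501 3d_fullres 0
#   nnUNetv2_predict -d 501 -i INPUT_DIR -o OUTPUT_DIR -c 3d_fullres
echo "$nnUNet_raw"
```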