
RadField3D Neural Networks

This framework is designed to train neural models on spatially resolved radiation fields stored and loaded with RadFiled3D. It supports both direct per-volume and per-voxel predictors. The framework leverages PyTorch and PyTorch Lightning for model training.

All methods are described in detail in the paper: Estimating Spatially Resolved Radiation Fields Using Neural Networks

Requirements

All required modules are listed in the requirements.txt file. Ensure all dependencies are installed before running the training scripts. Note: install PyTorch for your CUDA version before installing the packages from requirements.txt; otherwise the CPU-only variant of PyTorch is installed.

Usage

  1. Install the required dependencies:

    pip install -r requirements.txt
  2. Run the training script:

    python -m run_network_task --task train --model_config </path/to/config.json> --epochs <n> --dataset_path </path/to/dataset_folder or .zip> --dataset_type <Layerwise/Voxelwise> --batch_size <n> --num_workers <n> <--join_channels |> --effective_batch_size <n> --normalization <linear0_1 | linear-1_1 | log_1e+3> --logs_path </path/to/store/logs> --mu_tr_file <path/to/used/mu_tr.txt> --enforce_voxel_resolution <W H D> --logger <mlflow | wandb> <--max_inner_batch_size <n> |> <--use_beam_parameters |> <--validate_gt |>
  3. To tune hyperparameters of a model:

    python -m run_network_task --task tune --model_config </path/to/config.json> --epochs <n> --dataset_path </path/to/dataset_folder or .zip> --dataset_type <Layerwise/Voxelwise> --batch_size <n> --num_workers <n> <--join_channels |> --effective_batch_size <n> --normalization <linear0_1 | linear-1_1 | log_1e+3> --logs_path </path/to/store/logs> --mu_tr_file <path/to/used/mu_tr.txt> --enforce_voxel_resolution <W H D> --logger <mlflow | wandb> <--max_inner_batch_size <n> |> <--use_beam_parameters |> <--validate_gt |> --n_trials <n>

For a short description of each parameter, call:

    python -m run_network_task --help

Optional dependencies

  • WandB: Optional cloud-based logger.
  • mlflow: Optional local logger.
  • tcnn (tiny-cuda-nn): Highly optimized CUDA implementation of fully connected neural networks and input encodings such as hash grids and spherical harmonics.

Datasets

  The datasets are hosted on Zenodo:
  • DS-01: Fixed H-100 cone beam; fixed distance
  • DS-02: Dynamic C-Arm spectra cone beam; fixed distance
  • DS-03: Dynamic C-Arm spectra rectangular beam; dynamic distance

Getting started

Using models

To load a model, place the model's configuration JSON next to the weights file in the same folder, with both files sharing the same basename. Loading the weights file is then sufficient; the module searches for the matching configuration and constructs the corresponding model.
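
A minimal sketch of this workflow is shown below. The loader name load_model and the file extensions are assumptions for illustration only; check the package for the actual entry point.

    # Hypothetical sketch: load_model and the file names are assumptions, not the verified API.
    from pathlib import Path

    import radfield3dnn.models as models

    # Both files share the same basename inside one folder, e.g.:
    #   checkpoints/my_model.json   (model configuration)
    #   checkpoints/my_model.ckpt   (weights)
    weights_file = Path("checkpoints/my_model.ckpt")

    # Assumed helper: reads the sibling configuration JSON and builds the matching model.
    model = models.load_model(weights_file)
    model.eval()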

Adding models

Inherit from BaseNeuralRadFieldModel and set the __model_name__ class attribute to a matching name. Make sure that the file defining the new model is imported before radfield3dnn.models is imported, so the model factory can access the new model definition. To store and load the hyperparameters passed to the model's constructor, override the get_custom_parameters(self) method.
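
The following sketch illustrates the pattern. The import path of BaseNeuralRadFieldModel, the constructor signature, and the layers are assumptions for illustration; only the __model_name__ attribute and the get_custom_parameters(self) override follow the convention described above.

    # Sketch of a custom model; the import path and constructor arguments are assumed, not verified.
    import torch.nn as nn

    from radfield3dnn.models import BaseNeuralRadFieldModel  # assumed location of the base class


    class MyVoxelModel(BaseNeuralRadFieldModel):
        # Name under which the model factory resolves this class.
        __model_name__ = "MyVoxelModel"

        def __init__(self, hidden_dim: int = 128, **kwargs):
            super().__init__(**kwargs)
            self.hidden_dim = hidden_dim
            self.net = nn.Sequential(
                nn.LazyLinear(hidden_dim),
                nn.ReLU(),
                nn.LazyLinear(1),
            )

        def forward(self, x):
            return self.net(x)

        def get_custom_parameters(self):
            # Constructor hyperparameters that should be stored with and restored from checkpoints.
            return {"hidden_dim": self.hidden_dim}

As noted above, import the file defining MyVoxelModel before importing radfield3dnn.models so the model factory can find the new definition under its __model_name__.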
