⚠️ This release changed the network architecture and training strategies; previous checkpoints are therefore not compatible. The changes may also impact model performance.
### Added
- Added dropout to the U-Net; this may increase memory consumption.
- Added support for anisotropic volumes in data augmentation.
- Added data augmentations, including random gamma adjustment, random flip, and random shearing.
- Added registration-related metrics and losses.
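As an illustration of the augmentations listed above, a random flip can be written as a small pure function of a JAX PRNG key. This is a hypothetical sketch, not the repository's actual implementation; the function name and parameters are illustrative.

```python
# Hypothetical sketch of a random-flip augmentation along one spatial axis.
import jax
import jax.numpy as jnp


def random_flip(key, volume, axis=0, prob=0.5):
    """Flip `volume` along `axis` with probability `prob` (illustrative)."""
    flip = jax.random.bernoulli(key, prob)
    # jnp.where selects the flipped or original volume element-wise.
    return jnp.where(flip, jnp.flip(volume, axis=axis), volume)


key = jax.random.PRNGKey(0)
vol = jnp.arange(8.0).reshape(2, 2, 2)
out = random_flip(key, vol)
print(out.shape)  # shape is preserved: (2, 2, 2)
```

Because the augmentation is a pure function of the key, it composes cleanly with `jax.jit` and `jax.vmap`.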
### Changed
- ⚠️ Moved the package `imgx_datasets` into `imgx/datasets` as a submodule.
- 😃 Moved the data set iterator out of `Experiment` to facilitate using non-TFDS data sets.
- Aligned the Transformer with the Haiku implementation.
- Used `jax.random.fold_in` for random key splitting to avoid passing keys between functions.
- Used `optax.softmax_cross_entropy` to replace the custom implementation.