NNTrainer 0.4.0 Release
We are releasing NNTrainer v0.4.0.
RPM files are for Tizen, built daily at build.tizen.org (https://build.tizen.org/package/show/Tizen:Unified/nntrainer) and available at download.tizen.org (http://download.tizen.org/snapshots/tizen/unified/latest/repos/standard/packages/).
DEB files are for Ubuntu, built at and downloadable from launchpad.net (https://launchpad.net/~nnstreamer/+archive/ubuntu/ppa).
If you have unresolved dependencies, please download them from the Ubuntu universe repository and the nnstreamer PPA.
In this release:
Fixes
- Fix Batch Normalization Bugs
- Fix Embedding Layer Bugs
- Fix Gradient Access Bugs
- Add many unit tests to evaluate the NNTrainer implementation
and more.
New Features
- New Layers
- Attention Layer
- Enable Weight / Tensor Sharing
- Implement Realizers to manipulate the network graph
- Flatten Realizer, Recurrent Realizer with in/out property, Previous Input Realizer, Attach Activation Layer Realizer
- Support Conv1D Layer (see the construction sketch after this list)
- Support Dilation Property
- Support multi-label/input for model
- Support reshape Layer
- Support 1D Batch Normalization
- Support LSTM Cell Layer
- Support RNNCell Layer
- Support GRUCell Layer
- Support Mol Attention Layer
- Support Multi-Head Attention Layer
- Support Gradient Clipping by Global Norm
- Support Reduce Mean Layer
- Support Leaky ReLU Layer
- Support Zoneout LSTM Cell Layer
- Support Learning Rate Scheduling
- Improve Load/Save Model
- Support TFLite Export (Experimental)
- Support Positional Encoding Layer
- Support Layer Normalization
and more
- Provide More C/C++ APIs (see the save/load sketch after this list)
- New Applications
- Transformer Applications
and more
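For illustration, several of the layers added in this release (e.g. Conv1D with dilation, 1D batch normalization, Leaky ReLU) can be stacked through the C++ (ccapi) interface. The sketch below is not taken from the release itself; the layer type keys and property strings are assumptions following NNTrainer's usual naming and may differ from the released headers.

```cpp
// Illustrative sketch only: layer type keys and property strings are assumed,
// not verified against the v0.4.0 headers.
#include <layer.h>
#include <model.h>
#include <optimizer.h>

int main() {
  using namespace ml::train;

  auto model = createModel(ModelType::NEURAL_NET, {"loss=mse"});

  // Input shape given as channel:height:width (assumed property format).
  model->addLayer(createLayer("input", {"name=in", "input_shape=1:1:128"}));

  // Conv1D with the new dilation property (assumed keys and values).
  model->addLayer(
    createLayer("conv1d", {"filters=4", "kernel_size=3", "dilation=2"}));

  // 1D batch normalization followed by a Leaky ReLU activation (assumed keys).
  model->addLayer(createLayer("batch_normalization", {}));
  model->addLayer(createLayer("activation", {"activation=leaky_relu"}));

  model->setOptimizer(createOptimizer("adam", {"learning_rate=0.001"}));
  model->compile();
  model->initialize();
  return 0;
}
```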
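On the API side, both the C (capi) and C++ (ccapi) surfaces were extended, including the improved model load/save path. The snippet below is a rough sketch of what saving and reloading a model can look like through the C++ API; the ModelFormat values and the save()/load() signatures are assumptions and should be checked against the released headers.

```cpp
// Rough sketch only: ModelFormat values and the save()/load() call order are
// assumptions, not a verified listing of the v0.4.0 API.
#include <model.h>

void checkpoint_and_restore() {
  using namespace ml::train;

  auto model = createModel(ModelType::NEURAL_NET);
  // ... add layers, set an optimizer, then compile() and initialize() as above ...

  // Persist the network description (INI) together with the weights (BIN).
  model->save("trained.ini", ModelFormat::MODEL_FORMAT_INI_WITH_BIN);

  // Rebuild a model later from the saved description and weights.
  auto restored = createModel(ModelType::NEURAL_NET);
  restored->load("trained.ini", ModelFormat::MODEL_FORMAT_INI_WITH_BIN);
}
```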