This repository was archived by the owner on Nov 17, 2023. It is now read-only.

Commit 679ec51

Update NEWS.md
1 parent 7b0258e commit 679ec51

1 file changed

Lines changed: 1 addition & 1 deletion


NEWS.md

```diff
@@ -4,7 +4,7 @@ MXNet Change Log
 
 ### New Features - Gluon RNN layers are now HybridBlocks
 - In this release, Gluon RNN layers such as `gluon.rnn.RNN`, `gluon.rnn.LSTM`, `gluon.rnn.GRU` becomes `HybridBlock`s as part of [gluon.rnn improvements project](https://github.com/apache/incubator-mxnet/projects/11) (#11482).
-- This is the result of newly available fused RNN operators added for CPU: LSTM([#10104](https://github.com/apache/incubator-mxnet/pull/10104)), vanilla RNN([#10104](https://github.com/apache/incubator-mxnet/pull/10104)), GRU([#10311](https://github.com/apache/incubator-mxnet/pull/10311))
+- This is the result of newly available fused RNN operators added for CPU: LSTM([#10104](https://github.com/apache/incubator-mxnet/pull/10104)), vanilla RNN([#11399](https://github.com/apache/incubator-mxnet/pull/11399)), GRU([#10311](https://github.com/apache/incubator-mxnet/pull/10311))
 - Now many dynamic networks that are based on Gluon RNN layers can now be completely hybridized, exported, and used in the inference APIs in other language bindings such as R, Scala, etc.
 
 ### MKL-DNN improvements
```
