This repository was archived by the owner on Nov 17, 2023. It is now read-only.

Commit c1327f3

lebegnswamy authored and committed
news, readme update for v1.3.1 release (#13225)
* news, readme update for v1.3.1 release
* Added release notes
1 parent 0cb2ad6 commit c1327f3

36 files changed

Lines changed: 159 additions & 62 deletions

File tree

NEWS.md

Lines changed: 97 additions & 0 deletions
@@ -1,5 +1,102 @@
MXNet Change Log
================

## 1.3.1

### Bug fixes

* [MXNET-953] Fix oob memory read (v1.3.x) / [#13118](https://github.com/apache/incubator-mxnet/pull/13118)

Simple bugfix addressing an out-of-bounds memory read.

* [MXNET-969] Fix buffer overflow in RNNOp (v1.3.x) / [#13119](https://github.com/apache/incubator-mxnet/pull/13119)

This fixes a buffer overflow detected by ASAN.

* CudnnFind() usage improvements (v1.3.x) / [#13123](https://github.com/apache/incubator-mxnet/pull/13123)

This PR improves MXNet's use of cudnnFind() to address a few issues:

1. With the Gluon imperative style, cudnnFind() is called during forward(), and so might have its timings perturbed by other GPU activity (including potentially other cudnnFind() calls).
2. With some CUDA driver versions, care is needed to ensure that the large I/O and workspace cudaMallocs() performed by cudnnFind() are immediately released and available to MXNet.
3. cudnnFind() makes both conv I/O and workspace allocations that must be covered by the GPU global memory headroom defined by MXNET_GPU_MEM_POOL_RESERVE. Per issue #12662, large convolutions can result in out-of-memory errors, even when MXNet's storage allocator has free memory in its pool.

This PR addresses these issues, providing the following benefits:

1. Consistent algo choice for a given convolution type in a model, both for instances in the same GPU and in other GPUs in a multi-GPU training setting.
2. Consistent algo choice from run to run, based on eliminating sources of interference in the cudnnFind() timing process.
3. Consistent model global memory footprint, both because of the consistent algo choice (algos can have markedly different workspace requirements) and changes to MXNet's use of cudaMalloc.
4. Increased training performance based on being able to consistently run with models that approach the GPU's full global memory footprint.
5. Adds a unittest for and solves issue #12662.

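Since the headroom interaction is easy to trip over, here is a minimal sketch of setting the variable; the value `"5"` (a percentage) is purely illustrative, not a recommendation, and the variable must be set before mxnet is imported.

```python
import os

# MXNET_GPU_MEM_POOL_RESERVE is the percentage of GPU memory kept out of
# MXNet's pooled allocator -- the headroom that cudnnFind()'s conv I/O and
# workspace allocations must fit into. It is read at library startup, so
# set it before `import mxnet`. The value "5" here is illustrative only.
os.environ["MXNET_GPU_MEM_POOL_RESERVE"] = "5"
```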
* [MXNET-922] Fix memleak in profiler (v1.3.x) / [#13120](https://github.com/apache/incubator-mxnet/pull/13120)

Fix a memleak reported locally by ASAN during a normal inference test.

* Fix lazy record io when used with dataloader and multi_worker > 0 (v1.3.x) / [#13124](https://github.com/apache/incubator-mxnet/pull/13124)

Fixes the multi_worker data loader when a record file is used. The MXRecordIO instance needs to acquire a new file handle after fork so it can be safely manipulated concurrently.

This fix also renders the previous temporary fixes #12093 and #11370 unnecessary.

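The underlying hazard is generic file-handle sharing rather than anything specific to MXRecordIO. A plain-Python sketch (illustrative file I/O, not the MXNet API) of why each forked worker needs its own handle:

```python
import os
import tempfile

# Two readers sharing one handle also share its file offset -- the failure
# mode the fix avoids by giving each forked worker its own handle.
path = tempfile.mkstemp()[1]
with open(path, "wb") as f:
    f.write(b"record-aaa\nrecord-bbb\n")

shared = open(path, "rb")
first = shared.readline()    # advances the shared offset past record-aaa
second = shared.readline()   # a "second worker" now sees record-bbb instead

# Independent handles (like freshly opened post-fork handles) don't interfere.
a, b = open(path, "rb"), open(path, "rb")
same = a.readline() == b.readline()  # both independently read record-aaa

a.close(); b.close(); shared.close()
os.remove(path)
```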
* fixed symbols naming in RNNCell, LSTMCell, GRUCell (v1.3.x) / [#13158](https://github.com/apache/incubator-mxnet/pull/13158)

This fixes #12783 by assigning every node in hybrid_forward a unique name. Some operations were in fact performed without attaching the appropriate (time) prefix to the name, which made serialized graphs non-deserializable.

* Fixed `__setattr__` method of `_MXClassPropertyMetaClass` (v1.3.x) / [#13157](https://github.com/apache/incubator-mxnet/pull/13157)

Fixed the `__setattr__` method.

* allow foreach on input with 0 length (v1.3.x) / [#13151](https://github.com/apache/incubator-mxnet/pull/13151)

Fix #12470. With this change, the output shape can be inferred correctly.

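As a rough illustration of the contract (plain Python, not the mxnet.ndarray.contrib.foreach API): a scan over a zero-length input should yield an empty but well-defined output and pass the carried state through unchanged.

```python
def scan(step, xs, state):
    # Minimal stand-in for a foreach-style operator: apply `step` to each
    # element, threading `state` through, and collect the per-step outputs.
    outs = []
    for x in xs:
        out, state = step(x, state)
        outs.append(out)
    return outs, state

# Zero-length input: no steps run, outputs are empty (shape (0, ...) in the
# ndarray setting), and the initial state comes back untouched.
outs, final_state = scan(lambda x, s: (x + s, s + 1), [], 10)
```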
* Infer dtype in SymbolBlock import from input symbol (v1.3.x) / [#13117](https://github.com/apache/incubator-mxnet/pull/13117)

Fix for issue #11849.

Currently, a Gluon SymbolBlock cannot import any symbol with a type other than fp32. All the parameters are created as fp32, leading to failures when importing params of type fp16, fp64, etc.

In this PR, we infer the type of the symbol being imported and create the SymbolBlock parameters with that inferred type.

Added tests.

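The idea can be sketched in a few lines of plain Python (a hypothetical helper, not the Gluon implementation): take the parameter dtype from what the input symbol declares instead of hard-coding fp32.

```python
def infer_param_dtype(input_dtypes, default="float32"):
    # Hypothetical helper mirroring the fix's idea: use the dtype declared
    # by the imported symbol's inputs, and only fall back to float32 when
    # nothing is declared.
    return input_dtypes[0] if input_dtypes else default
```

With this approach, fp16 or fp64 checkpoints import with matching parameter types instead of failing against float32-typed parameters.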
### Documentation fixes

* Document the newly added env variable (v1.3.x) / [#13156](https://github.com/apache/incubator-mxnet/pull/13156)

Document the env variable MXNET_ENFORCE_DETERMINISM added in PR [#12992](https://github.com/apache/incubator-mxnet/pull/12992).

* fix broken links (v1.3.x) / [#13155](https://github.com/apache/incubator-mxnet/pull/13155)

This PR fixes broken links on the website.

* fix broken Python IO API docs (v1.3.x) / [#13154](https://github.com/apache/incubator-mxnet/pull/13154)

Fixes [#12854: Data Iterators documentation is broken](https://github.com/apache/incubator-mxnet/issues/12854).

This PR manually specifies members of the IO module so that the docs will render as expected. This is a workaround in the docs to deal with a bug introduced in the Python code/structure since v1.3.0. See the comments for more info.

This PR also fixes another issue that may or may not be related. Cross references to same-named entities like name, shape, or type confuse Sphinx: it seems to link to whatever it last dealt with that has the same name, not the entity in the current module. To fix this you have to be very specific: don't use type, use np.type if that's what you want; otherwise you might end up with mxnet.kvstore.KVStore.type. This is a known Sphinx issue, so it may be something we have to deal with for the time being.

It is important that any future modules recognize this issue and make an effort to map the params and other elements.

* add/update infer_range docs (v1.3.x) / [#13153](https://github.com/apache/incubator-mxnet/pull/13153)

This PR adds or updates the docs for the infer_range feature.

Clarifies the param in the C op docs.
Clarifies the param in the Scala symbol docs.
Adds the param for the Scala ndarray docs.
Adds the param for the Python symbol docs.
Adds the param for the Python ndarray docs.

### Other Improvements

* [MXNET-1179] Enforce deterministic algorithms in convolution layers (v1.3.x) / [#13152](https://github.com/apache/incubator-mxnet/pull/13152)

Some of the cuDNN convolution algorithms are non-deterministic (see issue #11341). This PR adds an env variable to enforce determinism in the convolution operators: if set to true, only deterministic cuDNN algorithms are used, and if no deterministic algorithm is available, MXNet errors out.

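A minimal sketch of opting in (the semantics follow the PR description above; like other MXNET_* variables, it must be set before mxnet is imported):

```python
import os

# With MXNET_ENFORCE_DETERMINISM enabled, only deterministic cuDNN
# convolution algorithms are selected, and MXNet errors out if no
# deterministic algorithm is available. Read at library startup.
os.environ["MXNET_ENFORCE_DETERMINISM"] = "1"
```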
### Submodule updates

* update mshadow (v1.3.x) / [#13122](https://github.com/apache/incubator-mxnet/pull/13122)

Update mshadow for OMP acceleration when nvcc is not present.

### Known issues

The test test_operator.test_dropout has issues and has been disabled on the branch:

* Disable flaky test test_operator.test_dropout (v1.3.x) / [#13200](https://github.com/apache/incubator-mxnet/pull/13200)

For more information and examples, see the [full release notes](https://cwiki.apache.org/confluence/x/eZGzBQ).

## 1.3.0

### New Features - Gluon RNN layers are now HybridBlocks

R-package/DESCRIPTION

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 Package: mxnet
 Type: Package
 Title: MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems
-Version: 1.3.0
+Version: 1.3.1
 Date: 2017-06-27
 Author: Tianqi Chen, Qiang Kou, Tong He
 Maintainer: Qiang Kou <qkou@qkou.info>

contrib/clojure-package/README.md

Lines changed: 5 additions & 5 deletions
@@ -42,11 +42,11 @@ and _Install MXNet dependencies_

 To use the prebuilt jars (easiest), you will need to replace the native version of the line in the project dependencies with your configuration.

-`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-gpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-gpu "1.3.1"]`
 or
-`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.3.1"]`
 or
-`[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.3.1"]`

 If you are using the prebuilt jars, they may have slightly different dependencies than when building from source:

@@ -114,7 +114,7 @@ Checkout the latest sha from the main package

 If you need to check out a particular release you can do it with:

-`git checkout tags/1.2.1 -b release-1.2.1`
+`git checkout tags/1.3.1 -b release-1.3.1`

 `git submodule update --init --recursive`

@@ -126,7 +126,7 @@ Go here to do the base package installation https://mxnet.incubator.apache.org/i

 Run `make scalapkg` then `make scalainstall`

-then replace the correct jar for your architecture in the project.clj, example `[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.3.0-SNAPSHOT"]`
+then replace the correct jar for your architecture in the project.clj, example `[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.3.1-SNAPSHOT"]`

 #### Test your installation

contrib/clojure-package/examples/cnn-text-classification/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,6 +19,6 @@
 :description "CNN text classification with MXNet"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
 :pedantic? :skip
 :main cnn-text-classification.classifier)

contrib/clojure-package/examples/gan/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,6 +19,6 @@
 :description "GAN MNIST with MXNet"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
 [nu.pattern/opencv "2.4.9-7"]]
 :main gan.gan-mnist)

contrib/clojure-package/examples/imclassification/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,6 +19,6 @@
 :description "Clojure examples for image classification"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
 :pedantic? :skip
 :main imclassification.train-mnist)

contrib/clojure-package/examples/module/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@
 :description "Clojure examples for module"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
 :pedantic? :skip
 :main mnist-mlp)

contrib/clojure-package/examples/multi-label/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,5 +19,5 @@
 :description "Example of multi-label classification"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
 :main multi-label.core)

contrib/clojure-package/examples/neural-style/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@
 :description "Neural Style Transfer with MXNet"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
 [net.mikera/imagez "0.12.0"]
 [thinktopic/think.image "0.4.16"]]
 :main neural-style.core)

contrib/clojure-package/examples/pre-trained-models/project.clj

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@
 :description "Example of using pre-trained models with MXNet"
 :plugins [[lein-cljfmt "0.5.7"]]
 :dependencies [[org.clojure/clojure "1.9.0"]
-[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+[org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
 [net.mikera/imagez "0.12.0"]
 [thinktopic/think.image "0.4.16"]]
 :main pre-trained-models.fine-tune)
