This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Support sparse for custom python operators #8620

Merged
merged 26 commits on Nov 16, 2017

Conversation

anirudh2290
Member

@anirudh2290 anirudh2290 commented Nov 12, 2017

Description

This PR adds sparse support for custom Python operators. Specifically, users should be able to use sparse storage types with custom Python operators; currently only the default storage type is supported.

Checklist

Essentials

  • Passed code style checking (make lint)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • For user-facing API changes, API doc string has been updated. For new C++ functions in header files, their functionalities and arguments are well-documented.
  • To my best knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

Changes

  • Support for infer storage type in forward pass
  • Support for infer storage type in backward pass
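The two hooks above can be sketched in plain Python. This is a mock of the `CustomOpProp` pattern, not the actual MXNet API surface; the class name `SqrProp` and the defaulting rule (outputs and auxiliary states inherit the first input's stype) follow the diff excerpts in this thread, but details may differ from the merged code.

```python
# Minimal mock of the storage-type inference hooks this PR adds.
# No mxnet import: this only models the inference logic itself.

class SqrProp:
    def list_arguments(self):
        return ['data']

    def list_outputs(self):
        return ['output']

    def list_auxiliary_states(self):
        return ['aux']

    def infer_storage_type(self, in_stype):
        # Forward pass: keep input stypes; outputs and aux default to
        # the first input's stype (the default shown in the diff).
        return in_stype, \
               [in_stype[0]] * len(self.list_outputs()), \
               [in_stype[0]] * len(self.list_auxiliary_states())

    def infer_storage_type_backward(self, in_stype):
        # Backward pass: same defaulting rule, assumed here.
        return in_stype, \
               [in_stype[0]] * len(self.list_outputs()), \
               [in_stype[0]] * len(self.list_auxiliary_states())

prop = SqrProp()
args, outs, auxs = prop.infer_storage_type(['csr'])
```

A user overriding these hooks would return explicit stype lists instead of relying on the first-input default.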

Comments

@piiswrong @eric-haibin-lin

@eric-haibin-lin eric-haibin-lin self-assigned this Nov 12, 2017
        return in_stype, [in_stype[0]]*len(self.list_outputs()), \
               [in_stype[0]]*len(self.list_auxiliary_states())

    def infer_storage_type(self, in_stype):
Member

nit: would it be more natural to put def infer_storage_type before def infer_storage_type_backward?

Member Author

Fixed this

Returns
-------
in_stype : list
list of argument stypes. Can be modified from in_stype.
Member

What does "Can be modified from in_stype" mean? Can they be modified when they are already set?

Member Author

No, it cannot be modified if it is already set. It can be modified only if the storage type is undefined. I have removed the line to avoid confusion.
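The "modified only if undefined" rule can be sketched in plain Python. The string `'undefined'` stands in for `kUndefinedStorage`, and the function name mirrors the `STORAGE_TYPE_ASSIGN_CHECK` macro for readability; this is an illustration, not the actual MXNet implementation.

```python
# Sketch of "assign only if undefined" semantics: a slot that is still
# undefined accepts any stype; a slot that is already set must match.

def storage_type_assign_check(stypes, idx, requested):
    """Assign `requested` at `idx` if still undefined; otherwise the
    existing value must already equal `requested` for the check to pass."""
    if stypes[idx] == 'undefined':
        stypes[idx] = requested
        return True
    return stypes[idx] == requested

stypes = ['undefined', 'csr']
assert storage_type_assign_check(stypes, 0, 'default')      # undefined: assigned
assert storage_type_assign_check(stypes, 1, 'csr')          # already set, matches
assert not storage_type_assign_check(stypes, 1, 'default')  # already set, conflict
```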

for (size_t i = 0; i < oattr->size(); i++) {
STORAGE_TYPE_ASSIGN_CHECK(*oattr, i, kDefaultStorage);
}
dispatch_mode_assign(dispatch_mode, DispatchMode::kFComputeEx);
Member

Should we use DISPATCH_MODE_ASSIGN_CHECK instead?

Member Author

Yes, changed.
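The distinction behind this review comment can be sketched in plain Python (illustrative names, not the actual C++ macros): a plain assign silently skips a slot that is already set, while the CHECK variant surfaces a conflicting value as an error.

```python
# Plain assign vs. assign-with-check, modelled on the pattern discussed
# above. UNDEFINED stands in for an unset dispatch mode.

UNDEFINED = None

def dispatch_mode_assign(modes, idx, requested):
    # Plain assign: writes only when the slot is still undefined; a
    # conflict with an existing value goes unnoticed by the caller.
    if modes[idx] is UNDEFINED:
        modes[idx] = requested

def dispatch_mode_assign_check(modes, idx, requested):
    # CHECK variant: same assignment, but a conflicting existing value
    # is raised as an error instead of being silently ignored.
    dispatch_mode_assign(modes, idx, requested)
    if modes[idx] != requested:
        raise ValueError(
            f"dispatch mode conflict: {modes[idx]!r} vs {requested!r}")
```

Using the CHECK form in the operator's storage-inference path means an inconsistent dispatch-mode request fails loudly at registration time rather than producing a silently wrong dispatch.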

if in_data[0].stype == 'default':
aux[0][:] = 1
self.assign(out_data[0], req[0], in_data[0]*in_data[0])
else:
Member

Could you add a check here:

if data.stype == 'csr': 
    assert(isinstance(data, CSRNDArray))

check_numeric_gradient(op, [x], [aux])

x = mx.nd.array(np.random.uniform(-1, 1, size=(4, 10)))
Member

Is this line duplicated on purpose?

@@ -3570,12 +3570,19 @@ def test_rcbrt_op():
def test_custom_op():
class Sqr(mx.operator.CustomOp):
def forward(self, is_train, req, in_data, out_data, aux):
self.assign(out_data[0], req[0], in_data[0]*in_data[0])
aux[0][:] = 1
#self.assign(out_data[0], req[0], in_data[0]*in_data[0])
Member

Remove unused code?

@piiswrong piiswrong merged commit 938eda9 into apache:master Nov 16, 2017
piiswrong added a commit to piiswrong/mxnet that referenced this pull request Nov 20, 2017
cjolivier01 pushed a commit that referenced this pull request Nov 21, 2017
eric-haibin-lin pushed a commit to eric-haibin-lin/mxnet that referenced this pull request Dec 3, 2017
* Add asscipy support and coo format support

* Comment misalignment change

* Add documentation for Sparse NDarray

* Change comment

* Adding comments and support for dtype

* Modifying tests

* Add spsp None check

* Fix lint

* Custom operators for sparse

* Use DISPATCH_MODE_ASSIGN_CHECK

* Change NDArray to _ndarray_cls

* Remove redundant code

* Add a test to make sure the NDArray is an instance of CSRNDArray

* Fix lint

* Fix test

* Trigger CI
eric-haibin-lin pushed a commit to eric-haibin-lin/mxnet that referenced this pull request Dec 3, 2017
zhreshold pushed a commit to zhreshold/mxnet that referenced this pull request Dec 14, 2017
rahul003 pushed a commit to rahul003/mxnet that referenced this pull request Jun 4, 2018
rahul003 pushed a commit to rahul003/mxnet that referenced this pull request Jun 4, 2018