
[Numpy] unknown type_flag=7 #17638

Closed
zheyuye opened this issue Feb 20, 2020 · 3 comments · Fixed by #17674

@zheyuye
Contributor

zheyuye commented Feb 20, 2020

Description

A series of issues related to kBool occurred after PR #17438 and PR #4571 in TVM. This is a serious problem that makes many of the deep numpy features unusable.
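For context, type_flag=7 is the boolean dtype (kBool) that comparison operators return under the numpy interface. A minimal imperative sketch (my illustration, assuming an MXNet build with numpy support enabled) of where the bool intermediate comes from:

import mxnet as mx
mx.npx.set_np()

# In numpy mode, comparisons produce a bool-dtype ndarray; this is the
# kBool (type_flag=7) value the graph memory planner rejects once the
# same computation is hybridized.
a = mx.np.ones((10,))
mask = a < 2
print(mask.dtype)  # expected: bool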

Here is a simple error case.

Error Message

---------------------------------------------------------------------------
MXNetError                                Traceback (most recent call last)
<ipython-input-2-48e3c90810f4> in <module>
     17 foo = Foo()
     18 foo.hybridize()
---> 19 out = foo(mx.np.ones((10,), ctx=mx.gpu()))

~/incubator-mxnet/python/mxnet/gluon/block.py in __call__(self, *args)
    680             hook(self, args)
    681 
--> 682         out = self.forward(*args)
    683 
    684         for hook in self._forward_hooks.values():

~/incubator-mxnet/python/mxnet/gluon/block.py in forward(self, x, *args)
   1175                                      'Find all contexts = {}'.format(ctx_set))
   1176                 with ctx:
-> 1177                     return self._call_cached_op(x, *args)
   1178             with ctx:
   1179                 try:

~/incubator-mxnet/python/mxnet/gluon/block.py in _call_cached_op(self, *args)
   1022         cargs = [args_without_none[i] if is_arg else i.data()
   1023                  for is_arg, i in self._cached_op_args]
-> 1024         out = self._cached_op(*cargs)
   1025         if isinstance(out, NDArray):
   1026             out = [out]

~/incubator-mxnet/python/mxnet/_ctypes/ndarray.py in __call__(self, *args, **kwargs)
    167             ctypes.byref(num_output),
    168             ctypes.byref(output_vars),
--> 169             ctypes.byref(out_stypes)))
    170 
    171         if original_output is not None:

~/incubator-mxnet/python/mxnet/base.py in check_call(ret)
    244     """
    245     if ret != 0:
--> 246         raise get_last_ffi_error()
    247 
    248 

MXNetError: Traceback (most recent call last):
  File "../src/nnvm/plan_memory.cc", line 58
MXNetError: unknown type_flag=7

To Reproduce

import mxnet as mx
import numpy as np
from numpy.testing import assert_allclose
from mxnet.gluon import HybridBlock
mx.npx.set_np()

class Foo(HybridBlock):
    def __init__(self, prefix=None, params=None):
        super(Foo, self).__init__(prefix=prefix, params=params)

    def hybrid_forward(self, F, valid_length):
        mask = (F.np.ones((10,)) < valid_length).astype(np.float32)
        mask2 = (F.np.ones((10,)) < valid_length).astype(np.float32)
        mask = mask * F.np.expand_dims(mask2, axis=-1)
        return mask

foo = Foo()
foo.hybridize()
out = foo(mx.np.ones((10,), ctx=mx.gpu()))
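As a possible stopgap until the fix lands, not calling hybridize() appears to avoid the plan_memory pass that raises the error, since the imperative path allocates bool intermediates directly (a workaround sketch based on where the traceback points, not a verified fix):

foo = Foo()
# No foo.hybridize(): execution stays imperative, so the cached-op
# graph pass in plan_memory.cc that trips on kBool is never run.
out = foo(mx.np.ones((10,), ctx=mx.gpu()))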

Comments

@sxjscience @yzhliu

zheyuye added the Bug label Feb 20, 2020
@yzhliu
Member

yzhliu commented Feb 20, 2020

Looking into it. Thanks for reporting.

@sxjscience
Member

I found that it can be further simplified:

import mxnet as mx
import numpy as np
from numpy.testing import assert_allclose
from mxnet.gluon import HybridBlock
mx.npx.set_np()

class Foo(HybridBlock):
    def __init__(self, prefix=None, params=None):
        super(Foo, self).__init__(prefix=prefix, params=params)

    def hybrid_forward(self, F, valid_length):
        mask = (F.np.ones((10,)) < valid_length).astype(np.float32)
        return mask

foo = Foo()
foo.hybridize()
out = foo(mx.np.ones((10,), ctx=mx.gpu()))
print(out)

@sxjscience
Member

import mxnet as mx
import numpy as np
import os
from numpy.testing import assert_allclose
from mxnet.gluon import HybridBlock
mx.npx.set_np()

os.environ['DMLC_LOG_STACK_TRACE_DEPTH'] = '30'

class Foo(HybridBlock):
    def __init__(self, prefix=None, params=None):
        super(Foo, self).__init__(prefix=prefix, params=params)

    def hybrid_forward(self, F, valid_length):
        mask = (F.np.ones((10,)) < valid_length).astype(np.float32)
        return mask

foo = Foo()
foo.hybridize()
out = foo(mx.np.ones((10,), ctx=mx.gpu()))
print(out)

MXNetError: Traceback (most recent call last):
  [bt] (12) libmxnet.so(MXInvokeCachedOpEx+0x60) [0x7fce362ab420]
  [bt] (11) libmxnet.so(MXInvokeCachedOp+0x42e) [0x7fce362aad4e]
  [bt] (10) libmxnet.so(mxnet::CachedOp::Forward(std::shared_ptr<mxnet::CachedOp> const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&)+0xc77) [0x7fce36421d27]
  [bt] (9) libmxnet.so(mxnet::CachedOp::DynamicForward(mxnet::Context const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, bool)+0x20c) [0x7fce3641973c]
  [bt] (8) libmxnet.so(mxnet::CachedOp::SetForwardGraph(mxnet::CachedOp::GraphInfo*, bool, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&)+0x965) [0x7fce36418525]
  [bt] (7) libmxnet.so(mxnet::imperative::MXPlanMemory(nnvm::Graph*, std::vector<int, std::allocator<int> >&&, std::vector<unsigned int, std::allocator<unsigned int> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::pair<unsigned int, unsigned int> const&, std::pair<unsigned int, unsigned int> const&, bool)+0x1e8) [0x7fce3642f598]
  [bt] (6) libmxnet.so(nnvm::ApplyPass(nnvm::Graph, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0x213) [0x7fce362d0ab3]
  [bt] (5) libmxnet.so(nnvm::ApplyPasses(nnvm::Graph, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&)+0xecd) [0x7fce3a0b87bd]
  [bt] (4) libmxnet.so(std::_Function_handler<nnvm::Graph (nnvm::Graph), nnvm::Graph (*)(nnvm::Graph)>::_M_invoke(std::_Any_data const&, nnvm::Graph&&)+0x10a) [0x7fce365e6b9a]
  [bt] (3) libmxnet.so(+0x1f279d0) [0x7fce366169d0]
  [bt] (2) libmxnet.so(+0x1f26a6e) [0x7fce36615a6e]
  [bt] (1) libmxnet.so(+0x1f2424d) [0x7fce3661324d]
  [bt] (0) libmxnet.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x72) [0x7fce3622b982]
  File "../src/nnvm/plan_memory.cc", line 58
MXNetError: unknown type_flag=7
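A small usage note on DMLC_LOG_STACK_TRACE_DEPTH: setting it before mxnet is imported guarantees the native library sees it regardless of when the value is read (the placement is my suggestion; the variable itself is the one used above):

import os
# Export the depth before the import so the setting is visible to the
# native library no matter when it reads the environment.
os.environ['DMLC_LOG_STACK_TRACE_DEPTH'] = '30'

import mxnet as mx
mx.npx.set_np()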
