[Relay][Dyn] Dynamic TopK Op #6008
Conversation
out = _make.topk(data, k, axis, ret_type, is_ascend, dtype)
if isinstance(k, Constant):
    k = np.asscalar(k.data.asnumpy())
if isinstance(k, Expr):
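For context, a plausible reconstruction of the dispatch these lines belong to (a sketch only; the _dyn_make.topk branch is inferred from the discussion below and is not shown in this excerpt):

if isinstance(k, Constant):
    # A Relay Constant can be lowered to a plain Python int, so the static op suffices.
    k = np.asscalar(k.data.asnumpy())
if isinstance(k, Expr):
    # Any other Relay expression keeps k symbolic, so the dynamic op is used.
    out = _dyn_make.topk(data, k, axis, ret_type, is_ascend, dtype)
else:
    # A plain Python literal also takes the static path.
    out = _make.topk(data, k, axis, ret_type, is_ascend, dtype)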
I'm concerned about this case: if data's shape is not static, it needs to use _dyn_make.topk() too.
I might be missing something. If it's a Relay Constant, we go static. If it's another Relay Expr, we go dynamic. If it's a Python literal, we go static. Is there another possibility?
What I meant is data, not k. data can only be a Relay Expr. Should we always use _dyn.topk and let DynamicToStatic determine whether it can be converted to a static op?
Because TopK's out_shape depends on the shape of data, if the shape of data is dynamic, out_shape becomes dynamic, and we need to use a shape function in that case:
https://github.com/apache/incubator-tvm/blob/eafb2aa13d6cd223629f17d5f6aab5a8d4fce7f5/src/relay/op/algorithm/topk.cc#L50
shape = infer_shape(data)
if any_in_shape(shape):
    # go dynamic
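A minimal sketch of the check being proposed here (the helper name and the use of InferType are my assumptions, not code from this PR):

import tvm
from tvm import relay

def _has_dynamic_shape(expr):
    # Infer the type of the expression and look for symbolic (Any) dimensions.
    mod = relay.transform.InferType()(tvm.IRModule.from_expr(expr))
    shape = mod["main"].body.checked_type.shape
    return any(isinstance(dim, tvm.tir.Any) for dim in shape)

With a helper like this, the Python wrapper could route to _dyn_make.topk whenever either k is an Expr or the data shape is dynamic.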
The current dyn namespace is for attrs that are Relay Exprs, which makes the shape func data dependent. For a data-independent shape func we can still rely on the "static" version of the op.
> The current dyn namespace is for attrs that are Relay Exprs, which makes the shape func data dependent. For a data-independent shape func we can still rely on the "static" version of the op.
I understand that. But it's possible that an op is input-shape dependent while the input shape itself is dynamic. In that kind of case, we still need to use a shape function.
This is a similar case which doesn't involve topk but has the same issue:
v = var("v", int32)
t0 = arange(0, v, 1) # output an dynamic shape tensor
t1 = strided_slice(t0, [0], [-1], [1]) # output shape depends on a dynamic input shape, even if attrs are static
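Spelled out with the Relay Python API, that sketch might look roughly like this (my own rendering; the exact construction details are not from this PR):

from tvm import relay

v = relay.var("v", shape=(), dtype="int32")
t0 = relay.arange(relay.const(0, "int32"), v, relay.const(1, "int32"), dtype="int32")
# t0 has shape (?,): it depends on the runtime value of v.
t1 = relay.strided_slice(t0, begin=[0], end=[-1], strides=[1])
# t1's attrs are static, yet its output shape still depends on t0's dynamic shape.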
Yeah, I think this is a bit of complexity around how @icemelon9 and co implemented dynamic shapes in Relay. Basically, any op can take in Any shapes, at which point the graph becomes unrunnable on the graph runtime and has to be executed through the VM.
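As a minimal illustration (my own example, not from this PR): a function whose input has an Any dimension type-checks fine, but has to go through the VM executor rather than the graph runtime.

import numpy as np
import tvm
from tvm import relay

x = relay.var("x", shape=(relay.Any(), 4), dtype="float32")
mod = tvm.IRModule.from_expr(relay.Function([x], relay.nn.relu(x)))

# The graph runtime cannot handle the Any dimension; the VM can.
vm_exec = relay.create_executor("vm", mod=mod, target="llvm")
result = vm_exec.evaluate()(np.random.rand(3, 4).astype("float32"))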
As @kevinthesun said, what we've been working on recently is allowing for non-constant attributes; the dyn namespace is mostly to make non-constant attributes explicit. If we need to separate out all dynamic shapes, I think we'd need to implement a vm-specific op namespace, and I'm not sure we want to go that far?
@lixiaoquan The data-independent shape func is still there and can handle dynamic input shape cases. @mbrookhart I think for now we can limit the dyn namespace to Expr attrs and later consider how to have a more uniform interface.
Thanks for pointing that out, I just realized the 'static' version of the op still has some dynamic shape handling ability. I think I misunderstood the 'dyn' namespace before.
Thanks @mbrookhart @lixiaoquan @kevinthesun
* add dynamic topk op
* add topk to dynamic_to_static pass
* fix TF test
* fix pylint
@zhiics @lixiaoquan @kevinthesun
This splits dynamic TopK out of the standard op; many thanks to @kevinthesun for all of the pieces. :)
It adds the op to the DynamicToStatic pass and adds dynamic testing.
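A rough usage sketch of what this enables (my own example based on the description above; details may differ from the final API):

import tvm
from tvm import relay

x = relay.var("x", shape=(10,), dtype="float32")
k = relay.var("k", shape=(), dtype="int32")  # k is a Relay Expr, not a constant
out = relay.topk(x, k=k, axis=0)             # routed to the dynamic op since k is an Expr

mod = tvm.IRModule.from_expr(relay.Function([x, k], out.astuple()))
# DynamicToStatic converts dyn ops back to static ones when their attrs turn out to be
# constant (a no-op here, since k is still a free variable).
mod = relay.transform.DynamicToStatic()(mod)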
Thanks!