
[Frontend][TFLite] Try Infer the value of shape expr to avoid dynamic #12313

Merged
1 commit merged into apache:main from fix_reshape_dynamic on Aug 10, 2022

Conversation

@blackkker (Contributor) commented on Aug 5, 2022

When importing the yolov4 and seresenet_xx models, I got a dynamic shape error: Check failed: (pval != nullptr) is false: Cannot allocate memory symbolic tensor shape [?, ?].

I figured out that this happens when the shape argument is a relay.Expr, in which case _dyn_make.reshape is called. Under normal circumstances _dyn_make.reshape is not a problem, but it works incorrectly in these cases.
So I try to infer the value of the shape expr to avoid the dynamic op (see the sketch below). The code follows #9459, #6504 and #7712.
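Roughly, the idea is the following. This is a minimal sketch under assumptions, not the actual patch: the helper name reshape_prefer_static is made up, and it leans on try_infer_value from tvm.relay.frontend.common.

# Hypothetical sketch of the approach; function name and wiring are not the PR's code.
import numpy as np
from tvm import relay
from tvm.relay.frontend.common import try_infer_value


def reshape_prefer_static(data, shape):
    """Reshape `data` to `shape`, folding the shape to constants when possible.

    `shape` may be a plain list/tuple, or a relay.Expr produced by a shape
    subgraph (the yolov4 / seresenet_xx case that otherwise ends in dyn.reshape).
    """
    if isinstance(shape, relay.Expr):
        # Try to evaluate the shape subgraph to concrete integers.
        shape, _ = try_infer_value(
            shape, lambda ret: ret.astype(np.int64).tolist()
        )
        # On failure, `shape` stays a relay.Expr and relay.reshape below
        # falls back to the dynamic dyn.reshape op, as before.
    return relay.reshape(data, newshape=shape)

When inference succeeds, the static reshape path is taken and downstream ops see a fully known shape instead of [?, ?].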


@blackkker (Contributor, Author)

The module prints as below:

{
  %1122 = concatenate(%1120) /* ty=Tensor[(2), int32] */;
  %1123 = dyn.reshape(%1121, %1122, newshape=[]) /* ty=Tensor[(?, ?), float32] */;
  %1124 = reshape(%1123, newshape=[-1, 2048]) /* ty=Tensor[(?, 2048), float32] */;
  %1125 = nn.dense(%1124, %v_param_342, units=1000) /* ty=Tensor[(?, 1000), float32] */;
  %1126 = expand_dims(%v_param_343, axis=0) /* ty=Tensor[(1, 1000), float32] */;
  %1127 = add(%1125, %1126) /* ty=Tensor[(?, 1000), float32] */;
  nn.softmax(%1127) /* ty=Tensor[(?, 1000), float32] */
}

@blackkker changed the title from "[TFLite] Try Infer the value of shape expr to avoid dynamic" to "[Frontend][TFLite] Try Infer the value of shape expr to avoid dynamic" on Aug 5, 2022
@blackkker force-pushed the fix_reshape_dynamic branch from f73cc85 to e375933 on August 5, 2022 08:07
@blackkker (Contributor, Author)

cc @AndrewZhaoLuo

@AndrewZhaoLuo (Contributor)

Will take a look tomorrow or today

@AndrewZhaoLuo merged commit 8115849 into apache:main on Aug 10, 2022
xinetzone pushed a commit to daobook/tvm that referenced this pull request Nov 25, 2022
mikeseven pushed a commit to mikeseven/tvm that referenced this pull request Sep 27, 2023