
[fx2trt] Engineholder feature improvement, test fixes #1143

Merged 2 commits into master from fb-sync-wwei6 on Jun 23, 2022
Conversation

@frank-wei (Contributor) commented on Jun 23, 2022

Description

7babb0b82c599ba26b2d2c01d3abdff7e8cbf150 Shiyan Deng [email protected] Support multistream with dynamic shape in TRT engineholder
4ae1863e28ff30c23795c4a7a79461dd76484143 Ankur Singla [email protected] Back out "[const_fold] Set requires_grad based on the folded tensor; add device_for_folding option"
5ca806253f880c49f77ce583fac5b3407eb422b1 Andrew Or [email protected] Fix test failures caused by D37088095
14812710f33e0c61728f7a2bbcb5a166063f5686 Shirong Wu [email protected] MTS conv1d fuse pass
05f6a66f3e4db210340c6a067a65656f2deac10c Oleg Khabinov [email protected] [fx2trt] Fix tests after upgrading TRT
e6dd390a38fd7507881a1a6323cbedb979278ea3 Wei Wei [email protected] [fx2trt] move common_fx2trt.py into fx folder
671464256263361dcabdbdea61bdb10cdd4fc82a Janet Yang [email protected] [fx2trt] fix vanilla convolution test
86b711c5720223a7ee81b3928b532c2851fc1b26 Shirong Wu [email protected] Allow specify opt_profile_size for trt
32d880619b0c5a49d03cd9f39a49c652b7be63f4 wwei6 [email protected] [fx2trt] move common_fx2trt.py into fx folder
Fixes # (issue)

Type of change

Please delete options that are not relevant and/or add your own.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • My code follows the style guidelines of this project (You can use the linters)
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas and hacks
  • I have made corresponding changes to the documentation
  • I have added tests to verify my fix or my feature
  • New and existing unit tests pass locally with my changes
  • I have added the relevant labels to my PR so that relevant reviewers are notified

Wei Wei added 2 commits June 22, 2022 21:17
@github-actions bot commented:

Code conforms to C++ style guidelines

@yinghai left a comment:

We also need to do an import to sync oss code into internal, right?

@frank-wei (Contributor, Author) commented on Jun 23, 2022:

"We also need to do an import to sync oss code into internal, right?"

Yes, I plan to do it after this merge. Exporting needs to be done first.

@frank-wei frank-wei merged commit dfbf9b5 into master Jun 23, 2022
@frank-wei frank-wei deleted the fb-sync-wwei6 branch June 23, 2022 05:54