
Tiny fix of mmedit/apis/test.py #261

Merged: 4 commits merged into open-mmlab:master from fix_test on Apr 21, 2021

Conversation

Yshuo-Li (Collaborator)

In the current test program (mmedit/apis/test.py), the GPU memory occupied by already-processed data is not released in time, which causes CUDA out-of-memory errors when the test data is large. This PR fixes that issue.
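
A minimal sketch of the kind of change described here is shown below. The empty_cache flag, the loop body, and the model call signature are illustrative assumptions rather than the exact code in mmedit/apis/test.py.

import torch


def single_gpu_test_sketch(model, data_loader, empty_cache=False):
    """Run inference batch by batch, optionally freeing cached GPU memory."""
    model.eval()
    results = []
    with torch.no_grad():
        for data in data_loader:
            output = model(test_mode=True, **data)
            # Keep only CPU copies so the GPU tensors can be reclaimed.
            results.append({k: v.cpu() if torch.is_tensor(v) else v
                            for k, v in output.items()})
            del output
            if empty_cache:
                # Release cached blocks held by the PyTorch caching allocator.
                torch.cuda.empty_cache()
    return results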

@innerlee (Contributor)

Nice. Is this a common problem for all repos?

@innerlee (Contributor) commented on Apr 19, 2021

Could you please test how this change affects inference time?
Some discussions suggest that calling empty_cache will slow things down.
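
One rough way to measure this is sketched below; the helper name and the model call signature are assumptions, and the actual overhead depends on the model and data.

import time

import torch


def time_test_loop(model, data_loader, empty_cache):
    """Time one pass over the data loader, optionally emptying the CUDA cache."""
    torch.cuda.synchronize()
    start = time.time()
    with torch.no_grad():
        for data in data_loader:
            model(test_mode=True, **data)
            if empty_cache:
                torch.cuda.empty_cache()
    torch.cuda.synchronize()
    return time.time() - start


# Example comparison (model and data_loader supplied by the caller):
# baseline = time_test_loop(model, data_loader, empty_cache=False)
# with_clear = time_test_loop(model, data_loader, empty_cache=True)
# print(f'empty_cache overhead: {with_clear - baseline:.2f} s')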

@codecov bot commented on Apr 19, 2021

Codecov Report

Merging #261 (546499c) into master (2541089) will decrease coverage by 1.40%.
The diff coverage is 0.00%.


@@            Coverage Diff             @@
##           master     #261      +/-   ##
==========================================
- Coverage   81.34%   79.93%   -1.41%     
==========================================
  Files         158      159       +1     
  Lines        7773     7941     +168     
  Branches     1152     1177      +25     
==========================================
+ Hits         6323     6348      +25     
- Misses       1306     1449     +143     
  Partials      144      144              
Flag        Coverage Δ
unittests   79.93% <0.00%> (-1.41%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmedit/apis/test.py 25.25% <0.00%> (-0.53%) ⬇️
mmedit/models/restorers/basicvsr.py 69.23% <0.00%> (-4.27%) ⬇️
...edit/models/backbones/sr_backbones/basicvsr_net.py 87.87% <0.00%> (-0.68%) ⬇️
mmedit/models/backbones/sr_backbones/__init__.py 100.00% <0.00%> (ø)
mmedit/models/backbones/sr_backbones/iconvsr.py 14.10% <0.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 2541089...546499c.

@@ -82,6 +83,8 @@ def multi_gpu_test(model,
save_path (str): The path to save image. Default: None.
iteration (int): Iteration number. It is used for the save image name.
Default: None.
limited_gpu (bool): Limited CUDA memory or not. Default: False.
Review comment (Contributor):

Suggestion: rename limited_gpu -> empty_cache and document it as "Empty GPU cache in each iteration or not".

@@ -83,8 +83,7 @@ def multi_gpu_test(model,
save_path (str): The path to save image. Default: None.
iteration (int): Iteration number. It is used for the save image name.
Default: None.
limited_gpu (bool): Limited CUDA memory or not. Default: False.
If limited_gpu and not gpu_collect, empty cache in every batch.
empty_cache (bool): empty cache in every batch. Default: False.
Review comment (Contributor):

Suggestion: "batch" -> "iteration", i.e. document it as emptying the cache in every iteration.

tools/test.py Outdated
@@ -88,7 +88,7 @@ def main():
model = build_model(cfg.model, train_cfg=None, test_cfg=cfg.test_cfg)

args.save_image = args.save_path is not None
limited_gpu = cfg.limited_gpu is not None and cfg.limited_gpu
empty_cache = cfg.empty_cache is not None and cfg.empty_cache
Review comment (Contributor):

Suggestion: use cfg.get('empty_cache', False) instead.
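
As a small illustration of this suggestion (the config contents below are made up): mmcv's Config exposes dict-style get(), so a missing key cleanly falls back to a default without an explicit None check.

from mmcv import Config

cfg = Config(dict(model=dict(type='BasicVSR')))  # 'empty_cache' is not set here

# Suggested pattern: a single call with a safe default.
empty_cache = cfg.get('empty_cache', False)
print(empty_cache)  # -> False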

@innerlee (Contributor)

I suggest adding an item to the FAQ (https://github.com/open-mmlab/mmediting/blob/master/docs/faq.md) demonstrating how to use this option.
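
A hedged sketch of what such an FAQ entry might show; the key name follows the empty_cache rename discussed above, and the exact config path and command-line arguments are illustrative.

# In the test config (or appended to an existing config file):
empty_cache = True  # empty the CUDA cache every iteration to bound GPU memory

# Then run the usual test entry point, for example:
#   python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} --save-path work_dirs/results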

@innerlee innerlee merged commit d1320da into open-mmlab:master Apr 21, 2021
@Yshuo-Li Yshuo-Li deleted the fix_test branch April 29, 2021 11:26
Yshuo-Li added a commit to Yshuo-Li/mmediting that referenced this pull request Jul 15, 2022
* tiny fix

* Tiny Fix, add limited_gpu.

* Tiny Fix

* Tiny Fix

Co-authored-by: liyinshuo <[email protected]>