Bump to detrex v0.3.0 #236

Merged
merged 1 commit on Mar 17, 2023
README.md (17 changes: 10 additions & 7 deletions)
@@ -60,13 +60,16 @@ The repo name detrex has several interpretations:
- <font color=#008000> <b> de-t.rex </b> </font>: de means 'the' in Dutch. T.rex, also called Tyrannosaurus Rex, means 'king of the tyrant lizards' and connects to our research work 'DINO', which is short for Dinosaur.

## What's New
v0.2.1 was released on 01/02/2023:
- Support **MaskDINO** COCO instance segmentation.
- Support new **DINO** baselines: `ViTDet-DINO`, `Focal-DINO`.
- Support the `FocalNet` backbone.
- Add tutorials on downloading pretrained backbones and verifying installation.
- Modify learning-rate scheduler usage and add a tutorial on customized schedulers.
- Add more readable logging information for criterion and matcher.
v0.3.0 was released on 03/17/2023:
- Support new algorithms, including `Anchor-DETR` and `DETA`.
- Release more than 10 pretrained models (including converted weights): `DETR-R50 & R101`, `DETR-R50 & R101-DC5`, `DAB-DETR-R50 & R101-DC5`, `DAB-DETR-R50-3patterns`, `Conditional-DETR-R50 & R101-DC5`, `DN-DETR-R50-DC5`, `Anchor-DETR`, and the `DETA-Swin-o365-finetune` model, which achieves **`62.9 AP`** on COCO val.
- Support **MaskDINO** on the ADE20k semantic segmentation task.
- Support `EMAHook` during training by setting `train.model_ema.enabled=True`, which can improve model performance; DINO with EMA achieves **`49.4 AP`** with only 12 epochs of training.
- Support mixed-precision training by setting `train.amp.enabled=True`, which **reduces GPU memory usage by 20% to 30%**.
- Support `train.fast_dev_run=True` for **fast debugging** (see the config sketch after this list).
- Support **encoder-decoder checkpointing** in DINO, which may reduce GPU memory usage by about **30%**.
- Support Slurm training scripts contributed by @rayleizhu; see issue [#213](https://github.com/IDEA-Research/detrex/issues/213) for details.
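Not part of this PR, but as a rough illustration of how the new training toggles above might be used: detrex builds on detectron2's LazyConfig system, so the flags can be set on a loaded config object. This is a minimal sketch; the config path below is hypothetical, and the attribute names simply follow the release notes above.

```python
# Minimal sketch (not from this PR) of enabling the new v0.3.0 training
# toggles via detectron2's LazyConfig, which detrex builds on.
from detectron2.config import LazyConfig

# Hypothetical config path; substitute your own project config.
cfg = LazyConfig.load("projects/dino/configs/dino_r50_4scale_12ep.py")

# Keep an exponential moving average (EMA) of the model weights; the release
# notes report 49.4 AP for DINO with EMA after 12 epochs.
cfg.train.model_ema.enabled = True

# Mixed-precision training; the release notes report a 20% to 30% reduction
# in GPU memory usage.
cfg.train.amp.enabled = True

# Run an abbreviated training loop to smoke-test the pipeline before a full run.
cfg.train.fast_dev_run = True
```

The same toggles can usually also be passed as command-line overrides to the training script (e.g. `train.amp.enabled=True`), following detectron2's LazyConfig override syntax.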


Please see [changelog.md](./changlog.md) for details and release history.

changlog.md (11 changes: 11 additions & 0 deletions)
@@ -1,5 +1,16 @@
## Change Log

### v0.3.0 (17/03/2023)
- Support new algorithms, including `Anchor-DETR` and `DETA`.
- Release more than 10 pretrained models (including converted weights): `DETR-R50 & R101`, `DETR-R50 & R101-DC5`, `DAB-DETR-R50 & R101-DC5`, `DAB-DETR-R50-3patterns`, `Conditional-DETR-R50 & R101-DC5`, `DN-DETR-R50-DC5`, `Anchor-DETR`, and the `DETA-Swin-o365-finetune` model, which achieves **`62.9 AP`** on COCO val.
- Support **MaskDINO** on the ADE20k semantic segmentation task.
- Support `EMAHook` during training by setting `train.model_ema.enabled=True`, which can improve model performance; DINO with EMA achieves **`49.4 AP`** with only 12 epochs of training.
- Support mixed-precision training by setting `train.amp.enabled=True`, which **reduces GPU memory usage by 20% to 30%**.
- Support `train.fast_dev_run=True` for **fast debugging**.
- Support **encoder-decoder checkpointing** in DINO, which may reduce GPU memory usage by about **30%**.
- Support Slurm training scripts contributed by @rayleizhu; see issue [#213](https://github.com/IDEA-Research/detrex/issues/213) for details.


### v0.2.1 (01/02/2023)
#### New Algorithm
- MaskDINO COCO instance-seg/panoptic-seg pre-release [#154](https://github.com/IDEA-Research/detrex/pull/154)
docs/source/changelog.md (11 changes: 11 additions & 0 deletions)
@@ -1,5 +1,16 @@
## Change Log

### v0.3.0 (17/03/2023)
- Support new algorithms, including `Anchor-DETR` and `DETA`.
- Release more than 10 pretrained models (including converted weights): `DETR-R50 & R101`, `DETR-R50 & R101-DC5`, `DAB-DETR-R50 & R101-DC5`, `DAB-DETR-R50-3patterns`, `Conditional-DETR-R50 & R101-DC5`, `DN-DETR-R50-DC5`, `Anchor-DETR`, and the `DETA-Swin-o365-finetune` model, which achieves **`62.9 AP`** on COCO val.
- Support **MaskDINO** on the ADE20k semantic segmentation task.
- Support `EMAHook` during training by setting `train.model_ema.enabled=True`, which can improve model performance; DINO with EMA achieves **`49.4 AP`** with only 12 epochs of training.
- Support mixed-precision training by setting `train.amp.enabled=True`, which **reduces GPU memory usage by 20% to 30%**.
- Support `train.fast_dev_run=True` for **fast debugging**.
- Support **encoder-decoder checkpointing** in DINO, which may reduce GPU memory usage by about **30%**.
- Support Slurm training scripts contributed by @rayleizhu; see issue [#213](https://github.com/IDEA-Research/detrex/issues/213) for details.


### v0.2.1 (01/02/2023)
#### New Algorithm
- MaskDINO COCO instance-seg/panoptic-seg pre-release [#154](https://github.com/IDEA-Research/detrex/pull/154)
setup.py (2 changes: 1 addition & 1 deletion)
@@ -31,7 +31,7 @@
from torch.utils.cpp_extension import CUDA_HOME, CppExtension, CUDAExtension

# detrex version info
version = "0.2.1"
version = "0.3.0"
package_name = "detrex"
cwd = os.path.dirname(os.path.abspath(__file__))
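As a side note, a quick way to confirm the bump took effect after reinstalling is to print the installed version. This is a hypothetical check that assumes the detrex package exposes `__version__`, which this diff does not show:

```python
# Hypothetical sanity check after reinstalling; assumes the detrex package
# exposes __version__ (not shown in this diff).
import detrex

print(detrex.__version__)  # expected: 0.3.0
```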
