* first tsm cn config
* template
* tin c3d
* csn
* i3d
* r31d
* slowonly slowfast
* not full tanet
* ssn bsn
* half bmn
* bmn bsn
* polish
1 parent dea32a5 · commit 24d1b7e · 18 changed files with 1,394 additions and 7 deletions.
# BMN

## Introduction

[ALGORITHM]

```BibTeX
@inproceedings{lin2019bmn,
  title={Bmn: Boundary-matching network for temporal action proposal generation},
  author={Lin, Tianwei and Liu, Xiao and Li, Xin and Ding, Errui and Wen, Shilei},
  booktitle={Proceedings of the IEEE International Conference on Computer Vision},
  pages={3889--3898},
  year={2019}
}
```

[DATASET]

```BibTeX
@article{zhao2017cuhk,
  title={Cuhk \& ethz \& siat submission to activitynet challenge 2017},
  author={Zhao, Y and Zhang, B and Wu, Z and Yang, S and Zhou, L and Yan, S and Wang, L and Xiong, Y and Lin, D and Qiao, Y and others},
  journal={arXiv preprint arXiv:1710.08011},
  volume={8},
  year={2017}
}
```

## Model Zoo

### ActivityNet feature

|config |feature | gpus | AR@100 | AUC | AP@0.5 | AP@0.75 | AP@0.95 | mAP | gpu_mem(M) | inference time (s) | ckpt | log | json |
|:-:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:-:|:-:|:-:|:-:|:-:|
|[bmn_400x100_9e_2x8_activitynet_feature](/configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py) |cuhk_mean_100 |2|75.28|67.22|42.47|31.31|9.92|30.34|5420|3.27|[ckpt](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature_20200619-42a3b111.pth)| [log](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log)| [json](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log.json)|
| |mmaction_video |2|75.43|67.22|42.62|31.56|10.86|30.77|5420|3.27|[ckpt](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809-c9fd14d2.pth)| [log](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.log) | [json](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.json) |
| |mmaction_clip |2|75.35|67.38|43.08|32.19|10.73|31.15|5420|3.27|[ckpt](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809-10d803ce.pth)| [log](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.log) | [json](https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.json) |
| [BMN-official](https://github.com/JJBOY/BMN-Boundary-Matching-Network) (for reference)* |cuhk_mean_100 |-|75.27|67.49|42.22|30.98|9.22|30.00|-|-|-| - | - |

Notes:

1. The **gpus** column refers to the number of GPUs used to obtain the model weights. By default, the config files provided by MMAction2 assume training with 8 GPUs.
   According to the [linear scaling rule](https://arxiv.org/abs/1706.02677), if you use a different number of GPUs or a different number of videos per GPU, the learning rate should be scaled in proportion to the batch size,
   e.g. lr=0.01 for 4 GPUs x 2 videos/gpu and lr=0.08 for 16 GPUs x 4 videos/gpu (see the sketch after these notes).
2. For the **feature** column, `cuhk_mean_100` denotes the widely used CUHK ActivityNet feature extracted with the [anet2016-cuhk](https://github.com/yjxiong/anet2016-cuhk) repo, while `mmaction_video` and `mmaction_clip` denote features extracted with MMAction, using a video-level or a clip-level ActivityNet-finetuned model respectively.
3. To evaluate BMN, MMAction2 assigns a label to each temporal proposal using the [anet_cuhk_2017](https://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json) submission from the ActivityNet 2017 untrimmed video classification track.

*MMAction2 trains BMN with the [original repository](https://github.com/JJBOY/BMN-Boundary-Matching-Network) and evaluates its proposal generation and temporal detection results with the corresponding labels from [anet_cuhk_2017](https://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json).
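To make the scaling rule in note 1 concrete, here is a minimal sketch; the 0.00125 per-video factor is inferred from the two examples above, not an official constant:

```shell
# Linear scaling rule (sketch): lr is proportional to the total batch size.
# From the examples above: 0.01 / (4 GPUs * 2 videos/gpu) = 0.00125 per video.
NUM_GPUS=16
VIDEOS_PER_GPU=4
python -c "print(0.00125 * ${NUM_GPUS} * ${VIDEOS_PER_GPU})"  # -> 0.08
```
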
For details on data preparation, refer to the ActivityNet feature section of [Data Preparation](/docs_zh_CN/data_preparation.md).

## Train

You can use the following command to train a model.

```shell
python tools/train.py ${CONFIG_FILE} [optional arguments]
```

Example: train BMN on ActivityNet features.

```shell
python tools/train.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py
```
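
For multi-GPU training, a minimal sketch assuming the standard MMAction2 `tools/dist_train.sh` launcher is available in this repository (the table above lists 2 GPUs for this config):

```shell
# Hypothetical distributed launch; adjust the GPU count to your setup.
bash tools/dist_train.sh configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py 2
```
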
For more details on training, refer to the **Training setting** part in [getting started](/docs_zh_CN/getting_started.md#训练配置).

## Test

You can use the following command to test a model.

```shell
python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
```

Example: test BMN on ActivityNet features.

```shell
# Note: if metric evaluation is required, make sure the annotation file of the test data contains ground truth labels.
python tools/test.py configs/localization/bmn/bmn_400x100_2x8_9e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json
```

You can also evaluate the temporal action detection performance of the model, using the [anet_cuhk_2017](https://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json) prediction file and the generated proposal file (`results.json` in the command above):

```shell
python tools/analysis/report_map.py --proposal path/to/proposal_file
```
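
For instance, assuming the test command above was run with `--out results.json`, the generated file can be passed in directly:

```shell
# Uses the proposal file produced by the test command above.
python tools/analysis/report_map.py --proposal results.json
```
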
Notes:

1. (Optional) You can use the following command to convert the proposal file into a formatted one that can be fed into action recognizers (currently only SSN and P-GCN are supported, not TSN, I3D, etc.) to obtain classification results for the temporal proposals.

    ```shell
    python tools/data/activitynet/convert_proposal_format.py
    ```

For more details on testing, refer to the **Test a dataset** part in [getting started](/docs_zh_CN/getting_started.md#测试某个数据集).
# BSN

## Introduction

[ALGORITHM]

```BibTeX
@inproceedings{lin2018bsn,
  title={Bsn: Boundary sensitive network for temporal action proposal generation},
  author={Lin, Tianwei and Zhao, Xu and Su, Haisheng and Wang, Chongjing and Yang, Ming},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  pages={3--19},
  year={2018}
}
```

## Model Zoo

### ActivityNet feature

|config |feature | gpus | pretrain | AR@100 | AUC | gpu_mem(M) | iter time (s) | ckpt | log | json |
|:--|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:-:|
|bsn_400x100_1x16_20e_activitynet_feature |cuhk_mean_100 |1| None |74.66|66.45|41(TEM)+25(PEM)|0.074(TEM)+0.036(PEM)|[ckpt_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature_20200619-cd6accc3.pth) [ckpt_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature_20210203-1c27763d.pth)| [log_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature.log) [log_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature.log)| [json_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature.log.json) [json_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature.log.json)|
| |mmaction_video |1| None |74.93|66.74|41(TEM)+25(PEM)|0.074(TEM)+0.036(PEM)|[ckpt_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809-ad6ec626.pth) [ckpt_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809-aa861b26.pth)| [log_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809.log) [log_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809.log) | [json_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809.json) [json_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809.json) |
| |mmaction_clip |1| None |75.19|66.81|41(TEM)+25(PEM)|0.074(TEM)+0.036(PEM)|[ckpt_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809-0a563554.pth) [ckpt_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809-e32f61e6.pth)| [log_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809.log) [log_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809.log) | [json_tem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809.json) [json_pem](https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809.json) |

Notes:

1. The **gpus** column refers to the number of GPUs used to obtain the model weights. By default, the config files provided by MMAction2 assume training with 8 GPUs.
   According to the [linear scaling rule](https://arxiv.org/abs/1706.02677), if you use a different number of GPUs or a different number of videos per GPU, the learning rate should be scaled in proportion to the batch size,
   e.g. lr=0.01 for 4 GPUs x 2 videos/gpu and lr=0.08 for 16 GPUs x 4 videos/gpu.
2. For the **feature** column, `cuhk_mean_100` denotes the widely used CUHK ActivityNet feature extracted with the [anet2016-cuhk](https://github.com/yjxiong/anet2016-cuhk) repo, while `mmaction_video` and `mmaction_clip` denote features extracted with MMAction, using a video-level or a clip-level ActivityNet-finetuned model respectively.

For details on data preparation, refer to the ActivityNet feature section of [Data Preparation](/docs_zh_CN/data_preparation.md).

## Train

You can use the following command to train a model.

```shell
python tools/train.py ${CONFIG_FILE} [optional arguments]
```

Examples:

1. Train the BSN(TEM) model on ActivityNet features.

    ```shell
    python tools/train.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py
    ```

2. Train the BSN(PEM) model on the results of PGM.

    ```shell
    python tools/train.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py
    ```

For more details on training, refer to the **Training setting** part in [getting started](/docs_zh_CN/getting_started.md#训练配置).

## Inference

You can use the following commands to run model inference.

1. TEM model inference.

    ```shell
    # Note: this command cannot perform metric evaluation.
    python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
    ```

2. PGM model inference.

    ```shell
    python tools/bsn_proposal_generation.py ${CONFIG_FILE} [--mode ${MODE}]
    ```

3. PEM model inference.

    ```shell
    python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
    ```

|
||
例如 | ||
|
||
1. 利用预训练模型进行 BSN(TEM) 模型的推理。 | ||
|
||
```shell | ||
python tools/test.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth | ||
``` | ||
|
||
2. 利用预训练模型进行 BSN(PGM) 模型的推理 | ||
|
||
```shell | ||
python tools/bsn_proposal_generation.py configs/localization/bsn/bsn_pgm_400x100_activitynet_feature.py --mode train | ||
``` | ||
|
||
3. 推理 BSN(PEM) 模型,并计算 'AR@AN' 指标,输出结果文件。 | ||
|
||
```shell | ||
# 注意:如果需要进行指标验证,需确测试数据的保标注文件包含真实标签 | ||
python tools/test.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json | ||
``` | ||
|
||
## 如何测试 | ||
|
||
用户可以使用以下指令进行模型测试。 | ||
|
||
1. TEM | ||
|
||
```shell | ||
# 注意:该命令无法进行指标验证 | ||
python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments] | ||
``` | ||
|
||
2. PGM | ||
|
||
```shell | ||
python tools/bsn_proposal_generation.py ${CONFIG_FILE} [--mode ${MODE}] | ||
``` | ||
|
||
3. PEM | ||
|
||
```shell | ||
python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments] | ||
``` | ||
|
||
例如: | ||
|
||
1. 在 ActivityNet 数据集上测试 TEM 模型。 | ||
|
||
```shell | ||
python tools/test.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth | ||
``` | ||
|
||
2. 在 ActivityNet 数据集上测试 PGM 模型。 | ||
|
||
```shell | ||
python tools/bsn_proposal_generation.py configs/localization/bsn/bsn_pgm_400x100_activitynet_feature.py --mode test | ||
``` | ||
|
||
3. 测试 PEM 模型,并计算 'AR@AN' 指标,输出结果文件。 | ||
|
||
```shell | ||
python tools/test.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py checkpoints/SOME_CHECKPOINT.pth --eval AR@AN --out results.json | ||
``` | ||
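
The three stages form a pipeline; below is a minimal sketch that simply chains the example commands above, run in order (the checkpoint file names are placeholders):

```shell
# 1. TEM inference (no metric evaluation at this stage)
python tools/test.py configs/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature.py checkpoints/TEM_CHECKPOINT.pth
# 2. PGM proposal generation on the test split
python tools/bsn_proposal_generation.py configs/localization/bsn/bsn_pgm_400x100_activitynet_feature.py --mode test
# 3. PEM evaluation with AR@AN, dumping proposals to results.json
python tools/test.py configs/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature.py checkpoints/PEM_CHECKPOINT.pth --eval AR@AN --out results.json
```
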
Notes:

1. (Optional) You can use the following command to convert the proposal file into a formatted one that can be fed into action recognizers (currently only SSN and P-GCN are supported, not TSN, I3D, etc.) to obtain classification results for the temporal proposals.

    ```shell
    python tools/data/activitynet/convert_proposal_format.py
    ```

For more details on testing, refer to the **Test a dataset** part in [getting started](/docs_zh_CN/getting_started.md#测试某个数据集).
# SSN

## Introduction

[ALGORITHM]

```BibTeX
@InProceedings{Zhao_2017_ICCV,
  author = {Zhao, Yue and Xiong, Yuanjun and Wang, Limin and Wu, Zhirong and Tang, Xiaoou and Lin, Dahua},
  title = {Temporal Action Detection With Structured Segment Networks},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  month = {Oct},
  year = {2017}
}
```

## Model Zoo

|config | gpus | backbone | pretrain | mAP@0.3 | mAP@0.4 | mAP@0.5 | reference mAP@0.3 | reference mAP@0.4 | reference mAP@0.5 | gpu_mem(M) | ckpt | log | json | reference ckpt | reference json |
|:-:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:--:|:-:|:-:|:-:|:-:|:-:|:--:|:--:|
|[ssn_r50_450e_thumos14_rgb](/configs/localization/ssn/ssn_r50_450e_thumos14_rgb_train.py) |8| ResNet50 | ImageNet |29.37|22.15|15.69|[27.61](https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started)|[21.28](https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started)|[14.57](https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started)|6352|[ckpt](https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/ssn_r50_450e_thumos14_rgb_20201012-1920ab16.pth)| [log](https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log)| [json](https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log.json)| [ckpt](https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/ssn_r50_450e_thumos14_rgb_ref_20201014-b6f48f68.pth)| [json](https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/20201008_103258.log.json)|

Notes:

1. The **gpus** column refers to the number of GPUs used to obtain the model weights. By default, the config files provided by MMAction2 assume training with 8 GPUs.
   According to the [linear scaling rule](https://arxiv.org/abs/1706.02677), if you use a different number of GPUs or a different number of videos per GPU, the learning rate should be scaled in proportion to the batch size,
   e.g. lr=0.01 for 4 GPUs x 2 videos/gpu and lr=0.08 for 16 GPUs x 4 videos/gpu.
2. Since SSN uses different structured temporal pyramid pooling methods at training and testing, please refer to [ssn_r50_450e_thumos14_rgb_train](/configs/localization/ssn/ssn_r50_450e_thumos14_rgb_train.py) for training and [ssn_r50_450e_thumos14_rgb_test](/configs/localization/ssn/ssn_r50_450e_thumos14_rgb_test.py) for testing.
3. MMAction2 uses TAG temporal proposals to evaluate the SSN model. For more details on data preparation, refer to [Data Preparation](/docs_zh_CN/data_preparation.md) to prepare the TAG proposals for thumos14.
4. The reference SSN model is validated on the `ResNet50` backbone, the same as MMAction2. Note that the initial settings of SSN here are the same as those of the `BNInception` backbone in the original codebase.

## Train

You can use the following command to train a model.

```shell
python tools/train.py ${CONFIG_FILE} [optional arguments]
```

Example: train SSN on the thumos14 dataset.

```shell
python tools/train.py configs/localization/ssn/ssn_r50_450e_thumos14_rgb_train.py
```

For more details on training, refer to the **Training setting** part in [getting started](/docs_zh_CN/getting_started.md#训练配置).

## Test

You can use the following command to test a model.

```shell
python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [optional arguments]
```

Example: test SSN on the thumos14 dataset.

```shell
# Note: if metric evaluation is required, make sure the annotation file of the test data contains ground truth labels.
python tools/test.py configs/localization/ssn/ssn_r50_450e_thumos14_rgb_test.py checkpoints/SOME_CHECKPOINT.pth --eval mAP
```

For more details on testing, refer to the **Test a dataset** part in [getting started](/docs_zh_CN/getting_started.md#测试某个数据集).