
FBNet Pre-trained Weights for a Custom Dataset #1050

Open
mrteera opened this issue Aug 19, 2019 · 2 comments

@mrteera

mrteera commented Aug 19, 2019

❓ Questions and Help

I'm trying to use FBNet pre-trained weights for my dataset (ROI_BOX_HEAD: NUM_CLASSES: 3), but the weights cannot be loaded with the utility _load_c2_pickled_weights(): https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/c2_model_loading.py#L192

Do you have any suggestions for loading pre-trained FBNet weights for a custom dataset?

Related comment: #507 (comment)

@Dorozhko-Anton
Contributor

_load_c2_pickled_weights() is only used to load Caffe2 checkpoints in .pkl format:

# convert Caffe2 checkpoint from pkl
if f.endswith(".pkl"):
    return load_c2_format(self.cfg, f)

To load pre-trained weights into a model with a different ROI_BOX_HEAD.NUM_CLASSES, refer to
https://github.com/facebookresearch/maskrcnn-benchmark#finetuning-from-detectron-weights-on-custom-datasets

To use the trim_detectron_model.py script suggested in the README.md with a .pth file, check out #15, more precisely #15 (comment).
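A minimal sketch of that approach for a .pth checkpoint (the input path here is a placeholder, and removekey mirrors the small helper defined in trim_detectron_model.py):

import torch

def removekey(d, listofkeys):
    # return a copy of the state dict without the class-dependent predictor weights
    r = dict(d)
    for key in listofkeys:
        del r[key]
    return r

# placeholder path to the FBNet .pth checkpoint
weights = torch.load('fbnet_pretrained.pth', map_location='cpu')
trimmed = {'model': removekey(weights['model'], [
    'module.roi_heads.box.predictor.cls_score.weight',
    'module.roi_heads.box.predictor.cls_score.bias',
    'module.roi_heads.box.predictor.bbox_pred.weight',
    'module.roi_heads.box.predictor.bbox_pred.bias',
])}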

@mrteera
Author

mrteera commented Aug 30, 2019

@Dorozhko-Anton Thank you. I successfully converted the .pth model file using trim_detectron_model.py. These are my changes. I'll update again once I have trained a model with these pre-trained weights.

_d = torch.load(DETECTRON_PATH)
newdict = {}
newdict['model'] = removekey(_d['model'],
    ['module.roi_heads.box.predictor.cls_score.bias',
     'module.roi_heads.box.predictor.cls_score.weight',
     'module.roi_heads.box.predictor.bbox_pred.bias',
     'module.roi_heads.box.predictor.bbox_pred.weight'])
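One detail not shown above: the trimmed dict still has to be written to disk so the checkpointer can pick it up. A sketch, with the output path as a placeholder:

torch.save(newdict, 'fbnet_trimmed.pth')  # placeholder output path

The training config can then point MODEL.WEIGHT at that file, with ROI_BOX_HEAD.NUM_CLASSES set to 3 as in the question.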
