Semantic_Segmentation_Models

I am aiming to write various semantic segmentation models from scratch, each with a choice of different pretrained backbones.

1. DeepLabV3plus with SqueezeAndExcitation:


Paper (Squeeze-and-Excitation): https://arxiv.org/pdf/1709.01507.pdf


Implementation:

  1. DeepLabV3plus with EfficientNet as a backbone
  2. DeepLabV3plus_SqueezeExcitation with EfficientNet as a backbone
  3. DeepLabV3plus with ResNet as a backbone
  4. DeepLabV3plus with DenseNet as a backbone
  5. DeepLabV3plus with SqueezeNet as a backbone
  6. DeepLabV3plus with VGG16 as a backbone
  7. DeepLabV3plus with ResNext101 as a backbone. This requires cloning qubvel's classification_models repository (I have used the pretrained model from that repository).
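As background for the SqueezeAndExcitation variants above, here is a minimal NumPy sketch of the squeeze-and-excitation computation from the cited paper — plain arrays rather than the Keras layers used in this repo, with illustrative weight shapes and a reduction ratio chosen only for the example:

```python
import numpy as np

def squeeze_excitation(feature_map, w1, b1, w2, b2):
    """Squeeze-and-Excitation channel recalibration (Hu et al., 2017 sketch).

    feature_map: (H, W, C) activations
    w1, b1: squeeze-to-bottleneck dense weights, shapes (C, C//r) and (C//r,)
    w2, b2: bottleneck-to-gate dense weights, shapes (C//r, C) and (C,)
    """
    # Squeeze: global average pooling collapses spatial dims to a (C,) descriptor
    z = feature_map.mean(axis=(0, 1))
    # Excitation: bottleneck MLP, ReLU then sigmoid, yields per-channel gates in (0, 1)
    s = np.maximum(z @ w1 + b1, 0.0)
    s = 1.0 / (1.0 + np.exp(-(s @ w2 + b2)))
    # Scale: reweight each channel of the original feature map
    return feature_map * s
```

In a Keras model this corresponds to a GlobalAveragePooling2D followed by two Dense layers and a channel-wise multiply; the sketch shows only the math.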

2. Global Convolutional Network (GCN):

Paper link: GCN


Implementation:

  1. GCN with ResNet50_v2 as a backbone
  2. GCN with EfficientNetB0 as a backbone
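The key idea in GCN is to approximate a large k×k convolution with two separable branches (k×1 followed by 1×k, and 1×k followed by k×1), keeping the large effective receptive field while cutting parameters. A quick parameter count illustrates the saving (channel sizes here are illustrative, not this repo's exact configuration):

```python
def conv_params(k_h, k_w, c_in, c_out):
    """Weight count of a k_h x k_w convolution (bias ignored)."""
    return k_h * k_w * c_in * c_out

def gcn_branch_params(k, c_in, c_out):
    """One GCN branch: a k x 1 conv into a 1 x k conv."""
    return conv_params(k, 1, c_in, c_out) + conv_params(1, k, c_out, c_out)

# e.g. 15x15 kernel over ResNet stage-5 features mapped to 21 classes
k, c_in, c_out = 15, 2048, 21
dense = conv_params(k, k, c_in, c_out)       # one full 15x15 kernel
gcn = 2 * gcn_branch_params(k, c_in, c_out)  # two parallel separable branches
print(dense, gcn)  # the separable form needs far fewer weights
```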

3. PSPNet:


Implementation:

  1. PSPNet with ResNet50 as a backbone
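PSPNet's core component is the pyramid pooling module: average-pool the backbone features at several grid sizes, upsample each result back to the input resolution, and concatenate along channels. A NumPy sketch of that flow (the real module also applies a 1x1 convolution per scale before concatenation, omitted here; bin sizes follow the common 1/2/3/6 choice):

```python
import numpy as np

def adaptive_avg_pool(x, bins):
    """Average-pool an (H, W, C) map into a (bins, bins, C) grid."""
    h, w, c = x.shape
    out = np.empty((bins, bins, c))
    for i in range(bins):
        for j in range(bins):
            hs, he = i * h // bins, (i + 1) * h // bins
            ws, we = j * w // bins, (j + 1) * w // bins
            out[i, j] = x[hs:he, ws:we].mean(axis=(0, 1))
    return out

def pyramid_pooling(x, bin_sizes=(1, 2, 3, 6)):
    """Pool at several scales, upsample back, concatenate with the input."""
    h, w, _ = x.shape
    feats = [x]
    for b in bin_sizes:
        pooled = adaptive_avg_pool(x, b)
        # nearest-neighbour upsample back to (h, w)
        up = pooled[np.arange(h) * b // h][:, np.arange(w) * b // w]
        feats.append(up)
    return np.concatenate(feats, axis=-1)
```

With four scales the output carries five times the input's channel count, which is why the module is followed by a convolution that compresses channels again.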

4. DeepLabV3Plus_PSPNet_SE:

This model is an experiment: a DeepLabV3plus model extended with a PSPNet module for deeper semantic feature extraction and Squeeze-and-Excitation modules for channel-wise attention. I will share results if this works out well.

5. Unet:


Implementation:

  1. Unet with MobileNetV2 as a backbone
  2. Unet with EfficientNet as a backbone (coming soon)
  3. Unet with ResNet50 as a backbone (coming soon)
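For the plain encoder-decoder case, Unet's shape bookkeeping can be sketched as follows — a classic symmetric U-Net with illustrative defaults; a MobileNetV2 or ResNet50 encoder would expose its skip connections at different channel widths:

```python
def unet_shapes(input_hw=256, depth=4, base_ch=64):
    """Track feature-map sizes through a symmetric U-Net.

    Returns (encoder stages, bottleneck, decoder stages). Each decoder stage
    is (spatial size, channels after concatenating the skip, channels after
    the following conv block).
    """
    enc = []
    hw, ch = input_hw, base_ch
    for _ in range(depth):
        enc.append((hw, ch))        # conv block output, saved for the skip
        hw, ch = hw // 2, ch * 2    # 2x2 max-pool halves H/W; channels double
    bottleneck = (hw, ch)
    dec = []
    for skip_hw, skip_ch in reversed(enc):
        hw, ch = hw * 2, ch // 2              # transposed conv upsamples
        dec.append((hw, ch + skip_ch, ch))    # concat doubles channels; conv reduces
    return enc, bottleneck, dec
```

Swapping in a pretrained backbone replaces the encoder half; the decoder then concatenates whatever channel widths the chosen backbone's intermediate layers provide.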

Note: We can directly use the segmentation_models package for Unet (GitHub: https://github.com/qubvel/segmentation_models).