# ENet-keras


This is an implementation of *ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation*, ported from the original ENet-training (Lua/Torch) code to Keras.

## Installation

### Get code

```bash
git clone https://github.com/PavlosMelissinos/enet-keras.git
cd enet-keras
```

### Set up the environment

#### Dependencies

- On poetry: `poetry install`
- On Anaconda/Miniconda: `conda env create -f environment.yml`
- On pip: `pip install -r requirements.txt`

### Set up data/model

```bash
make setup
```

The setup step only creates the required directories and converts the model to the appropriate format.

## Usage

### Train on MS-COCO

```bash
make train
```
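
For a rough idea of the Keras compile/fit flow that `make train` ultimately drives, here is a minimal, self-contained sketch. The tiny model below is a placeholder, not ENet, and every shape and hyperparameter in it is an assumption rather than this project's configuration.

```python
# Toy stand-in for a Keras semantic-segmentation training run.
# NOT ENet and NOT this repo's code: the model, shapes and hyperparameters
# below are placeholders that only illustrate the compile/fit flow.
import numpy as np
from tensorflow.keras import layers, models

NUM_CLASSES = 81   # e.g. MS-COCO categories + background (assumption)
H, W = 256, 256    # input resolution (assumption)

inputs = layers.Input(shape=(H, W, 3))
x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
x = layers.Conv2D(NUM_CLASSES, 1, padding="same")(x)
outputs = layers.Softmax(axis=-1)(x)          # per-pixel class probabilities

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Random batch standing in for the real MS-COCO data pipeline.
images = np.random.rand(2, H, W, 3).astype("float32")
labels = np.eye(NUM_CLASSES, dtype="float32")[
    np.random.randint(0, NUM_CLASSES, size=(2, H, W))
]
model.fit(images, labels, epochs=1, batch_size=2)
```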

## Remaining tasks

- Clean up code
  - Remove hardcoded paths
  - Add documentation everywhere
- Test code
  - Add tests
- Fix performance (mostly a preprocessing bottleneck)
  - Remove unnecessary computations in data preprocessing
  - Index dataset category internals: fields such as `id`, `category_id`, `palette`, and `categories` correspond one-to-one, which suggests a table-like structure (see the first sketch after this list), though that may be overkill
  - (Optional) Make the data loader multithreaded (unclear how best to approach this, since multithreading is handled by Keras)
- Enhance reproducibility/usability
  - Upload a pretrained model
  - Finalize predict.py
    - Test whether it still works after the latest changes
    - Modify predict.py to load a single image or read paths from a file, instead of pulling images from the validation set (see the second sketch after this list)
- Fix bugs
  - Investigate the reason for bad results, see #11
  - Fix MSCOCOReduced, also see #9
  - ?????
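
The first sketch below shows one possible shape for the table-like category index mentioned above. The field names mirror the bullet (`id`, `category_id`, `palette`); the record values and helper names are made up for illustration.

```python
# Hypothetical layout for an indexed category table; values are illustrative.
from typing import Dict, NamedTuple, Tuple

class CategoryRecord(NamedTuple):
    id: int                         # contiguous internal index
    category_id: int                # original COCO category id
    name: str
    palette: Tuple[int, int, int]   # RGB colour used when rendering masks

RECORDS = [
    CategoryRecord(0, 0, "background", (0, 0, 0)),
    CategoryRecord(1, 1, "person", (220, 20, 60)),
    CategoryRecord(2, 3, "car", (0, 0, 142)),
]

# One-to-one fields make O(1) lookups trivial in either direction.
BY_ID: Dict[int, CategoryRecord] = {r.id: r for r in RECORDS}
BY_COCO_ID: Dict[int, CategoryRecord] = {r.category_id: r for r in RECORDS}

assert BY_COCO_ID[3].name == "car"
```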
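
The second sketch is a hypothetical single-image flow for a finalized predict.py: load a saved Keras model, preprocess one image, and write out the per-pixel argmax. The model path, preprocessing, and output layout are assumptions, not this repository's actual interface.

```python
# Hypothetical single-image prediction flow; paths and preprocessing are
# placeholders, not this repo's actual interface.
import sys

import numpy as np
from PIL import Image
from tensorflow.keras.models import load_model

model = load_model("model.h5")                     # path is an assumption
_, h, w, _ = model.input_shape                     # assumes channels-last input

img = Image.open(sys.argv[1]).convert("RGB").resize((w, h))
batch = np.expand_dims(np.asarray(img, dtype="float32") / 255.0, axis=0)

probs = model.predict(batch)[0]                    # (h, w, num_classes) assumed
class_map = probs.argmax(axis=-1).astype("uint8")  # per-pixel class ids
Image.fromarray(class_map).save("prediction.png")
```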