
Ranked-List-Loss-for-Deep-Metric-Learning

Repository Tree Structure

Dependencies

The core functions are implemented in the Caffe framework. We use the MATLAB interface (matcaffe) for data preparation.

Setup

  • Clone our repository and the submodule: simply copy and execute the following commands in the command line

    git clone git@github.com:XinshaoAmosWang/Ranked-List-Loss-for-Deep-Metric-Learning.git
    cd Ranked-List-Loss-for-Deep-Metric-Learning/
    git submodule add git@github.com:sciencefans/CaffeMex_v2.git
    git submodule init
    git submodule update
    git submodule update --remote --merge 
  • Copy the files of the new layers into the corresponding directories of the CaffeMex_v2 submodule

    cp New_Layers_by_XinshaoAmosWang/*.cpp CaffeMex_v2/src/caffe/layers/
    cp New_Layers_by_XinshaoAmosWang/*.hpp CaffeMex_v2/include/caffe/layers/
    cp New_Layers_by_XinshaoAmosWang/caffe.proto CaffeMex_v2/src/caffe/proto/
    cp New_Layers_by_XinshaoAmosWang/Makefile.config CaffeMex_v2/
  • Install dependencies on Ubuntu 16.04

    sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev protobuf-compiler
    sudo apt-get install --no-install-recommends libboost-all-dev
    sudo apt-get install libopenblas-dev
    sudo apt-get install python-dev
    sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev
  • Install MATLAB 2017b

    Download and run the install binary file

    ./install
  • Compile Caffe and the MATLAB interface

    Note: you may need to change some paths in Makefile.config according to your system environment and MATLAB installation path (example edits are shown after this list)

    cd CaffeMex_v2
    make -j8  && make matcaffe
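
As an illustration, the lines below show typical Makefile.config edits for a MATLAB/OpenBLAS build; the paths and values are examples for this setup, not the only valid ones, so adjust them to your machine:

    # Example Makefile.config edits (illustrative paths; adjust to your system)
    MATLAB_DIR := /usr/local/MATLAB/R2017b   # root of your MATLAB 2017b installation
    BLAS := open                             # use OpenBLAS (libopenblas-dev installed above)
    CUDA_DIR := /usr/local/cuda              # only needed when building with GPU support
    # CPU_ONLY := 1                          # uncomment to build without CUDA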

Usage

Examples for reproducing our results on the Stanford Online Products dataset are given below.

  • Data preparation for SOP

    Download the Stanford_Online_Products dataset from ftp://cs.stanford.edu/cs/cvgl/Stanford_Online_Products.zip

    For simplicity, you can use the data mat files in the pre_post_process directory, which are ready for the training and testing scripts. To resolve the data path, you can do either (a) or (b):

      a. Change the path within the mat files.
      b. A simpler way: create a soft link to your data, e.g.
      sudo ln -s /.../Stanford_Online_Products /home/xinshao/Papers_Projects/Data/Stanford_Online_Products
    
  • Custom data preparation

    You only need to create training/testing mat files with the same structure as SOP_TrainImagePathBoxCell.mat and SOP_TestImagePathBoxCell.mat in the directory SOP_GoogLeNet_Ori_V05/pre_pro_process.

    e.g., SOP_TrainImagePathBoxCell.mat contains TrainImagePathBoxCell, storing all image paths, and class_ids, storing their corresponding semantic labels (a minimal sketch for building such a file is given after this list).

  • Train & Test

    Run the training and testing scripts in the training folder of a specific setting defined by its corresponding prototxt folder.
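
Returning to the custom data preparation item above, here is a minimal MATLAB sketch of building a training mat file with the described structure. The image paths, labels, and output file name are placeholders for your own data; if your reference mat file also stores bounding boxes per entry, follow that file's exact format instead:

    % Minimal sketch (not one of the released scripts): build a custom training
    % mat file with the same variable names as SOP_TrainImagePathBoxCell.mat.
    % 'my_image_paths' and 'my_labels' are placeholders for your own data.
    my_image_paths = {'/path/to/class1/img001.jpg'; '/path/to/class2/img002.jpg'};
    my_labels      = [1; 2];                  % semantic class ids, one per image

    TrainImagePathBoxCell = my_image_paths;   % cell array of image paths
    class_ids             = my_labels;        % corresponding semantic labels

    save('MyDataset_TrainImagePathBoxCell.mat', 'TrainImagePathBoxCell', 'class_ids');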

Our trained model on SOP

You can use the test scripts to test the performance of our trained model in the directory Our_trained_models_on_SOP_T10_m12_pn04_iter_16000.

More Qualitative results

[Slides] [Poster]

Citation

If you find our code and paper useful in your research, please kindly cite our paper:

@InProceedings{Wang_2019_CVPR,
author = {Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Hu, Guosheng and Garnier, Romain and Robertson, Neil M.},
title = {Ranked List Loss for Deep Metric Learning},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2019}
}

Common questions

1. What does ranking mean?

The overall objective is to make the positive set rank before the negative set by a distance margin. We do not need to consider the exact order of examples within the positive and negative sets.
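
Informally, writing d_ij for the distance between the embeddings of x_i and x_j, y_ij = 1 when they share a class and 0 otherwise, and using the two distance hyper-parameters mentioned in the next answer (a negative boundary \alpha and a margin m), the per-pair constraint can be sketched as the hinge

    L(x_i, x_j) = y_{ij}\,[d_{ij} - (\alpha - m)]_+ \; + \; (1 - y_{ij})\,[\alpha - d_{ij}]_+

i.e., positives are pulled within \alpha - m of the query and negatives are pushed beyond \alpha; this only restates the ranking objective above and omits the mining and weighting steps.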

2. What are the key components that influence the performance the most?

  • Sample mining;
  • Sample weighting;
  • Two distance hyper-parameters for optimisation and regularisation jointly;
  • Exploiting a weighted combination of more data points (a minimal sketch of these ideas follows this list).
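
A minimal MATLAB sketch of the mining and weighting ideas above, for a single query inside a mini-batch; the distance vector, labels, and hyper-parameter values are placeholders, and this is an illustration of the idea, not the released training code:

    % Illustrative sketch only: mining and weighting for one query.
    % d: distances from the query to the other samples in the batch.
    % y: 1 for same-class (positive) samples, 0 for negatives.
    alpha = 1.2;   % negative boundary (placeholder value)
    m     = 0.4;   % margin between positive and negative boundaries (placeholder)
    T     = 10;    % temperature controlling how strongly harder negatives are emphasised

    % Mining: keep only the non-trivial pairs that violate their constraint.
    pos = find(y == 1 & d > alpha - m);   % positives not yet pulled inside alpha - m
    neg = find(y == 0 & d < alpha);       % negatives not yet pushed beyond alpha

    % Weighting: larger weights for closer (harder) negatives, then combine.
    w      = exp(T * (alpha - d(neg)));
    w      = w / max(sum(w), eps);
    loss_p = sum(d(pos) - (alpha - m)) / max(numel(pos), 1);
    loss_n = sum(w .* (alpha - d(neg)));
    loss   = loss_p + loss_n;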

3. How is a loss function related to deep metric learning?

Acknowledgements

Our work benefits from:

Licence

BSD 3-Clause "New" or "Revised" License

Affiliations:

  • Queen's University Belfast, UK
  • Anyvision Research Team, UK

Contact

Xinshao Wang (You can call me Amos as well) xwang39 at qub.ac.uk
