This repository has been archived by the owner on Oct 15, 2019. It is now read-only.

Integrate with PS

hjk41 edited this page Jun 3, 2015 · 7 revisions

A parameter server provides a way for learners to communicate efficiently. Minerva can be integrated with Mu's parameter server to perform distributed machine learning.

Pre-requisites

  1. Download and compile parameter server.
    git clone https://github.com/hjk41/parameter_server
    cd parameter_server
    make minerva

If everything goes well, you will get a file named libminervaps.a under the build directory.

Write PS-enabled applications

Writing distributed applications differs from writing single-machine apps. Fortunately, Minerva provides a wrapper that makes the task easy: implement three functions and you are ready to go. As an example, see {minerva_dir}/apps/ps/mnist_mlp_ps.cpp.

To leverage PS, you need to implement the following functions:

  • void InitLayer(const std::string & name, float * data, size_t size)
  • void UpdateLayer(const std::string & name, float * weight, float * grad, size_t size)
  • int MinervaWorkerMain(int workerId, int numWorkers, int argc, char ** argv)

InitLayer and UpdateLayer are executed on the parameter server, while MinervaWorkerMain is executed on the workers. As the names suggest, the first two functions initialize and update a layer, while the last is the main function run by each worker. The workerId and numWorkers parameters can be used to decide which part of the input each worker should process.
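A minimal sketch of the three functions might look like the following. The signatures come from the list above; the zero initialization, the plain SGD update rule, and the learning rate are illustrative assumptions, not the library's prescribed behavior:

```cpp
#include <cstddef>
#include <cstring>
#include <iostream>
#include <string>

// Runs on the parameter server: fill a freshly allocated layer with zeros.
// (Zero initialization is an assumption; a real app might use random init.)
void InitLayer(const std::string& name, float* data, size_t size) {
  std::memset(data, 0, size * sizeof(float));
}

// Runs on the parameter server: apply a plain SGD step (assumed update rule).
void UpdateLayer(const std::string& name, float* weight, float* grad, size_t size) {
  const float lr = 0.01f;  // illustrative learning rate
  for (size_t i = 0; i < size; ++i) {
    weight[i] -= lr * grad[i];
  }
}

// Runs on each worker: use workerId/numWorkers to pick this worker's
// share of the input, e.g. samples workerId, workerId + numWorkers, ...
int MinervaWorkerMain(int workerId, int numWorkers, int argc, char** argv) {
  std::cout << "worker " << workerId << " of " << numWorkers << "\n";
  // ... load this worker's data shard, train, push gradients to the PS ...
  return 0;
}
```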

Once these three functions are implemented, you can compile the code and link it with the parameter server.

Compiling the code

  1. Put your program under apps/ps, say apps/ps/myapp.cpp

  2. Configure Minerva to support PS: set BUILD_WITH_PS=1 in configure.in, and set PS_ROOT to your parameter server folder

  3. ./build.sh --clean && ./build.sh
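Step 2 amounts to a configure.in fragment like the following (the variable names BUILD_WITH_PS and PS_ROOT come from the text above; the path is only an example):

```shell
# In configure.in: enable PS support and point at your parameter server checkout.
BUILD_WITH_PS=1
PS_ROOT=/path/to/parameter_server   # example path; use your own clone location
```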

If all goes well, you should get release/apps/myapp, which can then be launched by the parameter server.