donatasrep/param_pipeline

 
 

param_pipeline

Optimization of hyperparameters for deep-learning models using hyperopt and snakemake.

Add your hyperparameters to params.yml and configure your model in config.yml.
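The exact schema of params.yml is defined by the repo; as a rough illustration only, a hyperparameter file of this kind might look like the following (all keys and values here are assumptions, not the repo's actual schema):

```yaml
# Hypothetical params.yml: candidate hyperparameter ranges to search over.
learning_rate: [0.0001, 0.01]
batch_size: [32, 64, 128]
dropout: [0.0, 0.5]
```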

To run four parallel jobs on a two-GPU cluster:

```shell
nohup snakemake -j 4 --resources gpu=200 mem_frac=160 >> _run_hyperas_e500_i1500.log 2>&1 &
```
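Here `-j 4` allows up to four jobs at once, and `--resources gpu=200 mem_frac=160` sets global caps on user-defined resources: snakemake only starts a job when the resources it declares fit under those caps. For the caps to have any effect, each rule must declare what it consumes, along these lines (the rule name, resource values, and command are assumptions for illustration, not taken from this repo):

```
# Hypothetical Snakefile rule: each training job claims 100 GPU units,
# so at most two such jobs run under the gpu=200 cap above.
rule train:
    resources:
        gpu=100,
        mem_frac=80
    shell:
        "python train.py"  # stand-in for the actual training command
```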

