
StochasticSearch.jl is a Julia package that provides tools and optimization algorithms for implementing Stochastic Local Search methods such as Simulated Annealing and Tabu Search. StochasticSearch.jl is an ongoing project and will implement more optimization and local search algorithms. The API provides tools for implementing parallel and distributed program autotuners.

Currently, it is possible to optimize user-defined functions with a few basic Stochastic Local Search methods, which are composed from building blocks also provided in the package. Function evaluations and technique executions are distributed between Julia workers using remote calls. It is also possible to run multiple instances of a search technique, or several different techniques, on the same problem.
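For the distribution to have any effect, Julia must be started with worker processes. As a quick sketch (this is plain Julia, not part of the StochasticSearch.jl API), launch the REPL with julia -p 4 or add workers programmatically:

# Add four local worker processes; anything defined with @everywhere
# afterwards becomes available on all of them.
addprocs(4)
println(nworkers())  # prints 4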

Installing

StochasticSearch.jl runs on Julia v0.4.0. From the Julia REPL, run:

Pkg.add("StochasticSearch")

If you want the latest version, which may be unstable, run instead:

Pkg.clone("StochasticSearch")

Documentation

Please refer to the documentation for more information and examples.

===

Example: The Rosenbrock Function

The following is a very simple example, and you can find its source code here.

We will optimize the Rosenbrock cost function. For this we must define a Configuration that represents the arguments to be tuned. We also have to create and configure a tuning run. First, let's import StochasticSearch and define the cost function:

@everywhere begin
    using StochasticSearch
    # Rosenbrock function of the two tuned parameters "i0" and "i1".
    function rosenbrock(x::Configuration, parameters::Dict{Symbol, Any})
        return (1.0 - x["i0"].value)^2 + 100.0 * (x["i1"].value - x["i0"].value^2)^2
    end
end

We use the @everywhere macro to define the function on all available Julia workers.

Cost functions must accept a Configuration and a Dict{Symbol, Any} as input. Our cost function simply ignores the parameter dictionary and uses the "i0" and "i1" parameters of the received configuration to compute a value. There is no restriction on Configuration parameter naming.
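To illustrate that the naming is free, here is a sketch of another valid cost function; the names "x" and "y" are arbitrary choices for this example, and only have to match the names used when building the Configuration:

@everywhere begin
    using StochasticSearch
    # A simple sphere function over two parameters named "x" and "y".
    function sphere(c::Configuration, parameters::Dict{Symbol, Any})
        return c["x"].value^2 + c["y"].value^2
    end
end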

Our configuration will have two FloatParameters, which are Float64 values constrained to an interval. The intervals are [-2.0, 2.0] for both parameters, and both values start at 0.0. Since the cost function already uses the names "i0" and "i1", we give the parameters those names:

configuration = Configuration([FloatParameter(-2.0, 2.0, 0.0, "i0"),
                               FloatParameter(-2.0, 2.0, 0.0, "i1")],
                               "rosenbrock_config")

Now we must configure a new tuning run using the Run type. There are many parameters to configure, but all of them have default values. Since we won't use most of them here, please see Run's source code for further details.

tuning_run = Run(cost               = rosenbrock,
                 starting_point     = configuration,
                 methods            = [[:simulated_annealing 1];
                                       [:iterative_first_improvement 1];
                                       [:randomized_first_improvement 1];
                                       [:iterative_greedy_construction 1];
                                       [:iterative_probabilistic_improvement 1];])

The methods array defines the search methods that will be used in this tuning run, along with the number of instances of each. This example uses one instance of every implemented search technique. The search will start at the point defined by starting_point.
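For instance, a run with four instances of Simulated Annealing and two of Iterative First Improvement would look like this (a sketch reusing the row syntax above):

tuning_run = Run(cost           = rosenbrock,
                 starting_point = configuration,
                 methods        = [[:simulated_annealing 4];
                                   [:iterative_first_improvement 2]])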

We are ready to create a search task using the @task macro. For more information on how tasks work, please check the Julia Documentation. This new task will run the optimize method, which receives a tuning run configuration and runs the search techniques in the background. The task will produce optimize's current best result whenever we call consume on it:

search_task = @task optimize(tuning_run)
result = consume(search_task)
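As background, and independent of StochasticSearch.jl, this is how tasks behave in plain Julia v0.4: a task suspends at each produce call and hands the produced value to the next consume:

counter = @task begin
    for i in 1:3
        produce(i)  # suspend here until the next consume
    end
end
consume(counter)  # returns 1
consume(counter)  # returns 2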

The tuning run will use the default neighboring and perturbation methods implemented by StochasticSearch.jl to find new results. Now we can process the current result; in this case we just print it and loop until optimize is done:

print(result)
while !result.is_final
    result = consume(search_task)
    print(result)
end
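A minimal variant of the same loop keeps every intermediate result for later inspection; it relies only on consume and the is_final flag used above:

results = Any[result]
while !result.is_final
    result = consume(search_task)
    push!(results, result)
end
println(length(results), " results received.")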

Running the complete example, we get output like the following (exact values vary between runs, since the search is stochastic):

% julia --color=yes examples/rosenbrock/rosenbrock.jl
[Result]
Cost              : 40.122073057715546
Found in Iteration: 1
Current Iteration : 1
Technique         : Initialize
Function Calls    : 1
  ***
[Final Result]
Cost                  : 0.03839419856300206
Found in Iteration    : 237
Current Iteration     : 1001
Technique             : Simulated Annealing
Function Calls        : 237
Starting Configuration:
  [Configuration]
  name      : rosenbrock_config
  parameters:
    [NumberParameter]
    name : i0
    min  : -2.000000
    max  : 2.000000
    value: 0.787244
    ***
    [NumberParameter]
    name : i1
    min  : -2.000000
    max  : 2.000000
    value: 0.656131
Minimum Configuration :
  [Configuration]
  name      : rosenbrock_config
  parameters:
    [NumberParameter]
    name : i0
    min  : -2.000000
    max  : 2.000000
    value: 0.813772
    ***
    [NumberParameter]
    name : i1
    min  : -2.000000
    max  : 2.000000
    value: 0.656131
