jSO algorithm #379

Open
cola9 opened this issue Mar 2, 2022 · 7 comments

cola9 commented Mar 2, 2022

Implementing the jSO algorithm in the existing library.

firefly-cpp added the enhancement, feature and good first issue labels on Mar 2, 2022
firefly-cpp (Contributor) commented:

@cola9 is working on the implementation of the jSO algorithm proposed by Brest et al.

Reference: Brest, Janez, Mirjam Sepesy Maučec, and Borko Bošković. "Single objective real-parameter optimization: Algorithm jSO." 2017 IEEE congress on evolutionary computation (CEC). IEEE, 2017.

cola9 (Author) commented Mar 23, 2022

I'm currently reviewing your niapy programming library to understand how I need to implement the jSO algorithm.

I took the DE algorithm as an example and looked at the whole implementation of the "DifferentialEvolution" (DE) class, which is clear to me.

I also looked at an example of using the DE algorithm, and there are a couple of things that are not clear to me, which I would like to ask you to explain.

On line 14, algo = DifferentialEvolution(population_size=50, differential_weight=0.5, crossover_probability=0.9), I see that you initialize the algorithm with the necessary parameters. A Task is then created on line 16 and used on line 17 when calling algo.run(task). What I don't understand is how the "run" method (from line 17) ends up running the DE algorithm and all the necessary functions ("evolve", "selection", "run_iteration", ...).

zStupan (Contributor) commented Mar 23, 2022

run_iteration is the body of the algorithm's main loop. Internally, the run method looks something like this:

def run(self, task):
    population, fitness, params = self.init_population(task)
    best_individual, best_fitness = self.get_best(population, fitness)
    while not task.stopping_condition():
        population, fitness, best_individual, best_fitness, params = self.run_iteration(task, population, fitness, best_individual, best_fitness, params)
        task.next_iter()
    return best_individual, best_fitness

The evolve, selection and post_selection methods of DifferentialEvolution are called in run_iteration:

def run_iteration(self, task, population, population_fitness, best_x, best_fitness, **params):
    r"""Core function of Differential Evolution algorithm.

    Args:
        task (Task): Optimization task.
        population (numpy.ndarray): Current population.
        population_fitness (numpy.ndarray): Current population's fitness/function values.
        best_x (numpy.ndarray): Current best individual.
        best_fitness (float): Current best individual's function/fitness value.
        **params (dict): Additional arguments.

    Returns:
        Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]:
            1. New population.
            2. New population fitness/function values.
            3. New global best solution.
            4. New global best solution's fitness/objective value.
            5. Additional arguments.

    See Also:
        * :func:`niapy.algorithms.basic.DifferentialEvolution.evolve`
        * :func:`niapy.algorithms.basic.DifferentialEvolution.selection`
        * :func:`niapy.algorithms.basic.DifferentialEvolution.post_selection`

    """
    new_population = self.evolve(population, best_x, task)
    population, best_x, best_fitness = self.selection(population, new_population, best_x, best_fitness, task=task)
    population, best_x, best_fitness = self.post_selection(population, task, best_x, best_fitness)
    population_fitness = np.asarray([x.f for x in population])
    best_x, best_fitness = self.get_best(population, population_fitness, best_x, best_fitness)
    return population, population_fitness, best_x, best_fitness, {}

DifferentialEvolution and related algorithms differ from the rest of the algorithms in the framework in that they use the Individual class to represent an individual, instead of a numpy array. The Individual class has two attributes: x, a numpy array representing the solution vector, and f, the individual's fitness. It also has the method evaluate(task), which puts the solution vector x within the bounds of the problem and computes the new fitness value. You can extend the Individual class to add extra parameters to it, as in the sketch below.
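
For example (purely illustrative; the class and attribute names here are hypothetical, not part of niapy, and the import path is what I believe the current release uses), a jSO-style individual carrying its own control parameters could look like:

from niapy.algorithms.algorithm import Individual


class IndividualJSO(Individual):
    """Hypothetical individual that stores its own F and CR, since jSO adapts them per member."""

    def __init__(self, differential_weight=0.5, crossover_probability=0.9, **kwargs):
        # Keep the base attributes (x, f) and add the extra control parameters.
        super().__init__(**kwargs)
        self.differential_weight = differential_weight
        self.crossover_probability = crossover_probability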

For other parameters that change during the runtime of the algorithm (that are not connected to the individual), you can add them to the params dict by overriding the algorithm's init_population method like so:

def init_population(self, task):
    population, fitness, params = super().init_population(task)
    params.update({'param1': 12, 'param2': 34})
    return population, fitness, params

You can then access and modify those parameters in the run_iteration method like so:

def run_iteration(self, task, population, fitness, best_individual, best_fitness, **params):
    param1 = params.pop('param1')
    param2 = params.pop('param2')
    # ...
    param1 = 8
    param2 = 42
    return population, fitness, best_individual, best_fitness, {'param1': param1, 'param2': param2}

cola9 (Author) commented Mar 25, 2022

Thank you very much for the very detailed explanation. The process is much clearer now. So, if I understand correctly, the run method (shown below) is already written (somewhere?) and cannot be changed?

def run(self, task):
    population, fitness, params = self.init_population(task)
    best_individual, best_fitness = self.get_best(population, fitness)
    while not task.stopping_condition():
        population, fitness, best_individual, best_fitness, params = self.run_iteration(task, population, fitness, best_individual, best_fitness, params)
        task.next_iter()
    return best_individual, best_fitness

I am asking this because I saw that task.stopping_condition() has predefined conditions:

    return (self.evals >= self.max_evals) or (self.iters >= self.max_iters) or (self.cutoff_value * self.optimization_type.value >= self.x_f * self.optimization_type.value)

Maybe these conditions won't suit my jSO algorithm.

zStupan (Contributor) commented Mar 25, 2022

No problem. Yes, the run method is implemented in the Algorithm base class; there is no need to change it. You only need to override run_iteration, and init_population if there are some variable runtime parameters, roughly as in the skeleton below.
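
Just as a rough skeleton (nothing here is the actual jSO implementation, it only shows which methods to override):

from niapy.algorithms.basic import DifferentialEvolution


class JSO(DifferentialEvolution):
    """Skeleton only; the real jSO logic still has to be filled in."""

    def init_population(self, task):
        population, fitness, params = super().init_population(task)
        # Put jSO's run-time state (e.g. historical memories) into params here.
        return population, fitness, params

    def run_iteration(self, task, population, population_fitness, best_x, best_fitness, **params):
        # Implement jSO's mutation, crossover, selection and parameter adaptation here.
        return population, population_fitness, best_x, best_fitness, params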

The stopping condition can be max_iters, max_evals, or cutoff_value. max_iters controls the maximum number of iterations (generations) of the algorithm, while max_evals is the maximum number of fitness function evaluations. cutoff_value means the algorithm will run until a fitness value <= cutoff_value is found.
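
For example, a task limited to an evaluation budget could be constructed like this (assuming I have the current Task parameter names right; 'sphere' is just a placeholder problem):

from niapy.task import Task

# Stop after 10000 fitness function evaluations.
task = Task(problem='sphere', dimension=10, max_evals=10000)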

I see that jSO has some equations that use max_nfes, which is max_evals in our framework. You can access it in run_iteration with task.max_evals, and you can get the current number of function evaluations with task.evals (these are attributes of task).
If the stopping condition is max_iters, I guess you could compute max_evals from max_iters and population_size. A sketch of how task.evals and task.max_evals could be used is below.
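
For example (a hypothetical helper method on your algorithm class, not part of niapy; the thresholds follow the weighted scale factor schedule described in the jSO paper, as I read it):

def weighted_scale_factor(self, task, f):
    # Fraction of the evaluation budget used so far (nfes / max_nfes in the paper).
    ratio = task.evals / task.max_evals
    if ratio < 0.2:
        return 0.7 * f
    elif ratio < 0.4:
        return 0.8 * f
    return 1.2 * f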

cola9 (Author) commented Mar 25, 2022

Thank you very much. I think I now understand everything and can start with the implementation of the jSO algorithm.

firefly-cpp (Contributor) commented:

Hi @cola9!

How is the project coming along?
