jSO algorithm #379
@cola9 is working on the implementation of the jSO algorithm proposed by Brest et al. Reference: Brest, Janez, Mirjam Sepesy Maučec, and Borko Bošković. "Single objective real-parameter optimization: Algorithm jSO." 2017 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2017.
I'm currently reviewing your niapy programming library to understand how I need to implement the jSO algorithm. I took the DE algorithm as an example and looked at the whole implementation of the `DifferentialEvolution` (DE) class, which is clear to me. I also looked at an example of using the DE algorithm, and there are a couple of things that are not clear to me that I would like to ask you to explain. In line 14:

```python
def run(self, task):
    population, fitness, params = self.init_population(task)
    best_individual, best_fitness = self.get_best(population, fitness)
    while not task.stopping_condition():
        population, fitness, best_individual, best_fitness, params = self.run_iteration(
            task, population, fitness, best_individual, best_fitness, params)
        task.next_iter()
    return best_individual, best_fitness
```

The `evolve`, `selection` and `post_selection` methods of `DifferentialEvolution` are called in `run_iteration`: NiaPy/niapy/algorithms/basic/de.py, lines 392 to 422 in dbf2e3f.
`DifferentialEvolution` and related algorithms differ from the rest of the algorithms in the framework in that they use the `Individual` class to represent members of the population. For other parameters that change during the runtime of the algorithm (and that are not connected to an individual), you can add them to the `params` dict by overriding the algorithm's `init_population`:

```python
def init_population(self, task):
    population, fitness, params = super().init_population(task)
    params.update({'param1': 12, 'param2': 34})
    return population, fitness, params
```

You can then access and modify those parameters in the `run_iteration` method like so:

```python
def run_iteration(self, task, population, fitness, best_individual, best_fitness, params):
    param1 = params.pop('param1')
    param2 = params.pop('param2')
    # ...
    param1 = 8
    param2 = 42
    return population, fitness, best_individual, best_fitness, {'param1': param1, 'param2': param2}
```
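To make the pattern above concrete, here is a minimal, self-contained sketch of how `run` threads the `params` dict through successive `run_iteration` calls. `ToyAlgorithm`, the 1-D sphere fitness, and the fixed iteration budget are hypothetical stand-ins for illustration, not the niapy API:

```python
# Self-contained sketch of the run / run_iteration / params-dict pattern.
# ToyAlgorithm and the fixed iteration budget are hypothetical stand-ins,
# not part of the niapy API.
import random


class ToyAlgorithm:
    def init_population(self, size=10):
        population = [random.uniform(-5, 5) for _ in range(size)]
        fitness = [x * x for x in population]  # 1-D sphere function
        # runtime parameters travel in the params dict
        params = {'step': 1.0}
        return population, fitness, params

    def run_iteration(self, population, fitness, best, best_f, params):
        step = params.pop('step')
        # mutate each individual; keep improvements (greedy selection)
        for i, x in enumerate(population):
            trial = x + random.uniform(-step, step)
            if trial * trial < fitness[i]:
                population[i], fitness[i] = trial, trial * trial
        i_best = min(range(len(fitness)), key=fitness.__getitem__)
        if fitness[i_best] < best_f:
            best, best_f = population[i_best], fitness[i_best]
        step *= 0.9  # a parameter that changes over time, like jSO's adaptive controls
        return population, fitness, best, best_f, {'step': step}

    def run(self, iters=20):
        population, fitness, params = self.init_population()
        i = min(range(len(fitness)), key=fitness.__getitem__)
        best, best_f = population[i], fitness[i]
        for _ in range(iters):  # stand-in for task.stopping_condition()
            population, fitness, best, best_f, params = self.run_iteration(
                population, fitness, best, best_f, params)
        return best, best_f
```

Each iteration pops the current parameter values from the dict and returns updated ones, which is exactly how time-varying control parameters (such as jSO's) can be carried between generations without storing them on individuals.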
Thank you very much for the very detailed explanation. The process is much clearer now. So if I understand correctly, the `run` method (shown above) is already written somewhere and cannot be changed?
I am asking this because I saw that
Maybe these conditions won't suit my jSO algorithm.
No problem. Yes, the `run` method is implemented in the `Algorithm` base class. There is no need to change it; you only need to override `run_iteration` (and `init_population` if you need extra runtime parameters). The stopping condition can be either `max_iters`, `max_evals` or `cutoff_value`. `max_iters` controls the maximum number of iterations (generations) of the algorithm, while `max_evals` is the maximum number of fitness function evaluations. `cutoff_value` means the algorithm will run until a fitness value <= `cutoff_value` is found. I see the jSO paper has some equations that use `max_nfes`, the maximum number of function evaluations, which corresponds to `max_evals`.
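To illustrate how an iteration budget differs from an evaluation budget, here is a small self-contained sketch (plain Python counters, not the niapy `Task` API) for a population of a given size, assuming one fitness evaluation per individual per generation:

```python
# Sketch of the difference between an iteration budget (max_iters) and an
# evaluation budget (max_evals, jSO's max_nfes). The counters below are a
# hypothetical illustration, not the niapy Task API.
def run_with_budgets(pop_size, max_iters=None, max_evals=None):
    iters = 0
    evals = pop_size  # the initial population is evaluated once
    while True:
        if max_iters is not None and iters >= max_iters:
            break
        if max_evals is not None and evals >= max_evals:
            break
        evals += pop_size  # each generation evaluates every trial vector
        iters += 1
    return iters, evals
```

For example, with a population of 10, a budget of `max_iters=50` consumes 510 evaluations, while `max_evals=1000` allows 99 full generations. This is why equations written in terms of `max_nfes` must be mapped onto the evaluation budget rather than the iteration count.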
Thank you very much. I think that I now understand everything and can start with the implementation of the jSO algorithm.
Hi @cola9! How is the project coming along? |
Implementing the jSO algorithm in the existing library.