Replies: 2 comments 1 reply
-
Hi @MustyKey. Are you trying to minimize the objective? As a more general suggestion, if your goal is to optimize a function, we recommend using Ax rather than BoTorch. BoTorch is primarily designed for BO research; Ax uses BoTorch under the hood while offering a more intuitive interface for experiment setup. This tutorial is a good place to start: https://ax.dev/tutorials/gpei_hartmann_service.html
-
Hi @saitcakmak, thank you for your quick and insightful feedback! You're absolutely right. I corrected the code, and now the objective values are decreasing as expected. My goal is to minimize the metabolic cost of gait in a simulation, which involves optimizing the PID parameters of an actuator rather than minimizing a closed-form mathematical function. Do you think Ax is still appropriate for this kind of optimization? I appreciate your recommendation and will check out the tutorial you linked. Thanks again for your help!
-
Hi everyone,
I'm working on a project that uses Bayesian Optimization to tune PID parameters so as to minimize an objective function. However, I'm running into an issue: the best result is always one of the initial random points, and the BO loop never improves on it.
I would really appreciate it if someone could take a look at my code and help identify potential issues. Any suggestions or guidance would be very helpful! If more code snippets or details are needed, feel free to let me know.
Thanks in advance for your time!
import torch

def optimize_pid(num_initial_points, num_iterations):
    # Lower/upper bounds for the three PID parameters (e.g. Kp, Ki, Kd).
    bounds = torch.tensor([[0.0, 0.0, 0.0],
                           [5.0, 5.0, 1.0]],
                          dtype=torch.double)