Commit
update line numbers in workshop notebook
simopt-admin committed Dec 15, 2024
1 parent ecb66a8 commit 3bca0a7
Showing 1 changed file with 10 additions and 10 deletions.
20 changes: 10 additions & 10 deletions workshop/workshop.ipynb
@@ -140,11 +140,11 @@
"### Exercise \\#2\n",
"\n",
"1. Open the file simopt/model/example.py in the VS Code editor.\n",
-"2. Let's change how random search randomly samples solutions in R^2. For starters, uncomment Line 355\n",
+"2. Let's change how random search randomly samples solutions in R^2. For starters, uncomment Line 430\n",
"\n",
" `x = tuple([rand_sol_rng.uniform(-2, 2) for _ in range(self.dim)])`\n",
"\n",
-"   and comment out Line 356\n",
+"   and comment out Lines 431-437\n",
" \n",
" `x = tuple(rand_sol_rng.mvnormalvariate(mean_vec=np.zeros(self.dim), cov=np.eye(self.dim), factorized=False))`\n",
"\n",
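For readers following along outside the notebook, the two sampling strategies toggled in Exercise 2 can be sketched as below. This is an illustrative approximation, not SimOpt's actual code: a numpy `Generator` stands in for the notebook's `rand_sol_rng`, and a local `dim` plays the role of `self.dim`.

```python
# Illustrative sketch of Exercise 2's two sampling strategies.
import numpy as np

rng = np.random.default_rng(seed=0)  # stand-in for rand_sol_rng
dim = 2                              # stand-in for self.dim

# Line to uncomment: uniform sampling over the box [-2, 2]^dim.
x_uniform = tuple(rng.uniform(-2, 2) for _ in range(dim))

# Lines to comment out: multivariate-normal sampling with mean 0 and
# identity covariance (mirroring the mvnormalvariate call in the notebook).
x_normal = tuple(rng.multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim)))

print(x_uniform)
print(x_normal)
```

The uniform version confines candidate solutions to a box, while the normal version concentrates them around the origin with unbounded support.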
@@ -225,31 +225,31 @@
"### Exercise \\#3\n",
"\n",
"1. Open simopt/model/example.py again.\n",
-"2. Change the noise in the objective function evaluations to create a slightly different 2D optimization problem. This can be done by changing Line 85: \n",
+"2. Change the noise in the objective function evaluations to create a slightly different 2D optimization problem. This can be done by changing Line 99: \n",
" \n",
" `fn_eval_at_x = np.linalg.norm(x) ** 2 + noise_rng.normalvariate()`\n",
"\n",
" where `x` is a numpy array of length two. For starters, try passing the argument `sigma=10` into the function call `noise_rng.normalvariate()`. The default value is `sigma=1`, so this has the effect of increasing the common variance of the noise from 1 to 100.\n",
"3. Restart the kernel and run COMBO CODE CELL [0 + 1 + 3] below. *How have the plots changed? Why haven't they changed more?*\n",
"\n",
-"4. Next, change the underlying objective function by replacing `np.linalg.norm(x) ** 2` in Line 85 with some other two-dimensional function of `x`, e.g., `1 - np.exp(-np.linalg.norm(x) ** 2)`. (This objective function looks like an upside-down standard bivariate normal pdf, rescaled.)\n",
+"4. Next, change the underlying objective function by replacing `np.linalg.norm(x) ** 2` in Line 99 with some other two-dimensional function of `x`, e.g., `1 - np.exp(-np.linalg.norm(x) ** 2)`. (This objective function looks like an upside-down standard bivariate normal pdf, rescaled.)\n",
"5. Depending on your choice of new objective function, you MAY need to change other parts of the code, including:\n",
-"   * The gradient of `f(x)` in Line 89. For the example given above, this would need to be changed from\n",
+"   * The gradient of `f(x)` in Line 103. For the example given above, this would need to be changed from\n",
" \n",
" `gradients = {\"est_f(x)\": {\"x\": tuple(2 * x)}}`\n",
"\n",
" to\n",
"\n",
" `gradients = {\"est_f(x)\": {\"x\": tuple(2 * x * np.exp(-np.linalg.norm(x) ** 2))}}`\n",
-"   * If you change the problem to a maximization problem, you will need to change Line 173 from\n",
+"   * If you change the problem to a maximization problem, you will need to change Line 190 from\n",
" \n",
-"      `self.minmax = (-1,)`\n",
+"      `return (-1,)`\n",
" \n",
" to\n",
" \n",
-"      `self.minmax = (1,)`.\n",
-"   * The optimal solution in Line 204. (For the running example, this will not be necessary.)\n",
-"   * The optimal objective function value in Line 203. (For the running example, this will not be necessary.)\n",
+"      `return (1,)`.\n",
+"   * The optimal solution in Line 214. (For the running example, this will not be necessary.)\n",
+"   * The optimal objective function value in Line 208. (For the running example, this will not be necessary.)\n",
"6. Restart the kernel and run COMBO CODE CELL [0 + 1 + 3] below. *How have the plots changed?*\n",
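The effect of the `sigma=10` change in step 2 above can be checked empirically. In this sketch a numpy `Generator` stands in for the notebook's `noise_rng`; the sample variances come out near 1 and 100, matching the claim that the common variance of the noise rises from 1 to 100.

```python
# Empirical check of Exercise 3, step 2: passing sigma=10 instead of the
# default sigma=1 raises the noise variance from 1 to 100.
import numpy as np

rng = np.random.default_rng(seed=0)  # stand-in for noise_rng
x = np.array([1.0, 1.0])
n = 100_000

# Noisy objective evaluations f(x) = ||x||^2 + N(0, sigma^2), for each sigma.
evals_sigma1 = np.array([np.linalg.norm(x) ** 2 + rng.normal(0.0, 1.0) for _ in range(n)])
evals_sigma10 = np.array([np.linalg.norm(x) ** 2 + rng.normal(0.0, 10.0) for _ in range(n)])

print(evals_sigma1.var(), evals_sigma10.var())  # near 1 and 100, respectively
```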
"\n",
"**Extra for Experts:** Change the dimension of the problem. To do this, you will need to change the dimension of the default initial solution, defined in Line 185."
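The alternative objective suggested in step 4 and the matching gradient from step 5 can be sanity-checked with central finite differences. The names `f` and `grad_f` below are illustrative, not part of SimOpt's API; the noise term is omitted so the check is exact.

```python
# Finite-difference check of the Exercise 3 example: for
# f(x) = 1 - exp(-||x||^2), the gradient is 2 * x * exp(-||x||^2),
# matching the gradients dict entry suggested in the exercise.
import numpy as np

def f(x):
    # Upside-down, rescaled standard bivariate normal pdf shape.
    return 1.0 - np.exp(-np.linalg.norm(x) ** 2)

def grad_f(x):
    return 2.0 * x * np.exp(-np.linalg.norm(x) ** 2)

x = np.array([0.5, -1.0])
eps = 1e-6

# Central finite differences along each coordinate direction.
fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps) for e in np.eye(2)])

print(grad_f(x))
print(fd)  # should agree with grad_f(x) to high precision
```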
