DemARKs to Examples #472

Merged: 8 commits, Feb 5, 2020
3 changes: 0 additions & 3 deletions .gitmodules
@@ -1,3 +0,0 @@
[submodule "Documentation/submodules/DemARK"]
path = Documentation/submodules/DemARK
url = https://github.com/econ-ark/DemARK
1 change: 1 addition & 0 deletions Documentation/example_notebooks/GenIncProcessModel.ipynb
1 change: 1 addition & 0 deletions Documentation/example_notebooks/Gentle-Intro-To-HARK.ipynb
1 change: 1 addition & 0 deletions Documentation/example_notebooks/IndShockConsumerType.ipynb
1 change: 1 addition & 0 deletions Documentation/example_notebooks/KinkedRconsumerType.ipynb
5 changes: 5 additions & 0 deletions Documentation/example_notebooks/README.md
@@ -0,0 +1,5 @@
This directory contains symlinks to the notebooks in the top level `examples/` directory.

These symlinks are referenced in the sphinx documentation, e.g. in `index.rst`.

`nbsphinx`, the sphinx notebook extension, sees the `.ipynb` extension, and resolves the link, properly converting the notebook into Sphinx themed HTML.
9 changes: 7 additions & 2 deletions Documentation/index.rst
@@ -28,8 +28,13 @@ you might want to look at the `DemARK
:maxdepth: 2
:caption: Notebooks

submodules/DemARK/notebooks/Gentle-Intro-To-HARK
submodules/DemARK/notebooks/KinkedRconsumerType
example_notebooks/Gentle-Intro-To-HARK.ipynb
example_notebooks/PerfForesightConsumerType.ipynb
example_notebooks/IndShockConsumerType.ipynb
example_notebooks/KinkedRconsumerType.ipynb
example_notebooks/ConsPortfolioModelDoc.ipynb
example_notebooks/GenIncProcessModel.ipynb
example_notebooks/LifecycleModelExample.ipynb

.. toctree::
:maxdepth: 2
1 change: 0 additions & 1 deletion Documentation/submodules/DemARK
Submodule DemARK deleted from ad40b2
11 changes: 0 additions & 11 deletions Documentation/submodules/README.rst

This file was deleted.

696 changes: 696 additions & 0 deletions examples/ConsIndShockModel/IndShockConsumerType.ipynb

Large diffs are not rendered by default.

419 changes: 419 additions & 0 deletions examples/ConsIndShockModel/IndShockConsumerType.py


369 changes: 369 additions & 0 deletions examples/ConsIndShockModel/KinkedRconsumerType.ipynb


676 changes: 676 additions & 0 deletions examples/ConsIndShockModel/PerfForesightConsumerType.ipynb


319 changes: 319 additions & 0 deletions examples/ConsIndShockModel/PerfForesightConsumerType.py


539 changes: 539 additions & 0 deletions examples/GenIncProcessModel/GenIncProcessModel.ipynb


576 changes: 576 additions & 0 deletions examples/Gentle-Intro/Gentle-Intro-To-HARK.ipynb


305 changes: 305 additions & 0 deletions examples/Gentle-Intro/Gentle-Intro-To-HARK.py


@@ -0,0 +1,165 @@
# ---
# jupyter:
# jupytext:
# formats: ipynb,py:light
# text_representation:
# extension: .py
# format_name: light
# format_version: '1.4'
# jupytext_version: 1.2.4
# kernelspec:
# display_name: Python 3
# language: python
# name: python3
# ---

# # How we solve a model defined by the `IndShockConsumerType` class
# The `IndShockConsumerType` class represents the work-horse consumption-savings model with temporary and permanent shocks to income, finite or infinite horizons, CRRA utility, and more. In this DemARK we take you through the steps involved in solving one period of such a model. The inheritance chains can be a little long, so figuring out where all the parameters and methods come from can be confusing. Hence this map! The intention is to make it easier to inherit from `IndShockConsumerType`: you will know where to look for specific solver logic, and you will be able to figure out which methods to overwrite or supplement in your own `AgentType` and solver!
# ## The `solveConsIndShock` function
# In HARK, a period's problem is always solved by the callable (function or callable object instance) stored in the field `solveOnePeriod`. In the case of `IndShockConsumerType`, this function is called `solveConsIndShock`. The function accepts a number of arguments that it uses to construct an instance of either a `ConsIndShockSolverBasic` or a `ConsIndShockSolver`. Both solvers have the methods `prepareToSolve` and `solve`, which we will have a closer look at in this notebook. This means that the logic of `solveConsIndShock` is basically:
#
# 1. Check if cubic interpolation (`CubicBool`) or construction of the value function interpolant (`vFuncBool`) are requested. Construct an instance of `ConsIndShockSolverBasic` if neither are requested, else construct a `ConsIndShockSolver`. Call this `solver`.
# 1. Call `solver.prepareToSolve()`
# 1. Call `solver.solve()` and return the output as the current solution.
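The dispatch in the steps above can be sketched with stand-in classes (an illustration of the control flow only, not HARK's implementation; the real solvers take many constructor arguments and return `ConsumerSolution` objects):

```python
class ConsIndShockSolverBasic:
    """Stand-in for the basic solver (linear interpolation, no value function)."""
    def prepareToSolve(self):
        pass  # would grab next period's solution, define borrowing constraints, etc.
    def solve(self):
        return "basic solution"

class ConsIndShockSolver(ConsIndShockSolverBasic):
    """Stand-in for the full solver (cubic interpolation and/or value function)."""
    def solve(self):
        return "full solution"

def solveConsIndShock_sketch(vFuncBool, CubicBool):
    # 1. Pick the solver class based on the requested features
    if vFuncBool or CubicBool:
        solver = ConsIndShockSolver()
    else:
        solver = ConsIndShockSolverBasic()
    # 2. Prepare: pull information from next period's solution
    solver.prepareToSolve()
    # 3. Solve the period and return the result
    return solver.solve()
```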

# ### Two types of solvers
# As mentioned above, `solveOnePeriod` will construct an instance of the class `ConsIndShockSolverBasic` or `ConsIndShockSolver`. The main difference is whether the solver uses cubic interpolation or explicitly constructs a value function approximation. The choice and construction of a solver instance is bullet 1) from above.
#
# #### What happens upon construction
# Neither of the two solvers has its own `__init__`. `ConsIndShockSolver` inherits from `ConsIndShockSolverBasic`, which in turn inherits from `ConsIndShockSetup`. `ConsIndShockSetup` inherits from `ConsPerfForesightSolver`, which itself is just an `Object`, so we get the inheritance structure
#
# - `ConsPerfForesightSolver` $\leftarrow$ `ConsIndShockSetup` $\leftarrow$ `ConsIndShockSolverBasic` $\leftarrow$ `ConsIndShockSolver`
#
# When one of the two classes at the end of the inheritance chain is constructed, it will call `ConsIndShockSetup.__init__(args...)`. This takes a whole list of fixed inputs that then get assigned to the object through a
# ```
# ConsIndShockSetup.assignParameters(solution_next,IncomeDstn,LivPrb,DiscFac,CRRA,Rfree,PermGroFac,BoroCnstArt,aXtraGrid,vFuncBool,CubicBool)
# ```
# call, which in turn calls
# ```
# ConsPerfForesightSolver.assignParameters(self,solution_next,DiscFac,LivPrb,CRRA,Rfree,PermGroFac)
# ```
# We're getting quite detailed here, but only to help us understand the inheritance structure. The methods are quite straightforward: they simply assign the list of variables to `self`. The ones that do not get assigned by the `ConsPerfForesightSolver` method get assigned by the `ConsIndShockSetup` method instead.
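This delegation pattern is easy to mimic. Below is a minimal sketch with stand-in classes that mirror the parameter lists of the two calls above (illustrative only, not HARK's actual code):

```python
class PerfForesightSetupSketch:
    def assignParameters(self, solution_next, DiscFac, LivPrb, CRRA, Rfree, PermGroFac):
        # The parent assigns the parameters shared with the perfect foresight solver
        self.solution_next = solution_next
        self.DiscFac, self.LivPrb, self.CRRA = DiscFac, LivPrb, CRRA
        self.Rfree, self.PermGroFac = Rfree, PermGroFac

class IndShockSetupSketch(PerfForesightSetupSketch):
    def assignParameters(self, solution_next, IncomeDstn, LivPrb, DiscFac, CRRA,
                         Rfree, PermGroFac, BoroCnstArt, aXtraGrid, vFuncBool, CubicBool):
        # Delegate the shared parameters to the parent method...
        PerfForesightSetupSketch.assignParameters(
            self, solution_next, DiscFac, LivPrb, CRRA, Rfree, PermGroFac)
        # ...and assign the income-shock-specific ones here
        self.IncomeDstn, self.BoroCnstArt = IncomeDstn, BoroCnstArt
        self.aXtraGrid, self.vFuncBool, self.CubicBool = aXtraGrid, vFuncBool, CubicBool

setup = IndShockSetupSketch()
setup.assignParameters(None, "income_dstn", 0.98, 0.96, 2.0, 1.03, 1.01,
                       None, [0.1, 0.2], False, False)
```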
#
#
# After all the input parameters are set, we update the utility function definitions. Remember that we restrict ourselves to CRRA utility functions, and these are parameterized by the scalar we call `CRRA` in HARK. We use the two-argument CRRA utility (and its derivatives, inverses, etc.) from `HARK.utilities`, so we need to create a `lambda` (an anonymous function) according to the fixed `CRRA` we have chosen. This gets done through a call to
#
# ```
# ConsIndShockSetup.defUtilityFuncs()
# ```
# that itself calls
# ```
# ConsPerfForesightSolver.defUtilityFuncs()
# ```
# Again, we wish to emphasize the inheritance structure. The method in `ConsPerfForesightSolver` defines the most basic utility functions (utility, its marginal, and its marginal marginal), and `ConsIndShockSetup` adds further functions (marginal of inverse, inverse of marginal, marginal of inverse of marginal, and optionally the inverse itself if `vFuncBool` is true).
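To make the lambda trick concrete, here is a self-contained sketch of specializing a two-argument CRRA utility to a fixed `CRRA` (the `_sketch` functions are stand-ins for the ones in `HARK.utilities`):

```python
def CRRAutility_sketch(c, gam):
    # Two-argument CRRA utility: u(c) = c**(1 - gam) / (1 - gam)
    return c ** (1.0 - gam) / (1.0 - gam)

def CRRAutilityP_sketch(c, gam):
    # Marginal utility: u'(c) = c**(-gam)
    return c ** -gam

CRRA = 2.0  # the fixed coefficient of relative risk aversion
u = lambda c: CRRAutility_sketch(c, gam=CRRA)    # utility with CRRA baked in
uP = lambda c: CRRAutilityP_sketch(c, gam=CRRA)  # marginal utility with CRRA baked in
```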
#
# To sum up, the `__init__` method lives in `ConsIndShockSetup`, calls `assignParameters` and `defUtilityFuncs` from `ConsPerfForesightSolver`, and defines its own versions of these methods that add what is needed to solve the `IndShockConsumerType` using EGM. The main things controlled by the end-user are whether cubic interpolation should be used, `CubicBool`, and whether the value function should be explicitly formed, `vFuncBool`.
# ### Prepare to solve
# We are now at bullet 2) from the list above. The `prepareToSolve` method is all about grabbing relevant information from next period's solution and calculating some limiting solutions. It comes from `ConsIndShockSetup` and calls two methods:
#
# 1. `ConsIndShockSetup.setAndUpdateValues(self.solution_next,self.IncomeDstn,self.LivPrb,self.DiscFac)`
# 2. `ConsIndShockSetup.defBoroCnst(self.BoroCnstArt)`
#
# First, we have `setAndUpdateValues`. Its main purpose is to grab the relevant vectors that represent the shock distributions, the effective discount factor, and the value function (marginal, level, or marginal marginal, depending on the options). It also calculates some limiting marginal propensities to consume and human wealth levels. Second, we have `defBoroCnst`. As the name indicates, it calculates the natural borrowing constraint, handles artificial borrowing constraints, and defines the consumption function where the constraint binds (`cFuncNowCnst`).
#
# To sum up, `prepareToSolve` sets up the stochastic environment and the borrowing constraints the consumer might face. It also grabs interpolants from next period's solution.
#
# ### Solve it!
# The last method `solveConsIndShock` will call on the `solver` is `solve`. This method essentially has four steps:
# 1. Pre-processing for EGM: `solver.prepareToCalcEndOfPrdvP`
# 1. First step of EGM: `solver.calcEndOfPrdvP`
# 1. Second step of EGM: `solver.makeBasicSolution`
# 1. Add MPC and human wealth: `solver.addMPCandHumanWealth`
#
# #### Pre-processing for EGM `prepareToCalcEndOfPrdvP`
# Find the relevant values of end-of-period assets (according to `aXtraGrid` and the natural borrowing constraint) and the next-period values implied by current end-of-period assets and the stochastic elements. The method stores the following in `self`:
#
# 1. values of permanent shocks in `PermShkVals_temp`
# 1. shock probabilities in `ShkPrbs_temp`
# 1. next period resources in `mNrmNext`
# 1. current grid of end-of-period assets in `aNrmNow`
#
# The method also returns `aNrmNow`. The definition is in `ConsIndShockSolverBasic` and is not overwritten in `ConsIndShockSolver`.
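In the normalized model, next-period resources follow `mNrmNext = Rfree / (PermGroFac * PermShk) * aNrmNow + TranShk`. A numpy sketch of that mapping with made-up grids and shock values (illustrative, not the arrays HARK actually stores):

```python
import numpy as np

Rfree, PermGroFac = 1.03, 1.01
aNrmNow = np.array([0.0, 1.0, 2.0])        # end-of-period asset grid (normalized)
PermShkVals_temp = np.array([0.9, 1.1])    # permanent shock realizations
TranShkVals_temp = np.array([0.8, 1.2])    # transitory shock realizations (paired)

# Broadcast shocks against the asset grid: one row per shock pair,
# one column per end-of-period asset gridpoint
mNrmNext = (Rfree / (PermGroFac * PermShkVals_temp[:, None])) * aNrmNow[None, :] \
    + TranShkVals_temp[:, None]
```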
#
# #### First step of EGM `calcEndOfPrdvP`
# Find the marginal value of having some level of end-of-period assets today. End-of-period assets, together with the stochastic elements, imply the next-period resources at the beginning of the period calculated above. Return the result as `EndOfPrdvP`.
#
# #### Second step of EGM `makeBasicSolution`
# Apply the inverse marginal utility function to the nodes from above to find (m, c) pairs for the new consumption function in `getPointsForInterpolation`, and create the interpolants in `usePointsForInterpolation`. The latter constructs the `ConsumerSolution` that contains the current consumption function `cFunc`, the current marginal value function `vPfunc`, and the smallest possible resource level `mNrmMinNow`.
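Under CRRA utility, inverting the marginal utility is a one-liner, so the endogenous gridpoints step can be sketched as follows (illustrative numbers, not output from the actual solver):

```python
import numpy as np

CRRA = 2.0
aNrmNow = np.array([0.0, 0.5, 1.0, 2.0])         # end-of-period asset grid
EndOfPrdvP = np.array([4.0, 1.0, 0.25, 0.0625])  # marginal value at each gridpoint

# Invert u'(c) = c**(-CRRA) to get the consumption consistent with each marginal value
cNrmNow = EndOfPrdvP ** (-1.0 / CRRA)
# Endogenous gridpoints: the market resources that generate each (a, c) pair
mNrmNow = aNrmNow + cNrmNow
```

The (m, c) pairs are then handed to an interpolant constructor (linear in the `Basic` solver) to form the consumption function.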
#
# #### Add MPC and human wealth `addMPCandHumanWealth`
# Add values calculated in `defBoroCnst` now that we have a solution object to put them in.
#
# #### Special to the non-Basic solver
# We are now done, but in the `ConsIndShockSolver` (non-`Basic`!) solver there are a few extra steps. We add the steady-state level of m, and depending on the values of `vFuncBool` and `CubicBool` we also add the value function and the marginal marginal value function.

# # Let's try it in action!
# First, we define a standard lifecycle model, solve it, and then reproduce the first period's solution by hand.

import HARK.ConsumptionSaving.ConsumerParameters as Params
from HARK.ConsumptionSaving.ConsIndShockModel import IndShockConsumerType
import numpy as np
import matplotlib.pyplot as plt
LifecycleExample = IndShockConsumerType(**Params.init_lifecycle)
LifecycleExample.cycles = 1 # Make this consumer live a sequence of periods exactly once
LifecycleExample.solve()

# Let's have a look at the solution for the first time period. We should then be able to reproduce it below by running the solver steps manually.

from HARK.utilities import plotFuncs
plotFuncs([LifecycleExample.solution[0].cFunc],LifecycleExample.solution[0].mNrmMin,10)

# Let us then create a solver for the first period.

from HARK.ConsumptionSaving.ConsIndShockModel import ConsIndShockSolverBasic
solver = ConsIndShockSolverBasic(LifecycleExample.solution[1],
                                 LifecycleExample.IncomeDstn[0],
                                 LifecycleExample.LivPrb[0],
                                 LifecycleExample.DiscFac,
                                 LifecycleExample.CRRA,
                                 LifecycleExample.Rfree,
                                 LifecycleExample.PermGroFac[0],
                                 LifecycleExample.BoroCnstArt,
                                 LifecycleExample.aXtraGrid,
                                 LifecycleExample.vFuncBool,
                                 LifecycleExample.CubicBool)

solver.prepareToSolve()

# Many important values are now calculated and stored in `solver`, such as the effective discount factor, the smallest permanent income shock, and more.

solver.DiscFacEff

solver.PermShkMinNext

# These values were calculated in `setAndUpdateValues`. In `defBoroCnst`, which was also called, several things were calculated as well, for example the consumption function defined by the borrowing constraint.

plotFuncs([solver.cFuncNowCnst],solver.mNrmMinNow,10)

# Then we set up all the grids, grab the discrete shock distributions, and construct the state grids in `prepareToCalcEndOfPrdvP`.

solver.prepareToCalcEndOfPrdvP()

# Then we calculate the marginal utility of next period's resources given the stochastic environment and current grids.

EndOfPrdvP = solver.calcEndOfPrdvP()

# Then, we essentially just have to construct the (resource, consumption) pairs by completing the EGM step, and constructing the interpolants by using the knowledge that the limiting solutions are those of the perfect foresight model. This is done with `makeBasicSolution` as discussed above.

solution = solver.makeBasicSolution(EndOfPrdvP,solver.aNrmNow,solver.makeLinearcFunc)

# Lastly, we add the MPC and human wealth quantities we calculated in the method that prepared the solution of this period.

solver.addMPCandHumanWealth(solution)

# All that is left is to verify that the solution in `solution` is identical to `LifecycleExample.solution[0]`. We can plot them against each other:

plotFuncs([LifecycleExample.solution[0].cFunc, solution.cFunc],LifecycleExample.solution[0].mNrmMin,10)

# Though it's probably even clearer if we just subtract the function values from each other at some grid of points.

eval_grid = np.linspace(0, 20, 200)
LifecycleExample.solution[0].cFunc(eval_grid) - solution.cFunc(eval_grid)