\subsection{HARKcore}\label{sec:HARKcore}

Macroeconomic models in HARK use the \texttt{Market} class to represent a market (or other aggregator) that combines the actions, states, and/or shocks (generally, outcomes) of individual agents in the model into aggregate outcomes that are ``passed back'' to the agents. For example, the market in a consumption-saving model might combine the individual asset holdings of all agents in the market to generate aggregate capital in the economy, yielding the interest rate on assets (as the marginal product of capital); the individual agents then learn the aggregate capital level and interest rate, conditioning their next action on this information. Objects that microeconomic agents treat as exogenous when solving (or simulating) their model are thus endogenous at the macroeconomic level. Like \texttt{AgentType}, the \texttt{Market} class also has a \texttt{solve} method, which seeks out a dynamic general equilibrium: a ``rule'' governing the dynamic evolution of macroeconomic objects such that if agents believe this rule and act accordingly, then their collective actions generate a sequence of macroeconomic outcomes that justify the belief in that rule. For a more complete description, see section \ref{sec:Macroeconomics}.
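
To make this fixed-point logic concrete, the self-contained toy example below searches for an interest rate such that, if agents believe it and save accordingly, the aggregate capital they generate implies that same rate. It illustrates the kind of loop that \texttt{Market}'s \texttt{solve} method performs; it is not HARK code, and all functional forms and names in it are made up for the example.

\begin{verbatim}
# Toy illustration of the equilibrium search performed by Market.solve():
# find an interest rate R such that, when agents believe R and act on it,
# aggregate capital implies that same R.  (Illustrative, not HARK's API.)

def agent_saving(R, endowment=1.0):
    # toy decision rule: save a larger share when the return is higher
    return endowment * R / (1.0 + R)

def implied_R(aggregate_K, alpha=0.36):
    # marginal product of capital for a Cobb-Douglas technology
    return alpha * aggregate_K ** (alpha - 1.0)

def solve_market(n_agents=100, tolerance=1e-8):
    R = 0.10  # initial belief about the "rule" (here, just a number)
    while True:
        K = sum(agent_saving(R) for _ in range(n_agents))  # aggregate outcome
        R_new = implied_R(K)
        if abs(R_new - R) < tolerance:  # beliefs justify themselves: done
            return R_new
        R = 0.5 * (R + R_new)           # dampened update of the belief

print(solve_market())
\end{verbatim}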

Beyond the model frameworks, \texttt{HARK.core} also defines a ``supersuperclass'' called \texttt{HARKobject}. When solving a dynamic microeconomic model with an infinite horizon (or searching for a dynamic general equilibrium), it is often required to consider whether two solutions are sufficiently close to each other to warrant stopping the process (i.e.\ approximate convergence). It is thus necessary to calculate the ``distance'' between two solutions, so HARK specifies that classes should have a \texttt{distance} method that takes a single input and returns a non-negative value representing the (generally dimensionless) distance between the object in question and the input to the method. As a convenient default, \texttt{HARKobject} provides a ``universal distance metric'' that should be useful in many contexts.\footnote{Roughly speaking, the universal distance metric is a recursive supnorm, returning the largest distance between two instances, among attributes named in \texttt{distance\_criteria}. Those attributes might be complex objects themselves rather than real numbers, generating a recursive call to the universal distance metric.} When defining a new subclass of \texttt{HARKobject}, the user simply defines the attribute \texttt{distance\_criteria} as a list of strings naming the attributes of the class that should be compared when calculating the distance between two instances of that class. For example, the class \texttt{ConsumerSolution} has \texttt{distance\_criteria = ['cFunc']}, indicating that only the consumption function attribute of the solution matters when comparing the distance between two instances of \texttt{ConsumerSolution}. See \href{https://hark.readthedocs.io/en/latest/generated/HARK.core.html}{here} for online documentation.
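
A minimal sketch of this pattern appears below. The class and attribute names are illustrative, but the \texttt{distance\_criteria} attribute and \texttt{distance} method are as described above.

\begin{verbatim}
from HARK.core import HARKobject

class MySolution(HARKobject):
    distance_criteria = ['vLvl']  # attributes compared by distance()

    def __init__(self, vLvl):
        self.vLvl = vLvl

old = MySolution(vLvl=1.00)
new = MySolution(vLvl=1.01)
print(new.distance(old))  # 0.01: supnorm over the named attributes
\end{verbatim}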

\subsection{HARK.utilities}\label{sec:HARKutilities}

The \texttt{HARK.utilities} module carries a double meaning in its name, as it contains both utility functions (and their derivatives, inverses, and combinations thereof) in the economic modeling sense as well as utilities in the sense of general tools. Utility functions included at this time are constant relative risk aversion and constant absolute risk aversion. Other functions in \texttt{HARK.utilities} include some data manipulation tools (e.g.\ for calculating an average of data conditional on being within a percentile range of different data), functions for constructing discrete state space grids, convenience functions for retrieving information about functions, and basic plotting tools using \texttt{matplotlib.pyplot}.
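
For instance, CRRA utility and its marginal can be evaluated directly; the sketch below assumes the function names \texttt{CRRAutility} and \texttt{CRRAutilityP} used in \texttt{HARK.utilities} at the time of writing.

\begin{verbatim}
from HARK.utilities import CRRAutility, CRRAutilityP

c, rho = 2.0, 3.0            # consumption and relative risk aversion
print(CRRAutility(c, rho))   # u(c)  = c**(1 - rho)/(1 - rho)
print(CRRAutilityP(c, rho))  # u'(c) = c**(-rho)
\end{verbatim}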

The module also includes functions for constructing discrete approximations to continuous distributions (e.g.\ \texttt{approxLognormal()} to approximate a log-normal distribution) as well as manipulating these representations (e.g.\ appending one outcome to an existing distribution, or combining independent univariate distributions into one multivariate distribution). As a convention in HARK, continuous distributions are approximated as finite discrete distributions when solving models; an $N$-dimensional random variable is formatted as a length $N+1$ list of 1D arrays, with the first element representing event probabilities and the remaining elements representing realizations of the $N$ component random variables. This both simplifies solution methods (reducing numeric integrals to simple dot products) and allows users to easily test whether their chosen degree of discretization yields a sufficient approximation to the full distribution. See \href{https://hark.readthedocs.io/en/latest/generated/HARK.utilities.html}{here} for online documentation.
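
The sketch below constructs a seven-point approximation to a lognormal shock and computes its mean as a dot product, relying on the list format just described (probabilities first, then realizations); the keyword arguments to \texttt{approxLognormal} are an assumption.

\begin{verbatim}
import numpy as np
from HARK.utilities import approxLognormal

prob, vals = approxLognormal(7, mu=0.0, sigma=0.1)  # [probabilities, values]
print(np.dot(prob, vals))  # E[shock], roughly exp(mu + sigma**2 / 2)
\end{verbatim}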

\subsection{HARK.interpolation}\label{sec:HARKinterpolation}

The \texttt{HARK.interpolation} module defines classes for representing interpolated function approximations. Interpolation methods in HARK all inherit from a superclass such as \texttt{HARKinterpolator1D} or \texttt{HARKinterpolator2D}, wrapper classes that ensure interoperability across interpolation methods. For example, \texttt{HARKinterpolator1D} specifies the methods \texttt{\_\_call\_\_} and \texttt{derivative} to accept an arbitrary array as an input and return an identically shaped array containing, respectively, the interpolated function or its first derivative evaluated at the values in the array. However, these methods do little on their own, merely reshaping arrays and referring to the \texttt{\_evaluate} and \texttt{\_der} methods, which are \textit{not actually defined in} \texttt{HARKinterpolator1D}. Each subclass of \texttt{HARKinterpolator1D} specifies its own implementation of \texttt{\_evaluate} and \texttt{\_der} particular to that interpolation method, accepting and returning only 1D arrays. In this way, subclasses of \texttt{HARKinterpolator1D} are easily interchangeable with each other, as all methods that the user interacts with are identical, varying only by ``internal'' methods.
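
The shared interface means that user code looks the same regardless of the underlying method; a brief sketch using linear interpolation (the \texttt{LinearInterp} subclass) follows.

\begin{verbatim}
import numpy as np
from HARK.interpolation import LinearInterp

f = LinearInterp(np.array([0.0, 1.0, 2.0]), np.array([0.0, 1.0, 4.0]))
x = np.array([0.5, 1.5])
print(f(x))             # __call__: evaluate at x, preserving its shape
print(f.derivative(x))  # first derivative at the same points
\end{verbatim}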

When evaluating a stopping criterion for an infinite horizon problem, it is often necessary to know the ``distance'' between functions generated by successive iterations of a solution procedure. To this end, each interpolator class in HARK must define a \texttt{distance} method that takes as an input another instance of the same class and returns a non-negative real number representing the ``distance'' between the two. As each of the \texttt{HARKinterpolatorXD} classes inherits from \texttt{HARKobject}, all interpolator classes have the default ``universal'' distance method; the user must simply list the names of the relevant attributes in the attribute \texttt{distance\_criteria} of the class.
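
Continuing the sketch above, the distance between two candidate interpolants is then a one-line check (assuming \texttt{LinearInterp} names its grid attributes in \texttt{distance\_criteria}):

\begin{verbatim}
g = LinearInterp(np.array([0.0, 1.0, 2.0]), np.array([0.0, 1.1, 4.0]))
print(f.distance(g))  # supnorm over the attributes in distance_criteria
\end{verbatim}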

Interpolation methods currently implemented in HARK include (multi)linear interpolation up to 4D, 1D cubic spline interpolation, (multi)linear interpolation over 1D interpolations (up to 4D total), (multi)linear interpolation over 2D interpolations (up to 4D total), linear interpolation over 3D interpolations, 2D curvilinear interpolation over irregular grids, and a 1D ``lower envelope'' interpolator. See \href{https://hark.readthedocs.io/en/latest/generated/HARK.interpolation.html}{here} for online documentation.

\subsection{HARK.simulation}\label{sec:HARKsimulation}

The \texttt{HARK.simulation} module provides tools for generating simulated data or shocks for post-solution use of models. Currently implemented distributions include normal, lognormal, Weibull (including exponential), uniform, Bernoulli, and discrete. For example, the consumption-saving models in \texttt{ConsIndShockModel.py} use these tools to simulate permanent and transitory income shocks as well as unemployment events. See \href{https://hark.readthedocs.io/en/latest/generated/HARK.simulation.html}{here} for online documentation.
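
A short sketch of drawing such shocks follows; the function names \texttt{drawLognormal} and \texttt{drawBernoulli} and their keyword arguments are an assumption about \texttt{HARK.simulation}'s interface at the time of writing.

\begin{verbatim}
from HARK.simulation import drawLognormal, drawBernoulli

trans_shocks = drawLognormal(N=1000, mu=0.0, sigma=0.1, seed=31382)
unemployed = drawBernoulli(N=1000, p=0.05, seed=2718)
\end{verbatim}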

\subsection{HARK.estimation}\label{sec:HARKestimation}

Methods for optimizing an objective function for the purpose of estimating a model can be found in \texttt{HARK.estimation}. As of this writing, the implementation includes only minimization by the Nelder-Mead simplex method, minimization by a derivative-free Powell method variant, and two small tools for resampling data (i.e.\ for a bootstrap); the minimizers are merely convenience wrappers (with result reporting) for optimizers included in \texttt{scipy.optimize}. Future functionality will include more robust global search methods, such as genetic algorithms, simulated annealing, and differential evolution. See \href{https://hark.readthedocs.io/en/latest/generated/HARK.estimation.html}{here} for full documentation.
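
As a sketch of the Nelder-Mead wrapper (assuming it is exposed as \texttt{minimizeNelderMead}; the toy objective is illustrative):

\begin{verbatim}
from HARK.estimation import minimizeNelderMead

def objective(params):
    a, b = params
    return (a - 1.0)**2 + (b + 2.0)**2  # minimized at (1, -2)

print(minimizeNelderMead(objective, [0.0, 0.0]))
\end{verbatim}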

\subsection{HARK.parallel}\label{sec:HARKparallel}

By default, processes in Python are single-threaded, using only a single CPU core. The \texttt{HARK.parallel} module provides basic tools for using multiple CPU cores simultaneously, with minimal effort.\footnote{\texttt{HARK.parallel} uses two packages that aren't included in the default distribution of Anaconda: \texttt{joblib} and \texttt{dill}; see step 3 of the instructions in section \ref{sec:GettingStarted} for how to install them.} In particular, it provides the function \texttt{multiThreadCommands}, which takes two arguments: a list of \texttt{AgentType}s and a list of commands as strings; each command should be a method of the \texttt{AgentType}s. The function simply distributes the \texttt{AgentType}s across threads on different cores and executes each command in order, returning no output (the \texttt{AgentType}s themselves are changed by running the commands). Equivalent results would be achieved by simply looping over each type and running each method in the list. Indeed, \texttt{HARK.parallel} also has a function called \texttt{multiThreadCommandsFake} that does just that, with identical syntax to \texttt{multiThreadCommands}; multithreading in HARK can thus be easily turned on and off.\footnote{In the future, \texttt{HARK.parallel} might be absorbed into \texttt{HARK.core} and \texttt{HARK.estimation}, particularly if \texttt{joblib} and \texttt{dill} become part of the standard Anaconda distribution.} The module also has functions for a parallel implementation of the Nelder-Mead simplex algorithm, as described in Wiswall and Lee (2011). See \href{https://hark.readthedocs.io/en/latest/generated/HARK.parallel.html}{here} for full documentation.
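
A sketch of the calling pattern (assuming \texttt{my\_agent\_types} is an existing list of \texttt{AgentType} instances, e.g.\ from \texttt{ConsIndShockModel.py}):

\begin{verbatim}
from HARK.parallel import multiThreadCommands, multiThreadCommandsFake

commands = ['solve()']  # each string is a method call run on every agent
multiThreadCommands(my_agent_types, commands)        # parallel across cores
# multiThreadCommandsFake(my_agent_types, commands)  # serial, same syntax
\end{verbatim}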

\section{Microeconomics: the AgentType Class}\label{sec:Microeconomics}
