
Further revision and edits to the model descriptions #294

Closed
wants to merge 114 commits into from

Conversation

iworld1991
Contributor

  • spelled out abbreviations like EOS
  • explained what the GHH preference accomplishes and included a reference
  • added more detail on a few terms, such as uncertainty shocks
  • corrected a few typos

TimMunday and others added 30 commits September 4, 2018 11:45
* Improvements to checkConditions that were reverted, ref #175.

* Fix the factor calculations according to the other reverted commit!

* s/impatiance/impatience

* Name the number that determines whether FVAC is met FVAF (for Factor) instead of FVAC.

* Don't print the table information twice.

* More control over warnings and information about impatience conditions.
* Fixed imports in model files

All of the consumption-saving model files used a style of import that didn't work when the file was run directly (rather than called as a module), even though they have a __main__ block (and main() function).  This has now been fixed.

Also mostly removed extraneous file RepAgentModel.py, which seems to be an old name of ConsRepAgentModel.py.  This file now simply imports all of ConsRepAgentModel and warns the user to use that instead.

* import ConsumerParameters -> import HARK.ConsumptionSaving.ConsumerParameters
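The import fix described above follows the standard Python pattern: absolute package imports work both when a file is imported as a module and when it is run directly as a script. This is a generic, illustrative sketch (the HARK module path in the comments comes from the commit message; the `main` function is a stand-in, not HARK's actual code):

```python
# Generic sketch of the import fix described above (not HARK's actual code).
# Absolute package imports, e.g.
#     import HARK.ConsumptionSaving.ConsumerParameters
# resolve both when the file is imported as a module and when it is run
# directly, whereas bare intra-package imports like
#     import ConsumerParameters
# fail in the run-directly case under Python 3.

def main():
    # Demo entry point: guarded below so importing this file has no side
    # effects, while running it directly still executes the demo.
    return "demo ran"

if __name__ == "__main__":
    main()
```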
@mnwhite this seems to be wrong, right? I don't see any `self.constructIncomeProcess`-ish statements, so it appears that the mean-one log-normal equiprobable version is hard-coded. Correct?

I think we should remove the statement unless I missed something in the code, and if it's a *planned* feature, let's just open an issue.
Remove incorrect statement in updateIncomeProcess
The MPC is calculated and stored as an attribute (MPCnow) in some models, but this was omitted in ConsGenIncProcessModel.  As it turns out, this functionality is necessary for an exercise/notebook that is being prepared for NBER SI.
One line break in a PR I merged in was invalid in Python 2.7, should now be fixed.
Add CONTRIBUTING.md file to HARK repository
* Revert "Fix one line break"

This reverts commit 39cc609.

* Revert "Merge pull request #193 from TimMunday/NanBool"

This reverts commit 4d19ea9, reversing
changes made to 7e690d9.
…instead, as these will automatically reset upon exit.
set numpy floating point error level to ignore.
We have two folders inside of /ConsumptionSaving that contain model demos.  To my knowledge, all of these have been turned into DemARK notebooks other than TryAlternativeParameters, which has no explanation or documentation.  I'm not even sure when or why it was put into HARK.

This commit removes these old files.
* Changed hardcoded updateAFunc parameters into proper parameters

Four parameters that govern how CobbDouglasEconomy.updateAFunc works
were defined locally, within the method, but are now attributes to be
assigned at init (or at least before the user tries to solve):

- update_weight --> DampingFac, now defined complementarily
- verbose --> verbose
- discard_periods --> T_discard
- max_loops --> max_loops

To prevent this from being a breaking change, the init method writes the old
hardcoded values when they are omitted from the passed inputs. This will be
improved with a warning later.

Untested, as it turns out Anaconda3 is incorrectly installed on this
computer.

* Tested de-hardcoded AFunc updating parameters

Put new parameters in dictionaries in ConsumerParameters.py.  Also added necessary lines to MarkovCobbDouglasEconomy and fixed one output description.
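The backward-compatible de-hardcoding described above can be sketched as follows. The parameter names (`DampingFac`, `verbose`, `T_discard`, `max_loops`) come from the commit message; the class name and the default values here are illustrative assumptions, not HARK's actual implementation:

```python
# Illustrative sketch (not HARK's actual code) of turning hardcoded
# method-local values into init-time attributes without breaking old code.
class CobbDouglasEconomySketch:
    def __init__(self, **kwds):
        # Fall back to stand-in "old hardcoded" defaults when the user
        # omits a parameter, so existing calling code keeps working.
        self.DampingFac = kwds.get("DampingFac", 0.5)  # assumed default
        self.verbose = kwds.get("verbose", True)       # assumed default
        self.T_discard = kwds.get("T_discard", 200)    # assumed default
        self.max_loops = kwds.get("max_loops", 20)     # assumed default
```

Any of the four can then be overridden at init, e.g. `CobbDouglasEconomySketch(max_loops=5)`.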
Make calcChoiceProbs more accurate, and make simultaneous evaluation faster (and more accurate)
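In DCEGM-style models, choice probabilities over discrete alternatives are typically logit (extreme-value taste shocks), and the standard way to make their evaluation numerically accurate is the log-sum-exp shift. This is a generic illustration of that technique, not HARK's `calcChoiceProbs` implementation:

```python
import numpy as np

def choice_probs(values, sigma):
    """Numerically stable logit choice probabilities over discrete values.

    Generic log-sum-exp sketch: subtracting the max before exponentiating
    avoids overflow when values/sigma are large, without changing the result.
    """
    v = np.asarray(values, dtype=float) / sigma
    v = v - v.max()          # log-sum-exp shift; probabilities are unchanged
    ev = np.exp(v)
    return ev / ev.sum()
```

A naive `exp(v/sigma)` would overflow for values around 1000; the shifted version returns finite, correctly normalized probabilities.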
* DCEGM

* Add small test.

* Fix test

* dcegmIntervals -> dcegmSegments

* Add convenience index in dcegmSegments instead.

* Some cleanup.

* Fix tests.

* Fix tests.
* Update some text in dcegm

As per our discussion, these were some of the small things Matt had spotted. @shaunagm

* Update dcegm.py
mnwhite and others added 24 commits May 11, 2019 12:38
IndShockConsumerType.__init__ was calling its checkConditions method, but this caused many subclasses to fail, as their checkConditions method either raises a NotImplementedError or runs into an AttributeError.

CDC wants checkConditions called automatically, so this is now done in preSolve().  All subclasses have had a trivial preSolve method added, which only calls updateSolutionTerminal.
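The restructuring described above can be sketched like this. The class and method bodies are illustrative stand-ins, not HARK's actual code; only the method names (`preSolve`, `checkConditions`, `updateSolutionTerminal`) come from the commit message:

```python
# Illustrative sketch (not HARK's actual code): move the automatic
# checkConditions call out of __init__ and into preSolve, so subclasses
# whose conditions are not implemented can override preSolve trivially.

class IndShockSketch:
    def preSolve(self):
        self.updateSolutionTerminal()
        self.checkConditions()       # checks run right before solving

    def updateSolutionTerminal(self):
        self.terminal_ready = True   # stand-in for the real update

    def checkConditions(self):
        self.conditions_checked = True  # stand-in for the real checks

class SubclassSketch(IndShockSketch):
    # Trivial preSolve: only updates the terminal solution, skipping
    # condition checks that would fail for this subclass.
    def preSolve(self):
        self.updateSolutionTerminal()
```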
Delinting at PyCon created a typo in the argument list for Market, now fixed.
Simple test of initialization of IncomeDstn of IndShockConsumerType.
Gauss-Hermite-based normal and lognormal quadrature nodes and weights
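The standard construction behind normal quadrature nodes and weights of this kind is a change of variables applied to Gauss-Hermite rules. This sketch uses NumPy's `hermgauss` (a real NumPy function) and is an illustration of the technique, not HARK's API:

```python
import numpy as np

def normal_quadrature(mu, sigma, n):
    """Nodes and weights approximating E[f(X)] for X ~ N(mu, sigma^2).

    Gauss-Hermite rules target integrals against exp(-x^2); substituting
    x = (t - mu) / (sqrt(2) * sigma) converts them to expectations under
    a normal density.
    """
    x, w = np.polynomial.hermite.hermgauss(n)  # physicists' Hermite rule
    nodes = mu + np.sqrt(2.0) * sigma * x      # change of variables
    weights = w / np.sqrt(np.pi)               # normalize: weights sum to 1
    return nodes, weights
```

With n nodes the rule is exact for polynomials up to degree 2n - 1, so the weighted nodes reproduce the mean and variance of the normal exactly. (Lognormal nodes follow by exponentiating the normal nodes.)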
Somehow the non-delinted version of HARK.core (from before PyCon) ended up in this branch.  This commit simply reverts it.
Chris and I discovered that HARK.parallel.multithreadCommands did not work on some of our project code... but only in some Python environments. On Windows, I could only generate this error on Python 3, but everything worked correctly on Python 2. On Mac and/or Linux, Chris found that the error came up when running Python in a terminal, but not when running Spyder; he also tested various web servers.

After some digging, Chris found that recent versions of joblib (which multithreadCommands uses) changed the default backend of joblib from multiprocessing to loky. Apparently something in loky does not play nicely with our class structure, as it can't (de)serialize at least some AgentType subclasses.

This fixes the issue by simply changing the backend argument on the call to Parallel. It also fixes one typo in a comment.
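The fix amounts to pinning joblib's `backend` argument explicitly rather than relying on the default (which newer joblib versions changed from `multiprocessing` to `loky`). This is a minimal sketch of that call pattern; the worker function is an illustrative stand-in for a HARK agent command:

```python
# Sketch of the backend pin described above.  `backend="multiprocessing"`
# is a real joblib option; the worker below is a stand-in, and n_jobs=1
# keeps this demo sequential (use n_jobs=-1 for all cores in practice).
from joblib import Parallel, delayed

def solve_agent(i):
    return i * i  # stand-in for solving/simulating one agent

results = Parallel(n_jobs=1, backend="multiprocessing")(
    delayed(solve_agent)(i) for i in range(4)
)
```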
Add non_empty argument validator
Remove invalid characters that break pypi uploading
* Move two test files from Testing to HARK/tests.

* Remove MultithreadedDemo.py

* Use assertAlmostEqual in the reference test in file test_TractableBufferStockModel.py.

* Fix inputs for MarkovConsumerType. Lacked T_cycle, LivPrb didn't match the number of states, and MrkvArray wasn't an array although it was time-varying.

* Compare vectors properly.

* Compare vectors properly.

* Update test_modelcomparisons.py

* Update test_modelcomparisons.py

* Update test_modelcomparisons.py

* Fix MrkvArray
@llorracc
Collaborator

llorracc commented May 25, 2019 via email

@iworld1991 iworld1991 closed this May 25, 2019
@iworld1991
Contributor Author

I confused the master branch with the Bayerluetticke branch in my fork, which caused the issue above. I will make a new pull request.

10 participants