
Testing #10

Closed
23 tasks done
sglyon opened this issue Jul 9, 2014 · 9 comments
sglyon (Member) commented Jul 9, 2014

We have talked about writing tests, so I thought we should open an issue to organize this effort. Below is a todo list of the modules that are currently (as of 7/9/14) imported into the package's main `__init__.py` file. My thought for this issue is to organize test writing so that we can track which tests have been written, which are in progress, and who is working on them. That way we can write all the tests without duplicating work (duplicating work is already unpleasant, but when it is duplication of writing tests, the pain is magnified 😡).

  • asset_pricing
  • career
  • compute_fp
  • discrete_rv
  • ecdf
  • estspec
  • ifp
  • jv
  • kalman
  • lae
  • linproc
  • lqcontrol
  • lss
  • lucastree
  • mc_tools [Matt]
  • odu
  • optgrowth
  • quadsums
  • rank_nullspace
  • riccati
  • robustlq
  • tauchen
  • quant-econ/examples/ [Matt]

I suggest that when someone starts working on tests for a particular module, they say so in a comment on this issue so that we don't end up duplicating work.

Any other suggestions for how we should proceed are encouraged.
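As a sketch of what a test for one of the listed modules could look like (the `ecdf` function below is a self-contained stand-in, not the package's actual API), a known-answer test compares output against values computed by hand:

```python
import unittest

import numpy as np


def ecdf(observations, x):
    """Stand-in empirical CDF: the fraction of observations <= x."""
    observations = np.asarray(observations)
    return np.mean(observations <= x)


class TestECDF(unittest.TestCase):
    """Known-answer tests: compare against values worked out by hand."""

    def test_known_values(self):
        obs = [1.0, 2.0, 3.0, 4.0]
        self.assertEqual(ecdf(obs, 0.5), 0.0)   # below every observation
        self.assertEqual(ecdf(obs, 2.5), 0.5)   # exactly half the sample
        self.assertEqual(ecdf(obs, 10.0), 1.0)  # above every observation
```

Run with `python -m unittest` or a runner such as nose; each module on the list would get a file of small known-answer cases like this.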

cc7768 (Member) commented Jul 18, 2014

Just a reminder: if you work on any tests, please mark them off on the list so we don't duplicate work. Feel free to work through a fork or the tests branch (whichever is applicable), but either way please comment here on what you are working on.

sglyon (Member, Author) commented Jul 31, 2014

I have written a few tests for the asset pricing module, but need some help.

I am trying to think of a very simple example that we can "know" the answer to so that I can check the accuracy of the lucas_tree, consol_price, and call_option methods. Does anyone have any ideas?
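One candidate "simple known answer" (my suggestion, not something settled in the thread): with a constant discount factor β and a constant payoff ζ, the risk-neutral consol price has the geometric-series closed form p = βζ/(1−β), so a numerical pricing routine can be checked against it. A minimal sketch, where `consol_price_iterative` is a hypothetical stand-in rather than the asset pricing module's actual method:

```python
def consol_price_iterative(zeta, beta, tol=1e-10):
    """Price a consol paying `zeta` each period under a constant discount
    factor `beta` by iterating the pricing equation p <- beta * (zeta + p)."""
    p = 0.0
    while True:
        p_new = beta * (zeta + p)
        if abs(p_new - p) < tol:
            return p_new
        p = p_new


beta, zeta = 0.95, 1.0
p_numeric = consol_price_iterative(zeta, beta)
p_closed = beta * zeta / (1 - beta)  # geometric series: sum of beta^t * zeta
assert abs(p_numeric - p_closed) < 1e-8
```

The same idea extends to the other methods: pick parameters degenerate enough that the answer is a hand-derivable closed form, then assert the numerical routine reproduces it.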

sanguineturtle (Contributor) commented

#45 (test branch) merged. I have one more item before this issue is closed.

For the `examples/` folder, which contains the website code, I am looking through the files to see whether a few tests should be written to keep the website examples consistent. Any thoughts on the following:

  1. plot_directives: these maintain an explicit link between a plot's Python file and the graphic embedded in the Sphinx output, so they probably don't need testing; any change would be captured as a change to the underlying `.py` file.
    @jstac do you want to know if the graphics change from what is currently shown? If so, we could use md5 hashes of the generated plots. For example, 3dplot.pdf has checksum fb0eb23d8821237fa519057c5480d88b, but this would flag a change notice even when 3dplot.py is edited for some valid reason (updating colors etc.). The alternative is not pretty (http://matplotlib.org/devel/testing.html)
  2. Given the test coverage of the formal QuantEcon package itself, perhaps a runtime test is all that is really needed to ensure the examples stay current as the QuantEcon API changes.
    @jstac There are some files in the examples folder that contain functions. For example, ar1sim.py has functions proto1, proto2, ols_estimates and ope_estimates. Would you like these tested with the example input written for the website (i.e. using theta=0.8, num_reps=100000, n=1000 and x_obs=proto2(...)) and checking that those functions produce the expected output?

jstac (Contributor) commented Aug 9, 2014

@sanguineturtle Thanks for looking into this. Just a runtime test for examples is completely fine. No need to do anything beyond that. In fact the ar1sim.py file was something I threw together that doesn't actually get used anywhere.

sanguineturtle (Contributor) commented

I have pushed a new tests branch to the repository. I am still trying to figure out the best way to suppress matplotlib figures and capture stderr when each file executes, so the output can be reported back in an assert message. Currently this test runs but doesn't "work", as stderr and stdout are streamed to the terminal.

I have posted a stackoverflow question on this topic: http://stackoverflow.com/questions/25215477/supress-matplotlib-figures-when-running-py-file-via-python-or-ipython-terminal

Any thoughts would be most welcome.
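One way to do this on Python 3.5+ (a sketch of the general approach, not the solution the thread settled on) is to execute each example with `runpy` while `contextlib` temporarily redirects the standard streams; for the figure half of the problem, selecting the non-interactive Agg backend before pyplot is imported keeps `plt.show()` from opening windows:

```python
import contextlib
import io
import runpy

# For the matplotlib half: select the non-interactive Agg backend
# *before* pyplot is first imported, e.g.
#   import matplotlib
#   matplotlib.use("Agg")
# so plt.show() in the examples draws nothing on screen.


def run_example(path):
    """Execute a script file, capturing stdout and stderr instead of
    letting them stream to the terminal; the captured text can then be
    reported back in an assert message."""
    out, err = io.StringIO(), io.StringIO()
    with contextlib.redirect_stdout(out), contextlib.redirect_stderr(err):
        runpy.run_path(path, run_name="__run_example__")
    return out.getvalue(), err.getvalue()
```

A runner would then loop over the example files and do something like `out, err = run_example(path); assert err == "", err`.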

jstac (Contributor) commented Oct 8, 2014

@sanguineturtle Can we close this?

sglyon (Member, Author) commented Jan 28, 2015

Ping

sglyon (Member, Author) commented Jan 28, 2015

@mmcky is this ready to close?

mmcky (Contributor) commented Jan 29, 2015

Closing as we have some basic test infrastructure in place to run the examples.
We can improve this later on, possibly with: https://github.com/paulgb/runipy

@mmcky mmcky closed this as completed Jan 29, 2015