
Speed up caching logic #1229

Merged · 4 commits · Jan 3, 2019

Conversation

gaffney2010
Member

The caching is inefficient when prob_end is used. The current paradigm caches non-stochastic matches by players and number of turns. But this ignores the fact that a shorter match can be read from the result of a longer match. For example, if we have a 20-turn match cached and we wish to run a 19-turn match, we could simply read the first 19 moves from the 20-turn match, but the current paradigm re-runs the match entirely because no 19-turn match has been cached yet.

I changed the code to:

  1. Cache on player-pairs only.
  2. Only re-run a match if the cached match is shorter than the desired match; in that case, overwrite the cached value.
  3. Slice the cached value after reading, to get the first n moves, as desired.

I ran the code below and found an 80% speed-up. I don't think this adds complexity, and keying on player-pairs only is simpler and more natural.
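The lookup described above can be sketched as follows. This is an illustrative sketch only, not the actual Axelrod internals: the cache type, `cached_play`, and `run_match` names are hypothetical.

```python
# Sketch of the caching scheme described in this PR (hypothetical names,
# not the real Axelrod cache API).

def cached_play(cache, pair, turns, run_match):
    """Return the first `turns` joint moves for `pair`.

    The cache is keyed on the player-pair only. A match is re-run only
    when the cached match is shorter than requested, in which case the
    longer result overwrites the cached value.
    """
    cached = cache.get(pair, [])
    if len(cached) < turns:
        # Cache miss, or cached match too short: run and overwrite.
        cached = run_match(pair, turns)
        cache[pair] = cached
    # Slice so a shorter match is served from a longer cached one.
    return cached[:turns]


# Usage: a fake match runner that records how often it is called.
calls = []

def fake_run(pair, turns):
    calls.append(turns)
    return [("C", "C")] * turns

cache = {}
cached_play(cache, ("A", "B"), 20, fake_run)           # runs the match
result = cached_play(cache, ("A", "B"), 19, fake_run)  # served from cache
```

Under this scheme the second, shorter request triggers no re-run: the 19 moves are sliced from the cached 20-turn result.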

Experimental code:

import axelrod as axl
import time

MATCH_SIMS = 100

player1, player2 = axl.Cooperator(), axl.DBS()
players = (player1, player2)
match = axl.Match(players, prob_end=0.01)

start_time = time.time()
prev_time = time.time()
for i in range(MATCH_SIMS):
  match.play()
  print("Running simulation {} of {}.  Played {} rounds.  It took {} secs.".format(
      i + 1, MATCH_SIMS, len(match.result), time.time() - prev_time))
  prev_time = time.time()
end_time = time.time()
print("================")
print("Total time for {} simulations: {} secs.".format(MATCH_SIMS, end_time - start_time))

@marcharper
Member

Looks really nice to me. Can you add a test that ensures that when a longer run happens that the cache is overwritten with the longer run, similar to the newly added test?

@gaffney2010
Member Author

> Looks really nice to me. Can you add a test that ensures that when a longer run happens that the cache is overwritten with the longer run, similar to the newly added test?

Added.

@gaffney2010 gaffney2010 reopened this Dec 27, 2018
@drvinceknight
Member

Nice! Thanks @gaffney2010 👍

@drvinceknight drvinceknight merged commit 3315c9f into Axelrod-Python:master Jan 3, 2019