/pin-clusters with redis (plus /heatmap) #574
Merged
Dockerfile (new file, +3 lines):

```dockerfile
FROM redis
COPY redis.conf /usr/local/etc/redis/redis.conf
CMD [ "redis-server", "/usr/local/etc/redis/redis.conf" ]
```
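For context, a minimal docker-compose service wiring this image in might look like the sketch below. The service name `redis` matches the hostname the cache utility connects to, and the container name `311-redis` matches the `docker exec` example in redis.conf; the build path is an assumption, since the compose file itself is not part of this diff:

```yaml
services:
  redis:
    build: ./redis          # assumed directory holding the Dockerfile above
    container_name: 311-redis
    ports:
      - "6379:6379"         # default Redis port, discussed in the review thread
```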
redis.conf (new file, +43 lines):

```conf
# full example config here:
# http://download.redis.io/redis-stable/redis.conf

# discussion of memory policies here:
# https://redis.io/topics/lru-cache

# how to check that this config is working inside docker:
# 1. login to container: `docker exec -it 311-redis /bin/bash`
# 2. start the redis cli: `redis-cli`
# 3. check the config: `config get maxmemory-policy`

# MAXMEMORY POLICY: how Redis will select what to remove when maxmemory
# is reached. You can select one from the following behaviors:
#
# volatile-lru -> Evict using approximated LRU, only keys with an expire set.
# allkeys-lru -> Evict any key using approximated LRU.
# volatile-lfu -> Evict using approximated LFU, only keys with an expire set.
# allkeys-lfu -> Evict any key using approximated LFU.
# volatile-random -> Remove a random key having an expire set.
# allkeys-random -> Remove a random key, any key.
# volatile-ttl -> Remove the key with the nearest expire time (minor TTL)
# noeviction -> Don't evict anything, just return an error on write operations.
#
# LRU means Least Recently Used
# LFU means Least Frequently Used
#
# Both LRU, LFU and volatile-ttl are implemented using approximated
# randomized algorithms.
#
# Note: with any of the above policies, Redis will return an error on write
# operations, when there are no suitable keys for eviction.
#
# At the date of writing these commands are: set setnx setex append
# incr decr rpush lpush rpushx lpushx linsert lset rpoplpush sadd
# sinter sinterstore sunion sunionstore sdiff sdiffstore zadd zincrby
# zunionstore zinterstore hset hsetnx hmset hincrby incrby decrby
# getset mset msetnx exec sort
#
# The default is:
#
# maxmemory-policy noeviction

maxmemory-policy volatile-lfu
```
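The chosen `volatile-lfu` policy means eviction only ever considers keys that carry an expiry, and among those prefers the least frequently accessed. Since every cache entry in this PR is written with `setex` (so it has a TTL), hot filter combinations survive while rarely-requested ones get evicted first. A toy Python sketch of that selection rule, ignoring Redis's randomized sampling approximation:

```python
def pick_eviction_candidate(keys):
    """Toy model of volatile-lfu. `keys` maps key name to a dict with
    'ttl' (None when the key has no expiry) and 'hits' (access count).
    Only keys with a TTL are eligible; the least-frequently-used wins."""
    eligible = {k: v for k, v in keys.items() if v['ttl'] is not None}
    if not eligible:
        return None  # nothing evictable: real Redis errors on writes here
    return min(eligible, key=lambda k: eligible[k]['hits'])


keys = {
    'filters:abc:pins': {'ttl': 3600, 'hits': 12},
    'filters:def:pins': {'ttl': 3600, 'hits': 2},
    'session:xyz':      {'ttl': None, 'hits': 1},  # no expiry: never evicted
}
print(pick_eviction_candidate(keys))  # least-used key that has a TTL
```

This is only an intuition aid; the real algorithm samples a handful of keys and uses a decaying frequency counter rather than an exact minimum.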
Two files in this PR are large diffs and are not rendered by default, including:
server/performanceStatistics/pins-comparison/pins-comparison.jtl (181 additions & 0 deletions)
Heatmap service (new file, +36 lines):

```python
import pandas as pd
import hashlib
import json
from utils.redis import cache
from .dataService import DataService


class HeatmapService(object):
    def __init__(self, config=None):
        self.config = config

    def pins_key(self, filters):
        filters_json = json.dumps(filters, sort_keys=True).encode('utf-8')
        hashed_json = hashlib.md5(filters_json).hexdigest()
        return 'filters:{}:pins'.format(hashed_json)

    async def get_heatmap(self, filters):
        key = self.pins_key(filters)
        pins = cache.get(key)

        fields = ['latitude', 'longitude']
        if pins is None:
            dataAccess = DataService(self.config)

            filters = dataAccess.standardFilters(
                filters['startDate'],
                filters['endDate'],
                filters['requestTypes'],
                filters['ncList'])

            pins = dataAccess.query(fields, filters)
            pins = pd.DataFrame(pins, columns=fields)
        else:
            pins = pins[fields]

        return pins.to_numpy()
```
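The key derivation above relies on `json.dumps(..., sort_keys=True)` to make the MD5 hash stable no matter what order the filter dict was built in. A standalone check of that property, with the key format copied from the service and illustrative filter values that are not taken from the diff:

```python
import hashlib
import json


def pins_key(filters):
    # Sorting keys canonicalizes the JSON so equal filter dicts
    # always hash to the same cache key.
    filters_json = json.dumps(filters, sort_keys=True).encode('utf-8')
    return 'filters:{}:pins'.format(hashlib.md5(filters_json).hexdigest())


a = {'startDate': '01/01/2020', 'endDate': '06/01/2020'}
b = {'endDate': '06/01/2020', 'startDate': '01/01/2020'}  # same filters, new order
assert pins_key(a) == pins_key(b)
print(pins_key(a))
```

Note that `HeatmapService.pins_key` and `PinClusterService.pins_key` produce the identical key format, which is what lets `get_heatmap` read pin DataFrames cached by the pin-cluster endpoint and slice out just the latitude/longitude columns.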
Pin cluster service (new file, +81 lines):

```python
import pysupercluster
import pandas as pd
import hashlib
import json
from utils.redis import cache
from .dataService import DataService


class PinClusterService(object):
    def __init__(self, config=None):
        self.config = config

    def pins_key(self, filters):
        filters_json = json.dumps(filters, sort_keys=True).encode('utf-8')
        hashed_json = hashlib.md5(filters_json).hexdigest()
        return 'filters:{}:pins'.format(hashed_json)

    def get_pins(self, filters):
        key = self.pins_key(filters)
        pins = cache.get(key)

        if pins is None:
            dataAccess = DataService(self.config)

            fields = [
                'srnumber',
                'requesttype',
                'latitude',
                'longitude']

            filters = dataAccess.standardFilters(
                filters['startDate'],
                filters['endDate'],
                filters['requestTypes'],
                filters['ncList'])

            pins = dataAccess.query(fields, filters)
            pins = pd.DataFrame(pins, columns=fields)

            cache.set(key, pins)

        return pins

    def pin_clusters(self, pins, zoom, bounds, options={}):
        if len(pins) == 0:
            return []

        min_zoom = options.get('min_zoom', 0)
        max_zoom = options.get('max_zoom', 17)
        radius = options.get('radius', 200)
        extent = options.get('extent', 512)

        index = pysupercluster.SuperCluster(
            pins[['longitude', 'latitude']].to_numpy(),
            min_zoom=min_zoom,
            max_zoom=max_zoom,
            radius=radius,
            extent=extent)

        north = bounds.get('north', 90)
        south = bounds.get('south', -90)
        west = bounds.get('west', -180)
        east = bounds.get('east', 180)

        clusters = index.getClusters(
            top_left=(west, north),
            bottom_right=(east, south),
            zoom=zoom)

        for cluster in clusters:
            if cluster['count'] == 1:
                pin = pins.iloc[cluster['id']]
                cluster['srnumber'] = pin['srnumber']
                cluster['requesttype'] = pin['requesttype']
                del cluster['expansion_zoom']

        return clusters

    async def get_pin_clusters(self, filters, zoom, bounds, options):
        pins = self.get_pins(filters)
        return self.pin_clusters(pins, zoom, bounds, options)
```
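The loop at the end of `pin_clusters` rewrites size-1 "clusters" so the client gets the underlying pin's identifying fields instead of a cluster it cannot expand. A pure-Python sketch of that post-processing step, using plain dicts as stand-ins for the pysupercluster output and the pandas rows (the shapes are assumed from the code above, not exact library types):

```python
def postprocess(clusters, pins):
    """For singleton clusters, copy srnumber/requesttype from the source
    pin and drop expansion_zoom, mirroring the loop in pin_clusters.
    `pins` is a list of dicts standing in for DataFrame rows."""
    for cluster in clusters:
        if cluster['count'] == 1:
            pin = pins[cluster['id']]          # 'id' indexes the source pin
            cluster['srnumber'] = pin['srnumber']
            cluster['requesttype'] = pin['requesttype']
            del cluster['expansion_zoom']      # meaningless for a single pin
    return clusters


pins = [{'srnumber': '1-123', 'requesttype': 'Graffiti'}]
clusters = [{'count': 1, 'id': 0, 'expansion_zoom': 12}]
print(postprocess(clusters, pins))
```

One side effect worth noting: singleton entries end up with a different schema than real clusters (`srnumber`/`requesttype` present, `expansion_zoom` absent), so the frontend has to branch on `count`.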
Redis cache utility (new file, +34 lines):

```python
import redis
import pickle
from datetime import timedelta


class RedisCache(object):
    def config(self, config):
        self.enabled = config['ENABLED'] == 'True'
        if self.enabled:
            self.ttl = int(config['TTL_SECONDS'])
            self.r = redis.Redis(host='redis')

    def get(self, key):
        if not self.enabled:
            return None

        value = self.r.get(key)
        if value is None:
            return None
        else:
            return pickle.loads(value)

    def set(self, key, value):
        if not self.enabled:
            return None

        value = pickle.dumps(value)
        try:
            self.r.setex(key, timedelta(seconds=self.ttl), value)
        except Exception as e:
            print(e)


cache = RedisCache()
```
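To exercise the get/set contract without a running Redis server, here is an in-memory stand-in that mimics `RedisCache`'s pickle round-trip, its miss behavior, and its no-op behavior when disabled. This is a test double modeled on the class above, not part of the PR:

```python
import pickle


class FakeCache:
    """In-memory stand-in for RedisCache: pickles on set, unpickles on
    get, and returns None on a miss or when caching is disabled."""
    def __init__(self, enabled=True):
        self.enabled = enabled
        self.store = {}

    def get(self, key):
        if not self.enabled:
            return None
        value = self.store.get(key)
        return None if value is None else pickle.loads(value)

    def set(self, key, value):
        if not self.enabled:
            return None
        self.store[key] = pickle.dumps(value)


cache = FakeCache()
cache.set('filters:abc:pins', [{'latitude': 34.05, 'longitude': -118.24}])
assert cache.get('filters:abc:pins') == [{'latitude': 34.05, 'longitude': -118.24}]
assert cache.get('missing') is None

disabled = FakeCache(enabled=False)
disabled.set('k', 1)
assert disabled.get('k') is None
```

Pickling mirrors what the real class stores in Redis via `setex`, which is why `get_heatmap` can recover a full pandas DataFrame from the cache and slice columns from it.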
Review comment:
Can we default this to false?
Also, is there a URL for redis? I'm guessing it's defaulting to localhost:6379. In prod we will have an environment variable override for this. OOORRRRR we can base `ENABLED` off of "was there a redis url provided or not"; that way we kill two birds with one stone.

Reply:
Yeah, I can change that.
For the URL, right now it's using 'redis' as the host, which I guess is the hostname of the redis service within the docker compose network... if I understand docker correctly. And the port is 6379. I'll figure out how to set this up as an env variable.

Review comment:
Ohhh okay yeah... makes sense.
Within the docker env it'll be exposed as both redis:6379 and localhost:6379, so either way we want to be explicit with how we address redis so we can override it with heroku's fn8934fh3o8g3o3qhg893gg.heroku.com:6379.

Reply:
Ok cool, this is done.