Make MetricsControl the standard across visualizations #4914
Conversation
Force-pushed from e3b8921 to 684d6d4
Awesome! Will take a look at this tomorrow.
superset/viz.py (Outdated)
for mkey in METRIC_KEYS:
    val = fd.get(mkey)
    if val:
        print(mkey, val)
zoops
LGTM - yay, super excited to see this happening. I think referring to metrics at the label level definitely seems better than keeping a mixed list of dicts and strings - glad to see that refactor.
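To illustrate what working with metrics at the label level can look like, here is a minimal sketch; the helper name get_metric_label and the ad-hoc metric dict shape are assumptions for illustration, not the PR's actual code.

```python
def get_metric_label(metric):
    """Return a display label for a metric that may be a plain string
    (a saved metric name) or a dict (an ad-hoc metric definition)."""
    if isinstance(metric, dict):
        # Ad-hoc metrics are assumed here to carry a precomputed 'label' key.
        return metric.get('label')
    # Saved metrics are referenced by name, which doubles as the label.
    return metric


# A mixed list of saved and ad-hoc metrics collapses to plain labels.
mixed = ['count', {'label': 'SUM(num)', 'aggregate': 'SUM', 'column': 'num'}]
print([get_metric_label(m) for m in mixed])  # ['count', 'SUM(num)']
```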
@@ -202,7 +222,7 @@ def query_obj(self):
         """Building a query object"""
         form_data = self.form_data
         gb = form_data.get('groupby') or []
-        metrics = form_data.get('metrics') or []
+        metrics = self.all_metrics or []
I don't like the pattern we currently have of returning empty arrays; I feel like we should throw an exception with proper messaging to let the user know that there are no metrics currently selected.
For instance, if len(self.all_metrics) == 0 (or some similar check), we should raise a SupersetException.
@GabeLoins let me know what you think
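A minimal sketch of the kind of guard being suggested, assuming SupersetException is importable from superset.exceptions; the helper name check_metrics is hypothetical.

```python
from superset.exceptions import SupersetException  # assumed import path


def check_metrics(all_metrics):
    """Raise early, with an actionable message, when a viz is about to
    build a query with no metrics selected."""
    if not all_metrics:
        raise SupersetException(
            'No metrics selected. Please select at least one metric '
            'before running the query.')
    return all_metrics
```

Inside query_obj this would replace the silent fallback, e.g. metrics = check_metrics(self.all_metrics).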
The UI does a pretty good job right now of informing users that they don't have a metric selected and preventing them from issuing queries. I'm not sure in what situations a user could submit a query from Explore without a metric selected, but I'm all for informative error messages. I still think it's good to have an idea of when this could happen so the error message offers the right information. Is this for when there's a bug in the UI? Or are there cases where the UI is working and someone can still press Run Query with no metric selected?
cc @hughhhh @mistercrunch ^
It's a bit complex here, as many controls may use MetricControl (some with non-null validators, some without) and we don't want to duplicate logic. In the context above, metrics may exist as a key or not exist, and may hold None, [], or an actual list of values or dicts. The first step here is to remove "None or missing key" from the list of possibilities, since both should be treated the same.
There's clearly much that could be done with a stronger contract between the backend and the frontend.
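A rough sketch of the normalization being described, where a missing key, an explicit None, and an empty list are all treated as "nothing selected"; the function name and the example keys are placeholders, not the PR's actual code.

```python
def collect_metrics(form_data, metric_keys):
    """Gather every metric-like value in form_data into one flat list,
    treating a missing key, None, and [] identically."""
    collected = []
    for key in metric_keys:
        val = form_data.get(key)  # a missing key and an explicit None both yield None
        if not val:
            continue  # nothing selected for this control
        # Single-metric controls hold one value, multi-metric controls a list.
        collected.extend(val if isinstance(val, list) else [val])
    return collected


# Missing key, None, and [] all contribute nothing; only 'metrics' survives.
fd = {'metrics': ['count'], 'secondary_metric': None, 'percent_metrics': []}
print(collect_metrics(fd, ['metric', 'metrics', 'percent_metrics', 'secondary_metric']))
# -> ['count']
```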
superset/viz.py (Outdated)
@@ -43,6 +43,11 @@
 config = app.config
 stats_logger = config.get('STATS_LOGGER')

+METRIC_KEYS = {
nit: [] over {}
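For context on the nit: the new constant is declared with a set literal, and the suggestion is to use a list instead, which keeps a stable iteration order and reads as the fixed sequence of form_data keys it represents. The keys shown below are illustrative, not the PR's exact contents.

```python
# As added in the diff (set literal):
METRIC_KEYS = {'metric', 'metrics', 'percent_metrics'}

# As suggested (list literal): same membership semantics for this usage,
# but with deterministic iteration order.
METRIC_KEYS = ['metric', 'metrics', 'percent_metrics']
```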
Codecov Report

@@            Coverage Diff             @@
##           master    #4914      +/-   ##
==========================================
+ Coverage    77.1%   77.34%   +0.23%
==========================================
  Files          44       44
  Lines        8644     8664      +20
==========================================
+ Hits         6665     6701      +36
+ Misses       1979     1963      -16

Continue to review full report at Codecov.
Addressed comments, merging
* [WiP] make MetricsControl the standard across visualizations: This spreads MetricsControl across visualizations.
* Addressing comments
* Fix deepcopy issue using shallow copy
* Fix tests

(cherry picked from commit 7b427d7)
@GabeLoins this spreads MetricsControl across visualizations.