[Bug] Influx integration sends bytes sent rather than bandwidth measurement #82
SOB, I knew I forgot about that somewhere.
I see the same values in MariaDB and InfluxDB. If I multiply them by 8, I get the bandwidth in Mbit/s.
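The ×8 factor above is just the bytes-to-bits conversion. A minimal sketch of what the comment describes (the function name and the sample value are illustrative, not from the project's code):

```python
def stored_value_to_mbit_per_sec(bandwidth_bytes_per_sec: float) -> float:
    """Convert a bandwidth value stored in bytes/s into Mbit/s.

    8 bits per byte, then divide by 1e6 to go from bit/s to Mbit/s.
    """
    return bandwidth_bytes_per_sec * 8 / 1_000_000


# Example: a stored value of 2,681,250 bytes/s corresponds to 21.45 Mbit/s.
print(stored_value_to_mbit_per_sec(2_681_250))
```

This is the workaround users were applying in Grafana panels; the issue is about making the stored representation consistent so it is no longer needed.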
I think the point is more that I should be reporting it consistently. If the panel reports bits, InfluxDB should get bits.
I agree... let's stick to one representation; it can then be converted as needed at the UI level. I mean: the storage DB should contain the same value and representation as speedtest's JSON, and it should be the same in Influx too, right?
I was trying to figure out why my gauges in Grafana were off and ran across @alexdelprete's comment. Having it consistent would be a big plus.
…en#82 Added as a new value rather than changing existing download/upload so current workarounds do not break.
I configured the influxdb2 integration and checked the data after the first speedtest. These are the available fields I could choose from:
I picked download/upload, but I noticed very low values in Influx:
so I checked the mariadb data (record #7):
So the data was ok. But the reported speed of that test was this:
I didn't understand at first... then I thought maybe those download/upload columns in the DB represent the amount of data sent/received, and that the actual speed values come from the JSON in the data column. The problem is that the data column is not exported to Influx.
So, to complement my OP: I think the DL/UL speed values should also be in their own columns. The quantity of data sent/received is probably not that important, so you could actually reuse those columns. :)
Originally posted by @alexdelprete in #48 (comment)
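The distinction the OP stumbled on can be sketched as follows. This assumes the `data` column holds the raw speedtest JSON, where (as in the Ookla CLI format) `download.bytes` is the amount of data transferred and `download.bandwidth` is the speed in bytes/s — the sample record here is illustrative, not taken from the database:

```python
import json

# Hypothetical contents of the `data` column for one test record:
# `bytes` = data transferred, `bandwidth` = measured speed in bytes/s.
data_column = '{"download": {"bandwidth": 2681250, "bytes": 30000000, "elapsed": 11190}}'

record = json.loads(data_column)

# Exporting `bytes` would send the transfer volume, not the speed.
transferred_bytes = record["download"]["bytes"]

# The value the panel actually shows is the bandwidth, converted to Mbit/s.
speed_mbit = record["download"]["bandwidth"] * 8 / 1_000_000
print(transferred_bytes, speed_mbit)
```

Exporting the bandwidth field (in a consistent unit) to Influx, as the fix in this issue does, removes the need to dig the speed out of the JSON blob.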