diff --git a/README.md b/README.md index 270c8e3..4e166f2 100644 --- a/README.md +++ b/README.md @@ -1,19 +1,33 @@ ![GitHub release (with filter)](https://img.shields.io/github/v/release/vingerha/gazpar_2_mqtt) ## Introduction / status - -Reworked from the great repo by yukulehe/gazpar2mqtt (who also provided a large part of the docu), now that GRDF is again without Captcha. -Main differences are in the login method, now using virtual browser (old method still does not work), and allowing to export to HA Long Term Statistics -- It is working as a container (tested by 3 people) on collecting data in SQLite, MQTT, InfluxDb, Home Assistant sensor and HA energy-dashboard -- It is working as an add on (tested by myself and 1 other person) -- no verification if the now 2+-year old code from yukulehe/gazpar2mqtt is still valid **in its entirety**, bits and pieces may not work perfectly anylonger -- Not yet tested - - Grafana dashboard template - - Cost calculation from prices file +Reworked from the great repo by yukulehe/gazpar2mqtt. Main differences are in the login method, now using a virtual browser, and allowing export to HA Long Term Statistics +- Can be installed as an addon and as a container; collects data into SQLite, MQTT, InfluxDb, Home Assistant sensors and the HA energy-dashboard (LTS) +- No 100% verification that the now 2+ year old source from yukulehe/gazpar2mqtt is still valid **in its entirety**; bits and pieces may not work perfectly any longer +- Not yet tested: Grafana dashboard template, Cost calculation from prices file For usage and installation etc. 
see [DOCUMENTATION](https://github.com/vingerha/gazpar_2_mqtt/wiki) ## Changelogs : +- 0.8.0 + Extending LTS sensors with kWh and price + - sensor.[device]_[pce-alias]_consumption_stat : for daily figures in m3 + - sensor.[device]_[pce-alias]_consumption_kwh_stat : for daily figures in kWh + - sensor.[device]_[pce-alias]_consumption_cost_stat : daily cost + - sensor.[device]_[pce-alias]_consumption_pub_stat : for periodic figures in m3 + - sensor.[device]_[pce-alias]_consumption_kwh_pub_stat : for periodic figures in kWh + - sensor.[device]_[pce-alias]_consumption_cost_pub_stat : periodic cost +- 0.7.0 + The LTS sensor name can no longer be chosen and is fixed to + - sensor.[device]_[pce-alias]_consumption_stat : for daily figures + - sensor.[device]_[pce-alias]_consumption_pub_stat : for periodically 'published' figures + Reason: previously the LTS sensors were added without a regular sensor, which made their use impossible for e.g. apexcharts, which uses the regular sensor name also for statistics. 
+ These 2 sensors will appear both in HA and in HA statistics + +- 0.6.5 + - Allow to select a date from which to collect data from GRDF (max 3y back in time) + - Allow import of published measures into Long Term Statistics + - Allow to delete Long Term Statistics for all PCE - 0.6.0 - Fix issue with double naming in the HA sensor - Fix issue with incorrect device_classes for the sensors diff --git a/VERSION b/VERSION index a918a2a..a3df0a6 100644 --- a/VERSION +++ b/VERSION @@ -1 +1 @@ -0.6.0 +0.8.0 diff --git a/addon/CHANGELOG.md b/addon/CHANGELOG.md index a7165d1..d581368 100644 --- a/addon/CHANGELOG.md +++ b/addon/CHANGELOG.md @@ -1,3 +1,18 @@ +## 0.8.0 (12-05-2024) +Extending LTS sensors with kWh and price +sensor.[device]_[pce-alias]_consumption_stat : for daily figures in m3 +sensor.[device]_[pce-alias]_consumption_kwh_stat : for daily figures in kWh +sensor.[device]_[pce-alias]_consumption_cost_stat : daily cost +sensor.[device]_[pce-alias]_consumption_pub_stat : for periodic figures in m3 +sensor.[device]_[pce-alias]_consumption_kwh_pub_stat : for periodic figures in kWh +sensor.[device]_[pce-alias]_consumption_cost_pub_stat : periodic cost + +## 0.7.0 (10-05-2024) +The LTS sensor name can no longer be chosen and is fixed to : + +sensor.[device]_[pce-alias]_consumption_stat : for daily figures +sensor.[device]_[pce-alias]_consumption_pub_stat : for periodically 'published' figures + ## 0.6.5 (09-05-2024) Allow import of published measures into Long Term Statistics diff --git a/addon/Dockerfile b/addon/Dockerfile index df3f19c..e48cef2 100644 --- a/addon/Dockerfile +++ b/addon/Dockerfile @@ -27,7 +27,7 @@ CMD ["python3", "app/gazpar2mqtt.py"] ############ LABEL \ - io.hass.version="0.6.0" \ + io.hass.version="0.8.0" \ io.hass.type="addon" \ io.hass.arch="armv7|amd64|arm64" diff --git a/addon/README.md b/addon/README.md index 16c1b74..ca212d8 100644 --- a/addon/README.md +++ b/addon/README.md @@ -2,11 +2,10 @@ ## Introduction / status -Reworked from the 
great repo by yukulehe/gazpar2mqtt (who also provided a large part of the docu). -Main differences are in the login method, now using virtual browser (old method still does not work), and allowing to export to HA Long Term Statistics +Reworked from the great repo by yukulehe/gazpar2mqtt. +Main differences are in the login method and allowing to export to HA Long Term Statistics, values and prices - no verification if the now 2+-year old code from yukulehe/gazpar2mqtt is still valid **in its entirety**, bits and pieces may not work perfectly anylonger - Not yet tested - Grafana dashboard template - - Cost calculation from prices file For usage and installation etc. see [DOCUMENTATION](https://github.com/vingerha/gazpar_2_mqtt/wiki) \ No newline at end of file diff --git a/addon/config.yaml b/addon/config.yaml index 66764a7..ad36c96 100644 --- a/addon/config.yaml +++ b/addon/config.yaml @@ -1,6 +1,6 @@ name: "Gazpar 2 MQTT" description: "Extracts GRDF data into MQTT a.o." -version: 0.6.5 +version: 0.8.0 slug: "gazpar_2_mqtt" init: false homeassistant_api: true diff --git a/app/database.py b/app/database.py index ce20811..14522ed 100644 --- a/app/database.py +++ b/app/database.py @@ -77,11 +77,15 @@ def init(self,g2mVersion,dbVersion,influxVersion): pce TEXT NOT NULL , type TEXT NOT NULL , date TEXT NOT NULL + , periodStart TEXT NOT NULL + , periodEnd TEXT NOT NULL , start_index INTEGER NOT NULL , end_index INTEGER NOT NULL , volume INTEGER NOT NULL , volumeGrossConsumed REAL NOT NULL , energy INTEGER NOT NULL + , energyGrossConsumed REAL NOT NULL + , price REAL NOT NULL , conversion REAL)''') self.cur.execute('''CREATE UNIQUE INDEX IF NOT EXISTS idx_measures_measure ON measures (pce,type,date)''') @@ -313,12 +317,16 @@ def __init__(self,pce,result): self.pce = pce self.type = result[1] self.date = _convertDate(result[2]) - self.startIndex = result[3] - self.endIndex = result[4] - self.volume = result[5] - self.volumeGross = result[6] - self.energy = result[7] - 
self.conversionFactor = result[8] + self.periodStart = _convertDateTime(result[3]) + self.periodEnd = _convertDateTime(result[4]) + self.startIndex = result[5] + self.endIndex = result[6] + self.volume = result[7] + self.volumeGross = result[8] + self.energy = result[9] + self.energyGross = result[10] + self.price = result[11] + self.conversionFactor = result[12] # Class Measure class Threshold(): diff --git a/app/gazpar.py b/app/gazpar.py index 64f13ae..8667bd6 100644 --- a/app/gazpar.py +++ b/app/gazpar.py @@ -332,12 +332,15 @@ def login(self,username,password, download_folder, screenshot: bool = False, ver logging.debug("Using Button 2: %s", re_btn) re_btn.click() time.sleep(3) + + if screenshot: + self.get_screenshot("04_screenshot_after_password_button.png") self.__browser.switch_to.default_content() time.sleep(3) if screenshot: - self.get_screenshot("03a_screenshot_after_switch_to_default_content.png") + self.get_screenshot("05_screenshot_after_switch_to_default_content.png") isLoggedIn = True @@ -353,7 +356,7 @@ def login(self,username,password, download_folder, screenshot: bool = False, ver logging.debug("Using deny_btn: %s", deny_btn) if screenshot: - self.get_screenshot("05_screenshot_after_deny_button.png") + self.get_screenshot("06_screenshot_after_deny_button.png") try: self.click_in_view( @@ -371,7 +374,7 @@ def login(self,username,password, download_folder, screenshot: bool = False, ver pass if screenshot: - self.get_screenshot("06_screenshot_after_connexion_path.png") + self.get_screenshot("07_screenshot_after_connexion_path.png") # When everything is ok @@ -509,11 +512,7 @@ def getPceMeasures(self,pce, startDate, endDate, type): time.sleep(5) measureList = resp - - # Update PCE range of date - #pce.dailyMeasureStart = startDate - #pce.dailyMeasureEnd = endDate - + if measureList: for measure in measureList[pce.pceId]["releves"]: @@ -1053,7 +1052,9 @@ def __init__(self, pce, measure,type): self.volumeGross = None self.volumeInitial = None 
self.energy = None + self.energyGross = 0 self.temperature = None + self.price = 0 self.conversionFactor = None self.pce = None self.isDeltaIndex = False @@ -1073,6 +1074,8 @@ def __init__(self, pce, measure,type): if measure["energieConsomme"]: self.energy = int(measure["energieConsomme"]) if measure["temperature"]: self.temperature = float(measure["temperature"]) if measure["coeffConversion"]: self.conversionFactor = float(measure["coeffConversion"]) + if measure["coeffConversion"] and measure["volumeBrutConsomme"]: + self.energyGross = float(measure["volumeBrutConsomme"]) * float(measure["coeffConversion"]) self.pce = pce # Fix informative volume and energy provided when required @@ -1103,9 +1106,9 @@ def store(self,db): dbTable = "consumption_published" if self.isOk() and dbTable: - logging.debug("Store measure type %s, %s, %s, %s, %s m3, %s m3, %s kWh, %s kwh/m3",self.type,str(self.gasDate),str(self.startIndex),str(self.endIndex), str(self.volume), str(self.volumeGross), str(self.energy), str(self.conversionFactor)) - measure_query = f"INSERT OR REPLACE INTO measures VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)" - db.cur.execute(measure_query, [self.pce.pceId, self.type, self.gasDate, self.startIndex, self.endIndex, self.volume, self.volumeGross, self.energy, self.conversionFactor]) + logging.debug("Store measure type %s, %s,%s,%s, %s, %s, %s m3, %s m3, %s kWh, %s kWh, %s EUR, %s kwh/m3",self.type,str(self.gasDate),str(self.startDateTime), str(self.endDateTime),str(self.startIndex),str(self.endIndex), str(self.volume), str(self.volumeGross), str(self.energy), str(self.energyGross), self.price, str(self.conversionFactor)) + measure_query = f"INSERT OR REPLACE INTO measures VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)" + db.cur.execute(measure_query, [self.pce.pceId, self.type, self.gasDate, self.startDateTime, self.endDateTime, self.startIndex, self.endIndex, self.volume, self.volumeGross, self.energy, self.energyGross, self.price, self.conversionFactor]) # Return 
measure measure quality status diff --git a/app/gazpar2mqtt.py b/app/gazpar2mqtt.py index 389c3ff..cd3d951 100644 --- a/app/gazpar2mqtt.py +++ b/app/gazpar2mqtt.py @@ -20,8 +20,8 @@ # gazpar2mqtt constants -G2M_VERSION = '0.6.5' -G2M_DB_VERSION = '0.2.0' +G2M_VERSION = '0.8.0' +G2M_DB_VERSION = '0.4.0' G2M_INFLUXDB_VERSION = '0.1.0' ####################################################################### @@ -220,10 +220,17 @@ def run(myParams): # Set date range - minDateTime = _getYearOfssetDate(datetime.datetime.now(), 3) # GRDF min date is 3 years ago + if not myParams.grdfStartDate: myParams.grdfStartDate = '2020-01-01' # can be omitted if param.py back to default + minDateTimeLimit = _getYearOfssetDate(datetime.datetime.now(), 3) # GRDF min date is 3 years ago + minDateTime = datetime.datetime.strptime(myParams.grdfStartDate, "%Y-%m-%d") startDate = minDateTime.date() endDate = datetime.date.today() - logging.info("Range period : from %s (3 years ago) to %s (today) ...",startDate,endDate) + if minDateTime < minDateTimeLimit: + startDate = minDateTimeLimit.date() + logging.info("Range period : from %s (3 years ago) to %s (today) ...",startDate,endDate) + else: + logging.info("Range period : from %s (self defined) to %s (today) ...",startDate,endDate) + # Get informative measures logging.info("---------------") @@ -350,22 +357,22 @@ def run(myParams): logging.info("No PCE retrieved.") - #################################################################################################################### - # STEP 4 : Prices + + #################################################################################################################### - logging.info("-----------------------------------------------------------") - logging.info("# Load prices #") - logging.info("-----------------------------------------------------------") + + + - # Load data from prices file - logging.info("Loading prices from file %s of directory %s", price.FILE_NAME, myParams.pricePath) - myPrices 
= price.Prices(myParams.pricePath, myParams.priceKwhDefault, myParams.priceFixDefault) - if len(myPrices.pricesList): - logging.info("%s range(s) of prices found !", len(myPrices.pricesList)) + + + + + - #################################################################################################################### + # STEP 5A : Standalone mode #################################################################################################################### if myMqtt.isConnected \ @@ -647,8 +654,22 @@ def run(myParams): ## Other logging.debug("Creation of other entities") myEntity = hass.Entity(myDevice,hass.BINARY,'connectivity','connectivity',hass.CONNECTIVITY_TYPE,None,None).setValue('ON') - - + + if myParams.hassLts: + logging.debug("Creation of dummy LTS sensors") + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_stat', 'consumption stat', hass.GAS_TYPE, hass.ST_TT, + 'm³').setValue('1') + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_kwh_stat', 'consumption kwh stat', hass.GAS_TYPE, hass.ST_TT, + 'kWh').setValue('1') + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_pub_stat', 'consumption pub stat', hass.GAS_TYPE, hass.ST_TT, + 'm³').setValue('1') + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_kwh_pub_stat', 'consumption kwh pub stat', hass.GAS_TYPE, hass.ST_TT, + 'kWh').setValue('1') + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_cost_stat', 'consumption cost stat', hass.COST_TYPE, hass.ST_TT, + 'EUR').setValue('1') + myEntity = hass.Entity(myDevice, hass.SENSOR, 'consumption_cost_pub_stat', 'consumption cost pub stat', hass.COST_TYPE, hass.ST_TT, + 'EUR').setValue('1') + # Publish config, state (when value not none), attributes (when not none) logging.info("Publishing devices...") logging.info("You can retrieve published values subscribing topic %s",myDevice.hass.prefix + "/+/" + myDevice.id + "/#") @@ -662,6 +683,83 @@ def run(myParams): except: logging.error("Home Assistant discovery 
mode : unable to publish value to mqtt broker") + #################################################################################################################### + # STEP 4a : Prices + #################################################################################################################### + + logging.info("-----------------------------------------------------------") + logging.info("# Load prices #") + logging.info("-----------------------------------------------------------") + + # Load data from prices file + logging.info("Loading prices from file %s of directory %s", price.FILE_NAME, myParams.pricePath) + myPrices = price.Prices(myParams.pricePath, myParams.priceKwhDefault, myParams.priceFixDefault) + if len(myPrices.pricesList): + logging.info("%s range(s) of prices found !", len(myPrices.pricesList)) + + #################################################################################################################### + # STEP 4b : Prices + #################################################################################################################### + + logging.info("-----------------------------------------------------------") + logging.info("# Write prices #") + logging.info("-----------------------------------------------------------") + if myGrdf.isConnected \ + and myDb.isConnected() : + + try: + + cursor = myDb.isConnected() + + # Loop on PCEs + for myPce in myGrdf.pceList: + + myPcePrices = myPrices.getPricesByPce(myPce.pceId) + if myPcePrices: + # Loop on prices of the PCE and write the current price + for myPrice in myPcePrices: + #informative / daily values + logging.debug(f"QUERY_I: SELECT pce, type, date, energy, price FROM measures where pce = '{myPce.pceId}' and type = '{gazpar.TYPE_I}' and date between '{myPrice.startDate}' and '{myPrice.endDate}'") + cursor.execute(f"SELECT pce, type, date, energy, price FROM measures where pce = '{myPce.pceId}' and type = '{gazpar.TYPE_I}' and date between '{myPrice.startDate}' and 
'{myPrice.endDate}'") + + data = cursor.fetchall() + + for x in data: + try: + cursor.execute(f"UPDATE measures SET price= ( energyGrossConsumed * {myPrice.kwhPrice} ) + {myPrice.fixPrice}") + myDb.commit() + except Exception as e: + logging.error("Writing Prices error: %s", e) + + #published / periodic values + logging.debug(f"QUERY_P: SELECT pce, type, date, energy, price FROM measures where pce = '{myPce.pceId}' and type = '{gazpar.TYPE_P}' and date between '{myPrice.startDate}' and '{myPrice.endDate}'") + cursor.execute(f"SELECT pce, type, date, energy, price FROM measures where pce = '{myPce.pceId}' and type = '{gazpar.TYPE_P}' and date between '{myPrice.startDate}' and '{myPrice.endDate}'") + data = cursor.fetchall() + + for x in data: + try: + cursor.execute(f"UPDATE measures SET price= ( energyGrossConsumed * {myPrice.kwhPrice} ) + ((JulianDay(periodEnd) - JulianDay(periodStart)) * {myPrice.fixPrice})") + myDb.commit() + except Exception as e: + logging.error("Writing Prices from file, error: %s", e) + + else: + logging.warning("No prices file found, using the default price (%s €/kWh and %s €/day).", myParams.priceKwhDefault, myParams.priceFixDefault) + + cursor.execute(f"SELECT pce, type, date, energy, price FROM measures") + data = cursor.fetchall() + + for x in data: + try: + cursor.execute(f"UPDATE measures SET price= ( energyGrossConsumed * {myParams.priceKwhDefault} ) + {myParams.priceFixDefault} , priceKwh = 0") + myDb.commit() + except Exception as e: + logging.error("Writing Prices with default values, error: %s", e) + + except Exception as e: + logging.error("Home Assistant Prices error: %s", e) + + #################################################################################################################### # STEP 5C : Home Assistant Long Term statistics #################################################################################################################### @@ -689,35 +787,89 @@ def run(myParams): logging.info("Writing 
webservice information of PCE %s alias %s...", myPce.pceId, myPce.alias) stats_array = [] + stats_array_kwh = [] + stats_array_cost = [] stats_array_pub = [] + stats_array_kwh_pub = [] + stats_array_pub_cost = [] + prev_stat_sum = 0 + prev_stat_kwh_sum = 0 + prev_stat_pub_sum = 0 + prev_stat_kwh_pub_sum = 0 + prev_price_sum = 0 + prev_price_pub_sum = 0 for myMeasure in myPce.measureList: date_with_timezone = myMeasure.date.replace(tzinfo=dt.timezone.utc) date_formatted = date_with_timezone.strftime( "%Y-%m-%dT%H:%M:%S%z" ) - stat = { - "start": date_formatted, # formatted date - "state": myMeasure.volumeGross, - "sum": myMeasure.endIndex, - } - # Add the stat to the array - if myMeasure.type == 'informative': + if myMeasure.type == gazpar.TYPE_I : + stat = { + "start": date_formatted, # formatted date + "state": myMeasure.volumeGross, + "sum": prev_stat_sum + myMeasure.volumeGross, + } + stat_kwh = { + "start": date_formatted, # formatted date + "state": myMeasure.energyGross, + "sum": prev_stat_kwh_sum + myMeasure.energyGross, + } + stat_cost = { + "start": date_formatted, # formatted date + "state": myMeasure.price, + "sum": prev_price_sum + myMeasure.price, + } stats_array.append(stat) + stats_array_kwh.append(stat_kwh) + stats_array_cost.append(stat_cost) + prev_stat_sum = prev_stat_sum + myMeasure.volumeGross + prev_stat_kwh_sum = prev_stat_kwh_sum + myMeasure.energyGross + prev_price_sum = prev_price_sum + myMeasure.price else: - stats_array_pub.append(stat) - + stat_pub = { + "start": date_formatted, # formatted date + "state": myMeasure.volumeGross, + "sum": prev_stat_pub_sum + myMeasure.volumeGross, + } + stat_kwh_pub = { + "start": date_formatted, # formatted date + "state": myMeasure.energyGross, + "sum": prev_stat_kwh_pub_sum + myMeasure.energyGross, + } + stat_cost_pub = { + "start": date_formatted, # formatted date + "state": myMeasure.price, + "sum": prev_price_pub_sum + myMeasure.price, + } + stats_array_pub.append(stat_pub) + 
stats_array_kwh_pub.append(stat_kwh_pub) + stats_array_pub_cost.append(stat_cost_pub) + prev_stat_pub_sum = prev_stat_pub_sum + myMeasure.volumeGross + prev_stat_kwh_pub_sum = prev_stat_kwh_pub_sum + myMeasure.energyGross + prev_price_pub_sum = prev_price_pub_sum + myMeasure.price + + + sensor_name = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_stat' + sensor_name_kwh = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_kwh_stat' + sensor_name_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_pub_stat' + sensor_name_kwh_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_kwh_pub_stat' + sensor_name_cost = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_cost_stat' + sensor_name_cost_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_cost_pub_stat' - sensor_name = myParams.hassLtsSensorName + "_" + myPce.pceId - sensor_name_pub = myParams.hassLtsSensorName + "_pub_" + myPce.pceId logging.debug(f"Writing Websocket Home Assistant LTS for PCE: {myPce.pceId}, sensor name: {sensor_name}") - HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name, stats_array) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name, 'm³', stats_array) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_kwh, 'kWh', stats_array_kwh) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_cost, 'EUR', stats_array_cost) + logging.debug(f"Writing Websocket Home Assistant Published LTS for PCE: {myPce.pceId}, sensor name: {sensor_name_pub}") - HomeAssistantWs("import", myPce.pceId, 
myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_pub, stats_array_pub) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_pub, 'm³', stats_array_pub) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_kwh_pub, 'kWh', stats_array_kwh_pub) + HomeAssistantWs("import", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_cost_pub, 'EUR', stats_array_pub_cost) except Exception as e: logging.error("Home Assistant Long Term Statistics : unable to publish LTS to Webservice HA with error: %s", e) logging.error("Retrying with API") - + try: logging.info("-----------------------------------------------------------") logging.info("# Home assistant Long Term Statistics (API) #") @@ -725,14 +877,13 @@ def run(myParams): # Load database in cache myDb.load() - - sensor_name = myParams.hassLtsSensorName data = {} data_pub = {} # Loop on PCEs for myPce in myDb.pceList: logging.info("Writing api information of PCE %s alias %s...", myPce.pceId, myPce.alias) - + sensor_name = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_stat' + sensor_name_pub = 'sensor.' 
+ myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_pub_stat' stats_array = [] stats_array_pub = [] for myMeasure in myPce.measureList: @@ -755,7 +906,7 @@ def run(myParams): "has_mean": False, "has_sum": True, "statistic_id": ( - sensor_name + "_" + myPce.pceId + sensor_name ), "unit_of_measurement": "m³", "source": "recorder", @@ -765,21 +916,21 @@ def run(myParams): "has_mean": False, "has_sum": True, "statistic_id": ( - sensor_name_pub + "_" + myPce.pceId + sensor_name_pub ), "unit_of_measurement": "m³", "source": "recorder", "stats": stats_array_pub, } - logging.debug(f"Writing HA LTS for PCE: {myPce.pceId}, sensor name: {myParams.hassLtsSensorName}, data: {data}") - + logging.debug(f"Writing HA LTS for PCE: {myPce.pceId}, sensor name: {sensor_name}, data: {data}") myGrdf.open_url(myParams.hassHost, myParams.hassStatisticsUri, myParams.hassToken, data) - logging.debug(f"Writing HA LTS Published for PCE: {myPce.pceId}, sensor name: {myParams.hassLtsSensorName}, data: {data_pub}") + logging.debug(f"Writing HA LTS Published for PCE: {myPce.pceId}, sensor name: {sensor_name_pub}, data: {data_pub}") myGrdf.open_url(myParams.hassHost, myParams.hassStatisticsUri, myParams.hassToken, data_pub) except Exception as e: - logging.error("Home Assistant Long Term Statistics : unable to publish LTS to HA with error: %s", e) + logging.error("Home Assistant Long Term Statistics : unable to publish LTS to HA with error: %s", e) + #################################################################################################################### # STEP 5D : Delete Home Assistant Long Term statistics @@ -801,12 +952,20 @@ def run(myParams): } # Loop on PCEs for myPce in myDb.pceList: - sensor_name = myParams.hassLtsSensorName + "_" + myPce.pceId - sensor_name_pub = myParams.hassLtsSensorName + "_pub_" + myPce.pceId + sensor_name = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_stat' + sensor_name_kwh = 'sensor.' 
+ myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_kwh_stat' + sensor_name_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_pub_stat' + sensor_name_kwh_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_kwh_pub_stat' + sensor_name_cost = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_cost_stat' + sensor_name_cost_pub = 'sensor.' + myParams.hassDeviceName + '_' + myPce.alias.lower() + '_consumption_cost_pub_stat' logging.debug(f"Deleting Home Assistant LTS for PCE: {myPce.pceId}, sensor name: {sensor_name}") - HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name, None, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_kwh, None, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_cost, None, None) logging.debug(f"Deleting Home Assistant Published LTS for PCE: {myPce.pceId}, sensor name: {sensor_name_pub}") - HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_pub, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_pub, None, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_kwh_pub, None, None) + HomeAssistantWs("delete", myPce.pceId, myParams.hassHost.split('//')[1], myParams.hassSsl, ssl_data, myParams.hassToken, sensor_name_cost_pub, None, None) except Exception as e: diff --git 
a/app/hass.py b/app/hass.py index 311b929..204014d 100644 --- a/app/hass.py +++ b/app/hass.py @@ -14,6 +14,7 @@ ENERGY_TYPE = "energy" CONNECTIVITY_TYPE = "connectivity" PROBLEM_TYPE = "problem" +COST_TYPE = "monetary" NONE_TYPE = None # Hass state_class diff --git a/app/hass_ws.py b/app/hass_ws.py index 9f792ed..67b6b3c 100644 --- a/app/hass_ws.py +++ b/app/hass_ws.py @@ -6,7 +6,7 @@ import websocket class HomeAssistantWs: - def __init__(self, action, pce, url, ssl, ssl_data, token, sensor, data): + def __init__(self, action, pce, url, ssl, ssl_data, token, sensor, unit, data): self.ws = None self.pce = pce self.url = url @@ -14,6 +14,7 @@ def __init__(self, action, pce, url, ssl, ssl_data, token, sensor, data): self.ssl_data = ssl_data self.token = token self.sensor_name = sensor + self.unit = unit self.data = data self.action = action self.id = 1 @@ -131,11 +132,11 @@ def import_data(self): metadata = { "has_mean": False, "has_sum": True, - "name": "gazpar m3", + "name": self.sensor_name, "statistic_id": ( self.sensor_name ), - "unit_of_measurement": "m³", + "unit_of_measurement": self.unit, "source": "recorder", } diff --git a/app/param.py b/app/param.py index dff22c4..1a3f5da 100644 --- a/app/param.py +++ b/app/param.py @@ -23,6 +23,7 @@ def __init__(self): # Grdf params self.grdfUsername = 'xxx' self.grdfPassword = 'xxx' + self.grdfStartDate = '2020-01-01' # Mqtt params self.mqttHost = '192.168.x.y' @@ -47,7 +48,6 @@ def __init__(self): # Publication in HA long term statistics self.hassLts = False self.hassLtsDelete = False - self.hassLtsSensorName = "sensor.gazpar2mqtt_total" self.hassToken = "" # Long-Lived Access Token self.hassStatisticsUri = "/api/services/recorder/import_statistics" self.hassHost = "http://192.168.x.y:8213" @@ -153,6 +153,8 @@ def getFromOs(self): if "GRDF_USERNAME" in os.environ: self.grdfUsername = os.environ["GRDF_USERNAME"] if "GRDF_PASSWORD" in os.environ: self.grdfPassword = os.environ["GRDF_PASSWORD"] + if "GRDF_STARTDATE" in 
os.environ: self.grdfStartDate = os.environ["GRDF_STARTDATE"] + + if "MQTT_HOST" in os.environ: self.mqttHost = os.environ["MQTT_HOST"] if "MQTT_PORT" in os.environ: self.mqttPort = int(os.environ["MQTT_PORT"]) @@ -199,7 +201,11 @@ def getFromOs(self): if "DB_INIT" in os.environ: self.dbInit = _isItTrue(os.environ["DB_INIT"]) if "DB_PATH" in os.environ: self.dbPath = os.environ["DB_PATH"] - + + if "PRICE_KWH" in os.environ: self.priceKwhDefault = float(os.environ["PRICE_KWH"]) + if "PRICE_FIX" in os.environ: self.priceFixDefault = float(os.environ["PRICE_FIX"]) + if "PRICE_PATH" in os.environ: self.pricePath = os.environ["PRICE_PATH"] + if "DEBUG" in os.environ: self.debug = _isItTrue(os.environ["DEBUG"]) diff --git a/app/price.py b/app/price.py index 760b2d0..df6651f 100644 --- a/app/price.py +++ b/app/price.py @@ -66,12 +66,12 @@ def getPricesByPce(self,pceId): class Price(): def __init__(self,data): self.pceId = str(data[0]) self.startDate = _convertDate(data[1]) self.endDate = _convertDate(data[2]) self.kwhPrice = float(data[3]) self.fixPrice = float(data[4]) - logging.debug("Add price : pce = %s, startDate = %s, endDate = %s, price = %s", self.pceId, self.startDate, + logging.debug("Add price : pce = %s, startDate = %s, endDate = %s, pricekWh = %s, priceFix = %s", self.pceId, self.startDate, self.endDate, self.kwhPrice, self.fixPrice) diff --git a/docker-compose.yml b/docker-compose.yml index 662dddf..ee7a21e 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -33,6 +33,7 @@ services: #HASS_PREFIX: 'homeassistant' #HASS_DEVICE_NAME: 'gazpar' #HASS_LTS: 'False' # use when you want to load into HA Longtermstatistics (with below params) + #HASS_LTS_DELETE: 'False' # use when you want to delete LTS data for all your PCE #HASS_LTS_TOKEN: "" # Long Term Access token generated for accessing below LTS_HOST #HASS_LTS_URI: "/api/services/recorder/import_statistics" # donot change unless needed #HASS_LTS_HOST: "http://192.168.x.y:8123" diff --git a/entrypoint.sh 
b/entrypoint.sh index d4304e0..e361cfc 100644 --- a/entrypoint.sh +++ b/entrypoint.sh @@ -20,10 +20,11 @@ cp /app_temp/price.py "$APP/price.py" cp /app_temp/standalone.py "$APP/standalone.py" cp /app_temp/hass_ws.py "$APP/hass_ws.py" -if [ ! -f "$APP/param.py" ]; then - echo "param.py non existing, copying default to /app..." - cp /app_temp/param.py "$APP/param.py" -fi +##if [ ! -f "$APP/param.py" ]; then +## echo "param.py non existing, copying default to /app..." +## cp /app_temp/param.py "$APP/param.py" +##fi +cp /app_temp/param.py "$APP/param.py" exec "$@" \ No newline at end of file diff --git a/repository.json b/repository.json index 43674c8..72b13ae 100644 --- a/repository.json +++ b/repository.json @@ -1,5 +1,5 @@ { - "name": "Gazpar 2 Mqtt DEV", - "url": "https://github.com/vingerha/gazpar_2_mqtt_dev", + "name": "Gazpar 2 Mqtt", + "url": "https://github.com/vingerha/gazpar_2_mqtt", "maintainer": "vingerha" } \ No newline at end of file
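The LTS import in app/gazpar2mqtt.py (STEP 5C) turns each measure into a statistics row whose `state` is that day's value and whose `sum` is a running total, because Home Assistant long-term statistics expect a monotonically growing `sum`. A minimal standalone sketch of that accumulation pattern; `build_stats` and the `(date, volume)` tuple layout are illustrative, not the repo's actual classes:

```python
import datetime as dt

def build_stats(measures):
    """Build HA long-term-statistics rows: each row carries the day's
    value as `state` and the running total so far as `sum`."""
    stats, running = [], 0.0
    for day, volume in measures:
        # HA expects a timezone-aware ISO-like timestamp for `start`
        start = day.replace(tzinfo=dt.timezone.utc).strftime("%Y-%m-%dT%H:%M:%S%z")
        running += volume
        stats.append({"start": start, "state": volume, "sum": running})
    return stats

rows = build_stats([
    (dt.datetime(2024, 5, 1), 1.2),
    (dt.datetime(2024, 5, 2), 0.8),
])
# rows[-1]["sum"] is the cumulative volume (here 1.2 + 0.8)
```

Keeping the running total per sensor (m³, kWh, cost) is what the `prev_stat_sum` / `prev_stat_kwh_sum` / `prev_price_sum` variables do in the diff.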
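The price-writing step (STEP 4b) computes each measure's cost as gross energy × kWh price plus a fixed standing charge, and must scope each `UPDATE` to the row it belongs to. A self-contained sketch against a trimmed-down version of the `measures` table; the column subset and the tariff values are assumed for illustration only:

```python
import sqlite3

# In-memory stand-in for the app's SQLite database with only the
# columns relevant to the price calculation.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE measures (
    pce TEXT, type TEXT, date TEXT,
    energyGrossConsumed REAL, price REAL)""")
con.execute("INSERT INTO measures VALUES ('PCE1', 'informative', '2024-05-01', 10.0, 0)")

kwh_price, fix_price = 0.08, 0.25  # assumed example tariffs (EUR)
# Parameterized UPDATE scoped by pce/type/date range, so only the
# matching rows are repriced.
con.execute(
    "UPDATE measures SET price = energyGrossConsumed * ? + ? "
    "WHERE pce = ? AND type = ? AND date BETWEEN ? AND ?",
    (kwh_price, fix_price, "PCE1", "informative", "2024-01-01", "2024-12-31"),
)
cost = con.execute("SELECT price FROM measures").fetchone()[0]
# cost ≈ 10.0 * 0.08 + 0.25
```

Using `?` placeholders also avoids quoting issues that f-string interpolation into SQL can introduce.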
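The new `GRDF_STARTDATE` handling parses a user-supplied start date and clamps it to GRDF's roughly three-year history limit before building the collection range. A small sketch of that clamping logic; `resolve_start_date` is a hypothetical helper, not a function in the repo:

```python
import datetime

def resolve_start_date(requested: str, today: datetime.date) -> datetime.date:
    """Clamp a user-supplied 'YYYY-MM-DD' start date to a ~3-year limit.

    Note: replace(year=...) is a naive offset and would raise on Feb 29.
    """
    limit = today.replace(year=today.year - 3)
    wanted = datetime.datetime.strptime(requested, "%Y-%m-%d").date()
    # Never go further back than the provider allows
    return max(wanted, limit)

# A 2020 start date gets clamped to the 3-year limit;
# a recent start date is kept as-is.
clamped = resolve_start_date("2020-01-01", datetime.date(2024, 5, 12))
kept = resolve_start_date("2023-01-01", datetime.date(2024, 5, 12))
```

Logging only one "Range period" message, for whichever branch applied, keeps the log unambiguous about which start date is actually used.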