throughput_report fails for a large number of events (512 or more). It fails in its request to the server because the URL is too long. We possibly need a different type of HTTP call. Sample error below:
Traceback (most recent call last):
  File "analytics.py", line 14, in <module>
    times, done_counts = throughput_report(events, to_state="JOB_FINISHED")
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/analytics.py", line 14, in throughput_report
    times = list(log.timestamp for log in finished_logs)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/_api/query.py", line 114, in __iter__
    self._fetch_cache()
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/_api/query.py", line 128, in _fetch_cache
    offset=self._offset,
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/_api/manager.py", line 204, in _get_list
    count, results = self._fetch_pages(filter_chunk, ordering, limit, offset)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/_api/manager.py", line 170, in _fetch_pages
    response_data = self._client.get(self._api_path, **query_params)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/client/rest_base_client.py", line 65, in get
    return self.request(url, "GET", params=kwargs)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/client/requests_client.py", line 98, in request
    response = self._do_request(absolute_url, http_method, params, json, data)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/client/requests_client.py", line 130, in _do_request
    self._raise_with_explanation(response)
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/balsam/client/requests_client.py", line 150, in _raise_with_explanation
    response.raise_for_status()
  File "/home/csimpson/.local/miniconda-3/latest/lib/python3.7/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 414 Client Error: URI Too Long for url: https://balsam-dev.alcf.anl.gov/events/?job_id=29682869&job_id=29682870&job_id=29682871&job_id=29
The clients are already using this parameter to split up a large request into chunks, make the requests separately, and then stitch the responses back together. We might be running up against a new limit here with larger job_id keys 😄
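For context, the chunked fetch is roughly of this shape (a minimal sketch, not Balsam's actual client code; the endpoint URL, the chunk_size value, and the "results" response key are all assumptions):

```python
import requests

def fetch_events(session: requests.Session, url: str, job_ids: list, chunk_size: int = 100) -> list:
    """Hypothetical helper: split a long job_id filter into chunks,
    issue one GET per chunk, and stitch the responses back together."""
    results = []
    for i in range(0, len(job_ids), chunk_size):
        chunk = job_ids[i : i + chunk_size]
        # A list value produces repeated ?job_id=...&job_id=... query parameters,
        # so a large chunk_size (or longer ids) can still overflow the URL limit.
        resp = session.get(url, params={"job_id": chunk})
        resp.raise_for_status()
        results.extend(resp.json()["results"])  # assumed paginated response shape
    return results
```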
A more flexible solution might be to check for the 414 URI Too Long error specifically, and if that is thrown, retry the request with a smaller filter chunk size. Otherwise, I agree that a different HTTP call that keeps the long list of job_ids out of the URL (using the request body instead) would be reasonable.
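A sketch of that retry idea, assuming the 414 surfaces as a requests.HTTPError that a hypothetical wrapper can catch and respond to by halving the chunk size:

```python
import requests

def fetch_events_with_retry(session: requests.Session, url: str, job_ids: list, chunk_size: int = 512) -> list:
    """Hypothetical wrapper: if the server answers 414 URI Too Long,
    halve the filter chunk size and retry the whole fetch."""
    while True:
        try:
            results = []
            for i in range(0, len(job_ids), chunk_size):
                resp = session.get(url, params={"job_id": job_ids[i : i + chunk_size]})
                resp.raise_for_status()
                results.extend(resp.json()["results"])  # assumed response shape
            return results
        except requests.HTTPError as exc:
            if exc.response is not None and exc.response.status_code == 414 and chunk_size > 1:
                chunk_size //= 2  # URL too long: shrink the chunk and try again
            else:
                raise
```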