
list_inference_endpoints fails with Date parsing error #2671

Closed
james-deee opened this issue Nov 21, 2024 · 5 comments · Fixed by #2683
Labels
bug Something isn't working

Comments

@james-deee

Describe the bug

Pretty simple: list_inference_endpoints is completely broken. It fails with a date-parsing error; see the stack trace in the Logs section below.

Reproduction

list_inference_endpoints(namespace=NAMESPACE, token=HF_API_KEY)

Logs

File "/Users/jd/code/prompt-js/prompt-worker-py/activities/llama_guard.py", line 98, in check_endpoint
    x = list_inference_endpoints(namespace=NAMESPACE, token=HF_API_KEY)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 7716, in list_inference_endpoints
    return [
           ^
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 7717, in <listcomp>
    InferenceEndpoint.from_raw(endpoint, namespace=namespace, token=token)
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/_inference_endpoints.py", line 134, in from_raw
    return cls(raw=raw, namespace=namespace, _token=token, _api=api)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 7, in __init__
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/_inference_endpoints.py", line 138, in __post_init__
    self._populate_from_raw()
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/_inference_endpoints.py", line 394, in _populate_from_raw
    self.created_at = parse_datetime(self.raw["status"]["createdAt"])
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/jd/code/prompt-js/prompt-worker-py/.venv/lib/python3.11/site-packages/huggingface_hub/utils/_datetime.py", line 59, in parse_datetime
    raise ValueError(
ValueError: Cannot parse '2024-11-20T20:01:58Z' as a datetime. Date string is expected to follow '%Y-%m-%dT%H:%M:%S.%fZ' pattern.
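The root cause is visible in the error message: the parser insists on the `%Y-%m-%dT%H:%M:%S.%fZ` pattern, and `%f` makes the fractional-seconds part mandatory. A minimal sketch of the failure using only the standard library (not the huggingface_hub code itself):

```python
from datetime import datetime

# The pattern the library's parser expects, per the ValueError above.
FORMAT = "%Y-%m-%dT%H:%M:%S.%fZ"

# A timestamp WITH fractional seconds parses fine:
ok = datetime.strptime("2024-11-20T20:01:58.000Z", FORMAT)

# The same timestamp WITHOUT fractional seconds, as returned by the API,
# raises ValueError because %f requires a fractional part:
try:
    datetime.strptime("2024-11-20T20:01:58Z", FORMAT)
except ValueError as e:
    print(e)
```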

System info

- huggingface_hub version: 0.26.2
- Platform: macOS-14.4.1-arm64-arm-64bit
- Python version: 3.11.7
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /Users/jd/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: jamie-de
- Configured git credential helpers: osxkeychain
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.2.0
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 11.0.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.25.1
- pydantic: 2.9.2
- aiohttp: 3.10.11
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /Users/jd/.cache/huggingface/hub
- HF_ASSETS_CACHE: /Users/jd/.cache/huggingface/assets
- HF_TOKEN_PATH: /Users/jd/.cache/huggingface/token
- HF_STORED_TOKENS_PATH: /Users/jd/.cache/huggingface/stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
@james-deee added the bug (Something isn't working) label Nov 21, 2024
@hanouticelina (Contributor)

Hi @james-deee, this isn't a bug in the Hub API itself. The API returns timestamps without milliseconds when the fractional part is zero, which breaks the client-side parser in huggingface_hub.list_inference_endpoints, since it expects timestamps with milliseconds. This should be fixed on the API side — pinging @ErikKaum for more info.
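Until a fix lands on either side, a tolerant parser can accept both timestamp variants. The helper below is a workaround sketch (a hypothetical function, not part of huggingface_hub and not the patch later shipped in 0.26.3): it normalizes the trailing `Z` into a `+00:00` offset so `datetime.fromisoformat` handles timestamps with or without fractional seconds.

```python
from datetime import datetime


def parse_datetime_lenient(date_string: str) -> datetime:
    """Parse an ISO-8601 UTC timestamp with or without fractional seconds.

    Hypothetical workaround, not part of huggingface_hub. Replacing the
    trailing 'Z' with '+00:00' lets datetime.fromisoformat accept both
    '2024-11-20T20:01:58Z' and '2024-11-20T20:01:58.123Z', including on
    Python versions before 3.11 where fromisoformat rejects a bare 'Z'.
    """
    if date_string.endswith("Z"):
        date_string = date_string[:-1] + "+00:00"
    return datetime.fromisoformat(date_string)
```

This avoids hard-coding a single strptime pattern, which is the fragility that caused the original error.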

@ErikKaum (Member)

Thanks for pinging, @hanouticelina.
Yeah, this should already have been fixed in the API layer.

Let me try to reproduce this behavior locally 👍

@rogthefrog

This is still happening with API calls today:

data_string = '2024-11-16T00:27:02Z', format = '%Y-%m-%dT%H:%M:%S.%fZ'

@hanouticelina (Contributor)

Hi @rogthefrog, we're taking a look at this bug internally and will get back to you as soon as we have a fix. Sorry you're experiencing this issue!

@hanouticelina (Contributor)

Hi @james-deee and @rogthefrog, we've just released a patch which fixes the timestamp parsing issue you encountered. You can update with:

pip install huggingface_hub==0.26.3

Sorry again for the trouble this caused!
