
Cannot run bigcode example : access to source requires login credentials #592

Closed
noahgift opened this issue Aug 24, 2023 · 4 comments

@noahgift

I love this project so far! Thanks everyone for working on it. I got several models to work but did run into an issue here. Hopefully, someone can help me with it.

When I run `cargo run --example bigcode --release`, I get an error similar to issue 350.

codespace@codespaces-f226cf:/workspaces/rust-candle-demos/candle$ cargo run --example bigcode --release -- --prompt "hello world python function"
warning: some crates are on edition 2021 which defaults to `resolver = "2"`, but virtual workspaces default to `resolver = "1"`
note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest
note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest
    Finished release [optimized] target(s) in 3.68s
     Running `target/release/examples/bigcode --prompt 'hello world python function'`
Error: request error: https://huggingface.co/bigcode/starcoderbase-1b/resolve/main/tokenizer.json: status code 403

Caused by:
    https://huggingface.co/bigcode/starcoderbase-1b/resolve/main/tokenizer.json: status code 403
codespace@codespaces-f226cf:/workspaces/rust-candle-demos/candle$ 

Here are the steps that I tried:

  • Accepted the gated-model form for starcoder ("Gated model: You have been granted access to this model")
  • Set up HUGGING_FACE_HUB_TOKEN as a GitHub Codespaces secret (didn't work)
  • echo $HUGGING_FACE_HUB_TOKEN > $HOME/.cache/huggingface/token (didn't work)
  • pip install huggingface_hub && huggingface-cli login (didn't work)
  • Opened a vanilla Python shell and ran >>> from transformers import pipeline, then pipe = pipeline("text-generation", model="bigcode/starcoder")
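For reference, the token-file step above can be sketched like this (assuming HUGGING_FACE_HUB_TOKEN holds a valid token). One subtle point: `echo` appends a trailing newline, which some clients choke on, so `printf '%s'` is safer:

```shell
# Write the token where the Hub client looks for it by default.
# printf '%s' avoids the trailing newline that echo would add.
mkdir -p "$HOME/.cache/huggingface"
printf '%s' "$HUGGING_FACE_HUB_TOKEN" > "$HOME/.cache/huggingface/token"
```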

FYI, this is my huggingface-cli env output:

codespace@codespaces-f226cf:/workspaces/rust-candle-demos$ huggingface-cli env

Copy-and-paste the text below in your GitHub issue.

- huggingface_hub version: 0.16.4
- Platform: Linux-5.15.0-1041-azure-x86_64-with-glibc2.31
- Python version: 3.10.8
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: /home/codespace/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: noahgift
- Configured git credential helpers: /.codespaces/bin/gitcredential_github.sh
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.0.1
- Jinja2: 3.1.2
- Graphviz: N/A
- Pydot: N/A
- Pillow: 10.0.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.25.2
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HUGGINGFACE_HUB_CACHE: /home/codespace/.cache/huggingface/hub
- HUGGINGFACE_ASSETS_CACHE: /home/codespace/.cache/huggingface/assets
- HF_TOKEN_PATH: /home/codespace/.cache/huggingface/token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
@Narsil
Collaborator

Narsil commented Aug 24, 2023

Try setting HF_HOME?

By default it should indeed be in $HOME/.cache/huggingface/token, as you tried.

https://github.com/huggingface/hf-hub/blob/main/src/lib.rs#L169-L181

I'm guessing something may be wrong with HOME in Codespaces?
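Narsil's suggestion might look like this in the Codespaces shell (a sketch, assuming hf-hub resolves the token at $HF_HOME/token as the linked source suggests):

```shell
# Point hf-hub at an explicit cache directory, independent of how
# Codespaces sets HOME, and put the token file there.
export HF_HOME="$HOME/.cache/huggingface"
mkdir -p "$HF_HOME"
printf '%s' "$HUGGING_FACE_HUB_TOKEN" > "$HF_HOME/token"
```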

@LaurentMazare
Collaborator

Looks like this example uses bigcode/starcoderbase-1b, but the access you were granted is for bigcode/starcoder; maybe request access at https://huggingface.co/bigcode/starcoderbase-1b if you haven't already done so?
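One way to check this directly is to request the gated file with the token attached (a sketch; the URL is the one from the error message above, and Authorization: Bearer is the standard Hub auth header). A 403 here means the account has not been granted access to that specific repo, while 200 or 302 means the token works:

```shell
# Print only the HTTP status code for the gated tokenizer file.
curl -s -o /dev/null -w '%{http_code}\n' \
  -H "Authorization: Bearer $HUGGING_FACE_HUB_TOKEN" \
  https://huggingface.co/bigcode/starcoderbase-1b/resolve/main/tokenizer.json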

@noahgift
Author

You're a miracle worker @LaurentMazare! This was it:

codespace@codespaces-f226cf:/workspaces/rust-candle-demos/candle$ cargo run --example bigcode --release -- --prompt "hello world python function"
warning: some crates are on edition 2021 which defaults to `resolver = "2"`, but virtual workspaces default to `resolver = "1"`
note: to keep the current resolver, specify `workspace.resolver = "1"` in the workspace root's manifest
note: to use the edition 2021 resolver, specify `workspace.resolver = "2"` in the workspace root's manifest
    Finished release [optimized] target(s) in 4.43s
     Running `target/release/examples/bigcode --prompt 'hello world python function'`
tokenizer.json [00:00:00] [███████████████████████████████████████████████████████████████████] 1.96 MiB/1.96 MiB 19.97 MiB/s (0s)
model.safetensors [00:01:26] [████████████████████████████████████████████████████████████████] 4.24 GiB/4.24 GiB 49.91 MiB/s (0s)
retrieved the files in 96.9321653s
Running on CPU, to run on GPU, build this example with `--features cuda`
loaded the model in 3.7673761s
starting the inference loop
hello world python function")

# +
# %%writefile hello_world.py

def hello_world():
    print("hello world python function")

@noahgift
Author

Thanks @Narsil, Codespaces is a bit finicky. The issue turned out to be that I didn't realize I needed to request access to starcoderbase-1b.

Btw, I plan on making a reproducible .devcontainer project, since I teach with this at Duke University and we get free GPUs :). I will share the config when it's done.

Thanks again for the help! HUGE fan of this project.
