
TIS OpenAI frontend, make trust_remote_code configurable #7845

Open
chorus-over-flanger opened this issue Nov 30, 2024 · 0 comments


Is your feature request related to a problem? Please describe.

With the OpenAI frontend of TIS, it is currently not possible to load and serve models from the Hugging Face Hub whose architectures are not built into transformers, such as Bespoke-MiniCheck-7B. This happens because trust_remote_code defaults to False in tokenizer.py, which is a reasonable safety measure, but it is not configurable on a per-model basis.

Describe the solution you'd like

I suggest updating the codebase to make trust_remote_code configurable on a per-model basis.
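One possible shape for this (a sketch only; the helper and environment-variable names below are hypothetical and not part of TIS) is to resolve the flag per model from the environment in tokenizer.py, falling back to the current safe default of False, and forward the result to the tokenizer load:

```python
import os

def trust_remote_code_for(model_name: str, default: bool = False) -> bool:
    """Decide whether to pass trust_remote_code=True for a given model.

    Hypothetical sketch: check a per-model environment variable first
    (model name upper-cased, non-alphanumerics replaced with '_'),
    then a global override, then fall back to the safe default of False.
    """
    def _parse(val: str) -> bool:
        return val.strip().lower() in ("1", "true", "yes", "on")

    per_model_key = "TRUST_REMOTE_CODE_" + "".join(
        c if c.isalnum() else "_" for c in model_name
    ).upper()
    for env_key in (per_model_key, "TOKENIZER_TRUST_REMOTE_CODE"):
        val = os.environ.get(env_key)
        if val is not None:
            return _parse(val)
    return default

# The result would then be forwarded to the tokenizer load in tokenizer.py, e.g.:
# AutoTokenizer.from_pretrained(model, trust_remote_code=trust_remote_code_for(model))
```

This keeps the default opt-out while letting an operator opt in for a single trusted model, e.g. `TRUST_REMOTE_CODE_BESPOKELABS_BESPOKE_MINICHECK_7B=true`, without enabling remote code globally.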

Describe alternatives you've considered

Patched tokenizer.py locally to set the trust_remote_code default to True 🥲

Additional context

No additional context provided
