Is your feature request related to a problem? Please describe.
For the OpenAI frontend of TIS (Triton Inference Server), it is currently not possible to directly load and serve models from huggingface-hub whose architectures are not built into transformers, such as Bespoke-MiniCheck-7B. This happens because in tokenizer.py, trust_remote_code defaults to False, which is a reasonable safety measure but is not configurable on a per-model basis.
Describe the solution you'd like
Update the codebase to make trust_remote_code configurable on a per-model basis.
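A minimal sketch of what per-model configuration could look like, assuming a hypothetical comma-separated allow-list environment variable (the variable name `TRITON_TOKENIZER_TRUST_REMOTE_CODE` and the helper names are illustrative, not part of the current TIS codebase):

```python
import os


def trust_remote_code_for(model_name: str, env=None) -> bool:
    """Return True only if model_name is on an explicit allow-list.

    The allow-list is read from a hypothetical env var holding a
    comma-separated list of model IDs; everything else keeps the
    safe default of False.
    """
    env = os.environ if env is None else env
    allowed = env.get("TRITON_TOKENIZER_TRUST_REMOTE_CODE", "")
    return model_name in {m.strip() for m in allowed.split(",") if m.strip()}


def get_tokenizer(model_name: str):
    # Imported lazily so the allow-list logic is usable without transformers.
    from transformers import AutoTokenizer

    return AutoTokenizer.from_pretrained(
        model_name,
        trust_remote_code=trust_remote_code_for(model_name),
    )
```

This keeps False as the default for all models while letting an operator opt in per model, e.g. `TRITON_TOKENIZER_TRUST_REMOTE_CODE=bespokelabs/Bespoke-MiniCheck-7B`. A per-model config file field would work equally well; the key point is that the opt-in is explicit and scoped to named models.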
Describe alternatives you've considered
Patched tokenizer.py to set the trust_remote_code default to True 🥲
Additional context
No additional context provided