This repository has been archived by the owner on Oct 9, 2024. It is now read-only.
Issues: huggingface/transformers-bloom-inference
#101  ValueError: Couldn't instantiate the backend tokenizer from one of: (opened Jun 30, 2023 by SeekPoint)
#96  The Makefile execution was successful, but there is no response when entering text (opened Jun 2, 2023 by dizhenx)
#95  AttributeError: 'BloomForCausalLM' object has no attribute 'module' (opened Jun 1, 2023 by detectiveJoshua)
#90  Inference (chatbot) does not work as expected on 2 GPUs with the bigscience/bloom-7b1 model (opened May 19, 2023 by dantalyon)
#84  Large batch size causes OOM in bloom-ds-inference.py; how to adjust the max_split_size_mb value (opened Apr 27, 2023 by tohneecao)
#65  The generated results differ when using greedy search during generation (opened Mar 14, 2023 by FrostML)