If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True #131
Can you provide more detailed information so that I can reproduce your error?
python ./tools/Alpaca-LoRA-Serve/app.py
Sorry, this project has stopped maintaining alpaca-serve. You can use the project's own visual scripts instead (generate, interact, chat, ...).
env:
MacBook M2
Python 3.10
conda create -n alpaca-serve python=3.10
conda activate alpaca-serve
cd Alpaca-LoRA-Serve
pip install -r requirements.txt
BASE_URL=decapoda-research/llama-7b-hf
FINETUNED_CKPT_URL=tloen/alpaca-lora-7b
python app.py --base_url $BASE_URL --ft_ckpt_url $FINETUNED_CKPT_URL --port 6006
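For context, transformers raises the "please set from_tf=True" error when it cannot find PyTorch weights (pytorch_model.bin or a sharded index) in the resolved checkpoint but does see a TF 2.0 checkpoint (tf_model.h5); in practice it can also surface when the weights simply failed to download. A minimal sketch of that file-resolution logic, using a hypothetical helper name (suggested_load_kwargs is not part of transformers), can help diagnose what was actually downloaded:

```python
import os

def suggested_load_kwargs(checkpoint_dir: str) -> dict:
    """Inspect a local checkpoint directory and suggest kwargs for
    AutoModel.from_pretrained, mirroring how transformers decides
    whether from_tf=True is required (hypothetical helper, illustration only)."""
    files = set(os.listdir(checkpoint_dir))
    # Native PyTorch weights present: no extra flag needed.
    if "pytorch_model.bin" in files or "pytorch_model.bin.index.json" in files:
        return {}
    # Only a TF 2.0 checkpoint present: transformers needs from_tf=True
    # to convert the weights on load.
    if "tf_model.h5" in files:
        return {"from_tf": True}
    # Neither found: the download likely failed or the repo id is wrong.
    raise FileNotFoundError(f"No recognizable weight files in {checkpoint_dir}")
```

Usage would look like `AutoModelForCausalLM.from_pretrained(path, **suggested_load_kwargs(path))`. For this issue specifically, checking the Hugging Face cache directory for which files were actually fetched from decapoda-research/llama-7b-hf is a reasonable first step, since an incomplete download produces the same error message.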