Merge pull request #278 from KonradHoeffner/patch-1
Fix AutoGPTQ version to 0.2.2 in requirements.txt. Adapt README.md.
PromtEngineer authored Jul 26, 2023
2 parents 99b105b + dfa6f65 commit 2c3337c
Showing 2 changed files with 26 additions and 4 deletions.
28 changes: 25 additions & 3 deletions README.md
@@ -30,6 +30,7 @@ In order to set your environment up to run the code here, first install all requirements
```shell
pip install -r requirements.txt
```


If you want to use BLAS or Metal with [llama-cpp](https://github.com/abetlen/llama-cpp-python#installation-with-openblas--cublas--clblast--metal), you can set the appropriate flags:

@@ -46,7 +47,7 @@
```shell
git checkout v0.2.2
pip install .
```
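
As a quick sanity check after building from source (not part of the original instructions, just standard pip usage), the installed version can be confirmed with:

```shell
# Show metadata for the installed auto-gptq package, including its version
pip show auto-gptq
```

The `Version:` field should read `0.2.2` after checking out that tag.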

-For more support on [AutoGPTQ] (https://github.com/PanQiWei/AutoGPTQ).
+For more support on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ).

## Test dataset

@@ -127,7 +128,7 @@ GGML quantized models for Apple Silicon (M1/M2) are supported through the llama-

## Troubleshooting

**Install MPS:**
1- Follow this [page](https://developer.apple.com/metal/pytorch/) to build up PyTorch with Metal Performance Shaders (MPS) support. PyTorch uses the new MPS backend for GPU training acceleration. It is good practice to verify MPS support using a simple Python script, as mentioned in the provided link.

2- Following that page, here is an example of what you may run in your terminal:
@@ -143,7 +144,7 @@
```shell
pip install pdfminer.six
pip install xformers
```
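
The MPS verification mentioned in step 1 can be sketched with PyTorch's built-in checks; a minimal example (assuming PyTorch >= 1.12, which introduced the MPS backend):

```python
import torch

# True if the MPS backend is usable on this machine
# (requires Apple Silicon GPU and a recent macOS)
print(torch.backends.mps.is_available())

# True if this PyTorch binary was compiled with MPS support
print(torch.backends.mps.is_built())
```

If `is_built()` returns `False`, the installed wheel has no MPS support and reinstalling per the Apple guide above is the likely fix.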

**Upgrade packages:**
Your langchain or llama-cpp version could be outdated. Upgrade your packages by running install again.

@@ -292,6 +293,27 @@ This is a test project to validate the feasibility of a fully local solution for
```shell
pip install torch -f https://download.pytorch.org/whl/torch_stable.html
```
- [Torch not compatible with cuda enabled](https://github.com/pytorch/pytorch/issues/30664)
- Get your CUDA version
```shell
nvcc --version
```
```shell
nvidia-smi
```
- Try installing PyTorch depending on your CUDA version
```shell
conda install -c pytorch torchvision cudatoolkit=10.1 pytorch
```
- If that doesn't work, try reinstalling
```shell
pip uninstall torch
pip cache purge
pip install torch -f https://download.pytorch.org/whl/torch_stable.html
```
- [ERROR: pip's dependency resolver does not currently take into account all the packages that are installed](https://stackoverflow.com/questions/72672196/error-pips-dependency-resolver-does-not-currently-take-into-account-all-the-pa/76604141#76604141)
```shell
pip install h5py
```
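
Complementing the `nvcc`/`nvidia-smi` checks above, the CUDA version a given torch build targets can also be read from Python (a small sketch, assuming torch is installed):

```python
import torch

# CUDA toolkit version this torch wheel was built against (None for CPU-only builds)
print(torch.version.cuda)

# True only when a compatible NVIDIA driver and GPU are usable at runtime
print(torch.cuda.is_available())
```

A mismatch between this version and the one `nvidia-smi` reports is a common cause of the "Torch not compatible with cuda enabled" error.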
2 changes: 1 addition & 1 deletion requirements.txt
@@ -11,7 +11,7 @@ transformers
protobuf==3.20.0; sys_platform != 'darwin'
protobuf==3.20.0; sys_platform == 'darwin' and platform_machine != 'arm64'
protobuf==3.20.3; sys_platform == 'darwin' and platform_machine == 'arm64'
-auto-gptq
+auto-gptq==0.2.2
docx2txt

# Utilities
