LinkOutsideDestinationError when installing PrivateGPT #905
Comments
Having this same issue.
Also seeing this; here is an example failed GH Action run in case it is helpful: https://github.com/IntrinsicLabsAI/intrinsic-model-server/actions/runs/6868629622/job/18679613918
Bumping my Python patch release from 3.11.4 -> 3.11.5 fixed the issue for me. It looks like Poetry may have fixed this via an (as yet unreleased) backport: https://github.com/python-poetry/poetry/pull/8649/files So the best options seem to be either bumping the Python patch version or waiting for that Poetry backport to be released.
Had the same issue with Poetry 1.7; falling back to 1.6 worked as a workaround for me.
This issue was solved in Poetry 1.7.1.
I'm seeing this error using Python 3.11.7, Poetry 1.8.2, and llama-cpp-python 0.2.62, on Ubuntu 22.04 LTS.
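As the comments above suggest, whether the install fails depends on the tarfile extraction filter shipped with the interpreter: the PEP 706 "data" filter backported into Python 3.11.4 resolves a relative symlink's target against the extraction root rather than the member's own directory (visible at line 804 of the traceback below), so an in-archive symlink such as spm-headers/ggml.h is misclassified as escaping; 3.11.5 corrects this. Below is a minimal diagnostic sketch, assuming only the public tarfile.data_filter API on a PEP 706 interpreter (3.11.4+ or 3.12+); the member name and linkname are invented to mimic the sdist layout.

```python
# Diagnostic sketch: feed the "data" filter one synthetic member shaped like the
# symlink in the llama-cpp-python sdist (a relative link that stays inside the
# archive). On the filter shipped with 3.11.4 (as seen in this report) this raises
# LinkOutsideDestinationError; on 3.11.5+ it passes. Names below are illustrative.
import sys
import tarfile
import tempfile

member = tarfile.TarInfo("pkg/spm-headers/ggml.h")  # hypothetical member name
member.type = tarfile.SYMTYPE
member.linkname = "../ggml.h"  # resolves to pkg/ggml.h, still inside the archive

dest = tempfile.mkdtemp()      # stand-in for the extraction directory
try:
    # the filter Poetry 1.7.0 appears to apply when unpacking sdists
    tarfile.data_filter(member, dest)
    print(sys.version.split()[0], "- filter accepts in-archive relative symlinks (fixed)")
except tarfile.LinkOutsideDestinationError as exc:
    print(sys.version.split()[0], "- affected filter rejects the symlink:", exc)
```

If the second branch prints, the interpreter carries the pre-3.11.5 filter and any of the fixes mentioned above (a newer Python patch release, Poetry 1.6.x, or Poetry 1.7.1) should apply.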
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
Please provide a detailed written description of what you were trying to do, and what you expected llama-cpp-python to do.
Expecting it to install properly when Poetry is used to install it.
Current Behavior
Please provide a detailed written description of what llama-cpp-python did, instead.
Fails to install properly with the below command:
poetry install --with local
Package operations: 1 install, 0 updates, 0 removals
• Installing llama-cpp-python (0.2.17): Failed
LinkOutsideDestinationError
'llama_cpp_python-0.2.17/vendor/llama.cpp/spm-headers/ggml.h' would link to '/tmp/ggml.h', which is outside the destination
at /usr/lib/python3.11/tarfile.py:806 in _get_filtered_attrs
802│ if os.path.isabs(member.linkname):
803│ raise AbsoluteLinkError(member)
804│ target_path = os.path.realpath(os.path.join(dest_path, member.linkname))
805│ if os.path.commonpath([target_path, dest_path]) != dest_path:
→ 806│ raise LinkOutsideDestinationError(member, target_path)
807│ return new_attrs
808│
809│ def fully_trusted_filter(member, dest_path):
810│ return member
Cannot install llama-cpp-python.
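For orientation, the traceback above is raised inside CPython's tarfile module rather than by llama-cpp-python's own build: Poetry 1.7.0 appears to unpack the sdist with the PEP 706 "data" extraction filter, and that filter rejects the vendored spm-headers symlink. The toy sketch below (an invented archive, not the real sdist) triggers the same exception class end to end; note that it uses a symlink that genuinely escapes the destination, whereas the sdist's symlink stays inside the archive and is only misclassified by the 3.11.4 filter.

```python
# Toy end-to-end reproduction (assumes Python 3.11.4+ / 3.12+): build an archive
# containing a symlink that escapes the extraction root, then extract it with the
# same "data" filter. The exception type matches the failure log in this report.
import io
import tarfile
import tempfile

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    link = tarfile.TarInfo("pkg/headers/ggml.h")   # invented member name
    link.type = tarfile.SYMTYPE
    link.linkname = "../../../ggml.h"              # points outside the destination
    tar.addfile(link)

buf.seek(0)
with tempfile.TemporaryDirectory() as dest, tarfile.open(fileobj=buf, mode="r:gz") as tar:
    try:
        tar.extractall(dest, filter="data")        # raises via tarfile._get_filtered_attrs
    except tarfile.LinkOutsideDestinationError as exc:
        print("tarfile rejected the member:", exc)
```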
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
cat /etc/issue
Ubuntu 23.04 \n \l
uname -a
Linux hostname 6.2.0-36-generic #37-Ubuntu SMP PREEMPT_DYNAMIC Wed Oct 4 10:14:28 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
poetry --version
Poetry (version 1.7.0)
python3 -V
Python 3.11.4
lscpu
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Address sizes: 40 bits physical, 48 bits virtual
Byte Order: Little Endian
CPU(s): 8
On-line CPU(s) list: 0-7
Vendor ID: GenuineIntel
BIOS Vendor ID: QEMU
Model name: Common KVM processor
BIOS Model name: pc-i440fx-7.2 CPU @ 2.0GHz
BIOS CPU family: 1
CPU family: 15
Model: 6
Thread(s) per core: 1
Core(s) per socket: 4
Socket(s): 2
Stepping: 1
BogoMIPS: 4788.46
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx lm constant_tsc nopl xtopology cpuid tsc_known_freq pni cx16 x2apic hypervisor lahf_lm cpuid_fault pti
Virtualization features:
Hypervisor vendor: KVM
Virtualization type: full
Caches (sum of all):
L1d: 256 KiB (8 instances)
L1i: 256 KiB (8 instances)
L2: 32 MiB (8 instances)
L3: 32 MiB (2 instances)
NUMA:
NUMA node(s): 1
NUMA node0 CPU(s): 0-7
Proxmox 7.4-16 KVM virtual machine
make --version
GNU Make 4.3
Built for x86_64-pc-linux-gnu
Copyright (C) 1988-2020 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
g++ --version
g++ (Ubuntu 12.3.0-1ubuntu1~23.04) 12.3.0
Copyright (C) 2022 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Failure Information (for bugs)
The Poetry maintainers asked 4 days ago for this to be reported here for fixing:
python-poetry/poetry#8645
Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
1. On Python 3.11.4, install Poetry 1.7.0.
2. In a project whose lock file pins llama-cpp-python 0.2.17 (here, PrivateGPT's "local" dependency group), run poetry install --with local.
3. The install of llama-cpp-python fails with LinkOutsideDestinationError as shown below.
Failure Logs
Installing dependencies from lock file
Package operations: 1 install, 0 updates, 0 removals
• Installing llama-cpp-python (0.2.17): Failed
LinkOutsideDestinationError
'llama_cpp_python-0.2.17/vendor/llama.cpp/spm-headers/ggml.h' would link to '/tmp/ggml.h', which is outside the destination
at /usr/lib/python3.11/tarfile.py:806 in _get_filtered_attrs
802│ if os.path.isabs(member.linkname):
803│ raise AbsoluteLinkError(member)
804│ target_path = os.path.realpath(os.path.join(dest_path, member.linkname))
805│ if os.path.commonpath([target_path, dest_path]) != dest_path:
→ 806│ raise LinkOutsideDestinationError(member, target_path)
807│ return new_attrs
808│
809│ def fully_trusted_filter(member, dest_path):
810│ return member
Cannot install llama-cpp-python.