If you try to run:

./llava-v1.5-7b-q4-server.llamafile

and get:

error: APE is running on WIN32 inside WSL. You need to run: sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'

then try:

sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'
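For background: binfmt_misc is what hands files beginning with the PE magic "MZ" (which llamafiles have) to WSL's Windows interop layer, and writing -1 to an entry file under /proc/sys/fs/binfmt_misc unregisters that single handler. A minimal sketch that checks for the entry before removing it, so the command is a no-op outside WSL:

```shell
# Sketch: remove the WSLInterop binfmt_misc handler only if it exists.
# Writing -1 to an individual entry file unregisters just that handler.
entry=/proc/sys/fs/binfmt_misc/WSLInterop
if [ -e "$entry" ]; then
  cat "$entry"                      # first line reads "enabled" or "disabled"
  sudo sh -c "echo -1 > $entry"     # unregister the handler
else
  echo "WSLInterop entry not registered"
fi
```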
If ./llava-v1.5-7b-q4-server.llamafile still does not run, check whether the systemd-binfmt service is running:

○ systemd-binfmt.service
     Loaded: masked (Reason: Unit systemd-binfmt.service is masked.)
     Active: inactive (dead)
If it is not running:
sudo sh -c 'echo :WSLInterop:M::MZ::/init:PF > /usr/lib/binfmt.d/WSLInterop.conf'
sudo systemctl unmask systemd-binfmt.service
sudo systemctl restart systemd-binfmt
sudo systemctl mask systemd-binfmt.service
sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'
./llava-v1.5-7b-q4-server.llamafile
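For reference, the string written to WSLInterop.conf above follows the kernel's binfmt_misc registration format, :name:type:offset:magic:mask:interpreter:flags. A small sketch pulling the fields apart:

```shell
# In ':WSLInterop:M::MZ::/init:PF': match files by magic bytes (type M)
# "MZ" -- the DOS/PE header every Windows .exe starts with -- and hand
# them to /init (WSL's interop helper), with flags P (preserve argv[0])
# and F (fix binary: resolve the interpreter at registration time).
entry=':WSLInterop:M::MZ::/init:PF'
IFS=':' read -r _ name type offset magic mask interp flags <<< "$entry"
echo "name=$name type=$type magic=$magic interpreter=$interp flags=$flags"
```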
Now it is running:

warning: couldn't find nvcc (nvidia c compiler) try setting $CUDA_PATH if it's installed
prebuilt binary /zip/ggml-cuda.so not found
{"timestamp":1702568274,"level":"INFO","function":"main","line":2669,"message":"build info","build":1500,"commit":"a30b324"}
{"timestamp":1702568274,"level":"INFO","function":"main","line":2672,"message":"system info","n_threads":4,"n_threads_batch":-1,"total_threads":8,"system_info":"AVX = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 1 | ARM_FMA = 1 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 | "}
Multi Modal Mode Enabled
clip_model_load: model name: openai/clip-vit-large-patch14-336