mlipy

Pure Python-based Machine Learning Interface for multiple engines with multi-modal support.

Python HTTP Server/Client (including WebSocket streaming support) for multiple inference backends, such as llama.cpp.
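
The exact request format is not documented in this README, so the snippet below is only a rough sketch of what streaming over the WebSocket interface could look like, using the third-party websockets package; the host, port, endpoint path, and JSON fields are assumptions and need to be adapted to the actual server.

# Minimal streaming-client sketch (assumptions: server at
# ws://localhost:5000/api/1.0/text/completion and a JSON protocol
# with "prompt" and "text" fields -- adjust to the real mlipy protocol).
import asyncio
import json

import websockets  # pip install websockets


async def main():
    uri = 'ws://localhost:5000/api/1.0/text/completion'  # hypothetical endpoint

    async with websockets.connect(uri) as ws:
        # send a single prompt, then print streamed chunks as they arrive
        await ws.send(json.dumps({'prompt': 'Hello, world!'}))

        async for message in ws:
            chunk = json.loads(message)
            print(chunk.get('text', ''), end='', flush=True)


asyncio.run(main())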

Prerequisites

Debian/Ubuntu

sudo apt update -y
sudo apt install build-essential git curl libssl-dev libffi-dev pkg-config

Python

  1. Install Python from the default Ubuntu repositories:

sudo apt install python3.11 python3.11-dev python3.11-venv

  2. Alternatively, install Python from the deadsnakes PPA (external repository):

sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update -y
sudo apt install python3.11 python3.11-dev python3.11-venv

llama.cpp

cd ~
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make -j
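
As a quick sanity check that the build finished, you can look for the compiled binary before pointing the server at it. The binary name depends on the llama.cpp revision (older make builds produce main, newer ones llama-cli), so both names below are assumptions:

# Sanity-check the llama.cpp build (assumption: the binary is named
# either `llama-cli` or `main`, depending on the llama.cpp revision).
import pathlib

llama_dir = pathlib.Path.home() / 'llama.cpp'

for name in ('llama-cli', 'main'):
    binary = llama_dir / name

    if binary.exists():
        print(f'found llama.cpp binary: {binary}')
        break
else:
    print('no llama.cpp binary found -- did `make -j` finish without errors?')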

Run Development Server

Set up a virtualenv and install the requirements:

git clone https://github.com/mtasic85/mlipy.git
cd mlipy

python3.11 -m venv venv
source venv/bin/activate
pip install poetry
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
poetry install

Run server:

python -B -m mli.server --llama-cpp-path='~/llama.cpp'
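
To confirm the server came up, a simple reachability check is enough; the port below (5000) is an assumption, so use whatever port mli.server prints on startup:

# Check that the development server is accepting connections
# (assumption: it listens on localhost:5000).
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(2.0)
    result = sock.connect_ex(('127.0.0.1', 5000))

print('server is up' if result == 0 else f'no listener (errno {result})')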

Run Examples

Using GPU:

NGL=99 python -B examples/sync_demo.py
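
NGL presumably controls how many model layers are offloaded to the GPU (llama.cpp's n-gpu-layers setting); that mapping is an assumption here. If you prefer launching the demo from Python rather than the shell, setting the variable in the child environment is equivalent:

# Equivalent launch from Python: set NGL in the child process environment
# (assumption: NGL maps to the number of GPU-offloaded layers, 99 = all).
import os
import subprocess

env = dict(os.environ, NGL='99')
subprocess.run(['python', '-B', 'examples/sync_demo.py'], env=env, check=True)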

Using CPU:

python -B examples/sync_demo.py
python -B examples/async_demo.py
python -B examples/langchain_sync_demo.py
python -B examples/langchain_async_demo.py

Run Production Server

Generate self-signed SSL certificates

openssl req -x509 -nodes -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365
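
The generated key.pem/cert.pem pair is self-signed, so clients must be told to trust it explicitly. Below is a sketch of how a Python client could do that, assuming the server is reachable at https://localhost:5000 (host, port, and path are assumptions):

# Trust the self-signed certificate explicitly when talking to the server
# (assumptions: server at https://localhost:5000, cert.pem in the current dir).
import ssl
import urllib.request

ctx = ssl.create_default_context(cafile='cert.pem')
ctx.check_hostname = False  # the self-signed cert has no matching hostname

with urllib.request.urlopen('https://localhost:5000/', context=ctx) as resp:
    print(resp.status)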

Run

python3.11 -m venv venv
source venv/bin/activate
pip install -U mlipy
python -B -m mli.server
