feat: Docker support #25

Merged · 41 commits · Mar 18, 2024

Changes shown below are from 30 of the 41 commits.

Commits
d8bf74d
chore: fix start.sh for all environments
tnunamak Mar 4, 2024
89ceb15
feat: add Dockerfile to provide an easy way to run selfie with docker
volod-vana Mar 5, 2024
037da63
doc: Documentation on how to run selfie with docker
volod-vana Mar 5, 2024
79f10c6
feat: add --skip-deps and --skip-build to skip installing dependencie…
volod-vana Mar 5, 2024
6b2d072
Update Dockerfile
tnunamak Mar 5, 2024
a477247
Update start.sh
tnunamak Mar 5, 2024
955142f
Update README.md
tnunamak Mar 5, 2024
21ecb22
feat: add mapping for Huggingface cache dir to reduce model loading t…
volod-vana Mar 5, 2024
3d0fc4d
Simplify startup scripts
tnunamak Mar 5, 2024
9181128
Update README.md
tnunamak Mar 5, 2024
0515fb4
Merge branch 'fix-start-sh' into docker
volod-vana Mar 6, 2024
59ad015
Merge branch 'main' into docker
volod-vana Mar 7, 2024
b4ec0ef
feat: GPU support for Docker container
volod-vana Mar 7, 2024
cee0022
fix: deleting documents
tnunamak Mar 7, 2024
313a2df
fix: embeddings that are too large
tnunamak Mar 7, 2024
246ef06
fix: pyinstaller instructions
tnunamak Mar 8, 2024
961d5e5
Build macOS
tnunamak Mar 8, 2024
8b09a91
Split package workflow out to another branch
tnunamak Mar 8, 2024
ac36db1
Fix get_default_completion
tnunamak Mar 8, 2024
0a6d063
Merge branch 'fixes-3-7' into docker
volod-vana Mar 11, 2024
f89761b
feat: Optimize Dockerfile and remove start.sh dependency
volod-vana Mar 11, 2024
1fed0b9
feat: revert yarn.lock
volod-vana Mar 11, 2024
38bba3f
feat: update poetry
volod-vana Mar 11, 2024
26cd651
feat: separate files for GPU and CPU
volod-vana Mar 11, 2024
acad9ee
feat: add for nvidia docker install build tools and compilers
volod-vana Mar 11, 2024
f6e295e
feat: cleanup
volod-vana Mar 11, 2024
df479ea
feat: cleanup
volod-vana Mar 11, 2024
93c912b
feat: cleanup
volod-vana Mar 11, 2024
4479dc7
feat: cleanup
volod-vana Mar 11, 2024
71ee9d7
Merge branch 'main' into docker
volod-vana Mar 11, 2024
1de39d9
feat: add poetry.lock to .dockerignore
volod-vana Mar 11, 2024
64a0576
Update nvidia.Dockerfile
tnunamak Mar 11, 2024
9bec30a
Update .dockerignore
tnunamak Mar 11, 2024
f7e3580
Update cpu.Dockerfile
tnunamak Mar 11, 2024
34e7c60
feat: consolidating the different version into a single multi-stage D…
volod-vana Mar 12, 2024
c416bda
Use Nvidia image for selfie-gpu
tnunamak Mar 14, 2024
6eca14b
doc: updates to docker related doc
volod-vana Mar 15, 2024
3f1d1f3
feat: remove file based stages, code moved to single Dockerfile
volod-vana Mar 15, 2024
0e45e5b
fix: typo in doc
volod-vana Mar 18, 2024
e889f3f
doc: cleanup
volod-vana Mar 18, 2024
199ffbc
Doc edits
tnunamak Mar 18, 2024
1 change: 1 addition & 0 deletions Dockerfile
36 changes: 36 additions & 0 deletions README.md
@@ -127,6 +127,42 @@ For most users, the easiest way to install Selfie is to follow the [Quick Start]

> **Note**: You can host selfie at a publicly-accessible URL with [ngrok](https://ngrok.com). Add your ngrok token (and optionally, ngrok domain) in `selfie/.env` and run `poetry run python -m selfie --share`.

## Using Docker

You can also run Selfie using Docker. To do so, follow these steps:

1. Ensure that [Docker](https://www.docker.com) is installed.
2. Clone or [download](https://github.com/vana-com/selfie) the Selfie repository.
3. In a terminal, navigate to the project directory.

We provide Dockerfiles for different GPU configurations and CPU-only environments.
Choose the appropriate Dockerfile based on your setup:

- NVIDIA GPUs: use `nvidia.Dockerfile` (the default image, identical to `Dockerfile`) for systems with NVIDIA GPUs. It installs the necessary CUDA toolkit and the accelerated `llama-cpp-python` package.
- No GPU (CPU-only): use `cpu.Dockerfile` for systems without a GPU. It is based on the standard Python image and runs the application on the CPU.

To build and run the Docker image, run the following commands from the project directory:

```bash
# Build the Docker image
docker build -t selfie .

# Or, for the CPU-only image:
docker build -t selfie -f cpu.Dockerfile .

# Run the Docker container
docker run -p 8181:8181 \
--name selfie \
-v $(pwd)/data:/selfie/data \
-v $(pwd)/selfie:/selfie/selfie \
-v $(pwd)/selfie-ui:/selfie/selfie-ui \
-v $HOME/.cache/huggingface:/root/.cache/huggingface \
selfie:latest
```
This starts the server; the UI is available in your browser at http://0.0.0.0:8181/.
Your personal data is stored in the `data` directory.
Mounting your Hugging Face cache into the container means that models you have already downloaded are not downloaded again.
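
If you built the default (NVIDIA) image and want the container to use your GPU, you typically also need the NVIDIA Container Toolkit installed on the host and the `--gpus all` flag passed to `docker run`. A minimal sketch (all other flags match the command above):

```bash
# Sketch: same run command as above, plus GPU passthrough
# (requires the NVIDIA Container Toolkit on the host)
docker run -p 8181:8181 \
  --name selfie \
  --gpus all \
  -v $(pwd)/data:/selfie/data \
  -v $(pwd)/selfie:/selfie/selfie \
  -v $(pwd)/selfie-ui:/selfie/selfie-ui \
  -v $HOME/.cache/huggingface:/root/.cache/huggingface \
  selfie:latest
```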

## Setting Up Selfie

48 changes: 48 additions & 0 deletions cpu.Dockerfile
@@ -0,0 +1,48 @@
# Build the UI
FROM node:18.19-alpine3.18 AS selfie-ui

# Set the working directory in the container
WORKDIR /selfie

# Copy the package.json and yarn.lock files
COPY selfie-ui/package.json selfie-ui/yarn.lock ./

# Install dependencies
RUN yarn install --frozen-lockfile --non-interactive

# Copy the rest of the code
COPY selfie-ui/ .

# Build the project
RUN yarn run build

# Use the official Python image (CPU-only)
FROM python:3.11 AS selfie

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

# Set the working directory
WORKDIR /selfie

# Copy code and dependencies into the docker image
COPY . .

# Copy the built UI from the previous stage
COPY --from=selfie-ui /selfie/out/ ./selfie-ui/out

# Install poetry
RUN pip install poetry --no-cache-dir

# Install dependencies
RUN poetry config virtualenvs.create false
RUN poetry install --no-interaction --no-ansi

# Run the llama-cpp-python installation script
RUN bash /selfie/scripts/llama-cpp-python-cublas.sh

EXPOSE 8181

# Run the application
CMD ["python", "-m", "selfie"]
50 changes: 50 additions & 0 deletions nvidia.Dockerfile
@@ -0,0 +1,50 @@
# Build the UI
FROM node:18.19-alpine3.18 AS selfie-ui

# Set the working directory in the container
WORKDIR /selfie

# Copy the package.json and yarn.lock files
COPY selfie-ui/package.json selfie-ui/yarn.lock ./

# Install dependencies
RUN yarn install --frozen-lockfile --non-interactive

# Copy the rest of the code
COPY selfie-ui/ .

# Build the project
RUN yarn run build

# Use pytorch with CUDA support
FROM pytorch/pytorch:2.2.1-cuda12.1-cudnn8-runtime AS selfie

# Install build tools and compilers
RUN apt-get update && \
apt-get install -y build-essential

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_NO_CACHE_DIR=1

# Set the working directory
WORKDIR /selfie

# Copy code and dependencies into the docker image
COPY . .

# Copy the built UI from the previous stage
COPY --from=selfie-ui /selfie/out/ ./selfie-ui/out

# Install poetry
RUN pip install poetry

# Install dependencies
RUN poetry config virtualenvs.create false
RUN poetry install --no-interaction --no-ansi

EXPOSE 8181

# Run the application with GPU support
CMD ["python", "-m", "selfie", "--gpu"]