feat: Docker support #25

Merged
merged 41 commits into from
Mar 18, 2024

Conversation

volod-vana
Member

Dockerized the app for ease of use. The setup uses a two-stage build: the first stage builds the UI with a Node.js image and Yarn, and the second switches to a Python environment that adds OpenGL support and manages Python dependencies with Poetry.

The final image copies in the pre-built UI, exposes port 8181, and starts the application via a start.sh script. This keeps the Node.js and Python tooling in a single container, simplifying both deployment and operation.
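
For reference, a minimal sketch of the two-stage build described above (stage names, base image versions, package names, and output paths are assumptions, not the merged Dockerfile):

# Stage 1: build the UI with Node.js and Yarn
FROM node:20-slim AS ui-builder
WORKDIR /selfie-ui
COPY selfie-ui/package.json selfie-ui/yarn.lock ./
RUN yarn install --frozen-lockfile
COPY selfie-ui/ .
RUN yarn build

# Stage 2: Python runtime with OpenGL support and Poetry
FROM python:3.11-slim
WORKDIR /selfie
RUN apt-get update && apt-get install -y --no-install-recommends libgl1 && rm -rf /var/lib/apt/lists/*
ENV POETRY_VIRTUALENVS_CREATE=false
RUN pip install --no-cache-dir poetry
COPY pyproject.toml poetry.lock ./
RUN poetry install --no-root --no-interaction
COPY . .
# Bring in the pre-built UI from the first stage (output path assumed)
COPY --from=ui-builder /selfie-ui/out ./selfie-ui/out
EXPOSE 8181
CMD ["bash", "start.sh"]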

Image build

[screenshot: image build output]

Running in Docker

[screenshot: app running in a container]

The README has also been updated with instructions for building and running the Docker container:

# Build the image
docker build -t selfie .

# Run the container, mounting the local data and source directories
docker run -p 8181:8181 \
  --name selfie \
  -v $(pwd)/data:/selfie/data \
  -v $(pwd)/selfie:/selfie/selfie \
  -v $(pwd)/selfie-ui:/selfie/selfie-ui \
  selfie:latest

Possible Next Steps

  • Optimization of Build Times
  • Security Enhancements
  • Continuous Integration (CI) integration (see the sketch below)
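
A minimal sketch of what CI integration could look like with GitHub Actions, building the image on each push without publishing it (the workflow path, trigger branches, and Dockerfile name are assumptions; the PR later introduces cpu.Dockerfile and nvidia.Dockerfile, so the file input may need adjusting):

# .github/workflows/docker.yml (hypothetical)
name: docker-build
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - name: Build the image without pushing it
        uses: docker/build-push-action@v5
        with:
          context: .
          file: cpu.Dockerfile
          push: false
          tags: selfie:ci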

@volod-vana volod-vana requested review from tnunamak and Kahtaf March 5, 2024 18:10
@volod-vana volod-vana self-assigned this Mar 5, 2024
@tnunamak tnunamak changed the title Docker feat: Docker support Mar 5, 2024
Review threads (outdated, resolved): Dockerfile, start.sh, README.md
@tnunamak
Member

tnunamak commented Mar 5, 2024

It would be great if we could utilize any available special hardware (e.g. GPUs).

That said, Docker setups may work best with LLMs that are provided by an external API. Maybe we should document how to use Docker with something like an Ollama server.
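
A sketch of how that could be documented, assuming the app can read the server address from an environment variable (OLLAMA_BASE_URL here is a hypothetical name; the actual setting selfie reads is not defined in this PR):

# GPU passthrough (requires the NVIDIA Container Toolkit on the host)
docker run --gpus all -p 8181:8181 selfie:latest

# Point the containerized app at an Ollama server running on the host
# (11434 is Ollama's default port; OLLAMA_BASE_URL is a hypothetical variable)
docker run -p 8181:8181 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  selfie:latest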

volod-vana and others added 5 commits March 6, 2024 16:32
# Conflicts:
#	README.md
#	docs/images/playground-search.png
#	scripts/llama-cpp-python-cublas.sh
#	start.sh
Review threads (outdated, resolved): selfie-ui/yarn.lock, Dockerfile
@volod-vana volod-vana requested a review from tnunamak March 11, 2024 19:01
Review threads (outdated, resolved): cpu.Dockerfile, nvidia.Dockerfile, poetry.lock, .dockerignore, start.sh
@volod-vana volod-vana requested a review from tnunamak March 12, 2024 00:16
@volod-vana volod-vana merged commit 2c943db into main Mar 18, 2024
1 check passed
@volod-vana volod-vana deleted the docker branch March 18, 2024 16:58