
Fix docker-compose.yaml #2073

Closed
wants to merge 1 commit

Conversation

gabryz95

Description

Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.

Type of Change

Please delete options that are not relevant.

  • [x] Bug fix (non-breaking change which fixes an issue)

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.

  • [x] I can run the project via docker-compose now.

This change solves the problem described in issue #2071


@jaluma (Collaborator) left a comment


Please revert the changes I have marked for you. Instead, what we need to do to solve this problem is retry the initial connection to Ollama, e.g. https://stackoverflow.com/questions/50246304/using-python-decorators-to-retry-request
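The retry approach the reviewer suggests could be sketched as a plain decorator, in the spirit of the linked Stack Overflow answer (a minimal illustration only; `connect_to_ollama` is a hypothetical placeholder, not private-gpt's actual API):

```python
import functools
import time

def retry(exceptions, tries=5, delay=1.0, backoff=2.0):
    """Retry the wrapped function on the given exceptions, with exponential backoff."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, tries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == tries:
                        raise  # out of attempts: let the last error propagate
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator

@retry(ConnectionError, tries=5, delay=1.0)
def connect_to_ollama():
    # Hypothetical placeholder for the real client call,
    # e.g. opening http://ollama:11434 and checking it responds.
    ...
```

With this, a container that starts before Ollama is ready would simply retry the initial connection a few times instead of crashing, which is the behavior the reviewer is asking for.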

  private-gpt-ollama:
-    image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-ollama # x-release-please-version
+    user: root
+    image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-ollama

Please don't remove # x-release-please-version

  private-gpt-llamacpp-cpu:
-    image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-llamacpp-cpu # x-release-please-version
+    image: ${PGPT_IMAGE:-zylonai/private-gpt}:${PGPT_TAG:-0.6.2}-llamacpp-cpu

Please don't remove # x-release-please-version


  # Traefik reverse proxy for the Ollama service
  # This will route requests to the Ollama service based on the profile.
-  ollama:
+  traefik:

If you change the service name, you cannot connect to Ollama using http://ollama:11434

-  # Ollama service for the CPU mode
-  ollama-cpu:
+  # Ollama service
+  ollama:

Please go back to ollama-cpu. Traefik will route each request to these child services.
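The layout the reviewer is defending can be sketched roughly as follows (a hedged reconstruction from the diff above, not the project's exact file; images and profile names are illustrative): a service that keeps the name `ollama` runs the Traefik proxy, so clients still reach http://ollama:11434, while the per-profile backends such as `ollama-cpu` sit behind it as the routed children.

```yaml
services:
  ollama:                      # Traefik proxy, kept under the name clients expect
    image: traefik:v2.10       # version is illustrative
    # ...Traefik configuration that routes to whichever backend profile is active...

  ollama-cpu:                  # child service selected by the CPU profile
    image: ollama/ollama:latest
    profiles: ["ollama-cpu"]   # profile name is an assumption for this sketch
```

Renaming either service breaks this: the proxy must stay `ollama` for DNS-based discovery, and the backend must stay `ollama-cpu` for the profile routing to find it.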

jaluma commented Sep 16, 2024

We are closing this PR since the changes are not valid: they don't fix anything, they only prevent the errors from surfacing. Fixed at: #2059

jaluma closed this Sep 16, 2024