
feat(ollama): support calling the Ollama local process #2923

Merged · 42 commits · Jan 2, 2025

Commits
256a0d1
feat: support running ollama from the local binary
mdelapenya Dec 3, 2024
c43e5af
fix: wrong working dir at CI
mdelapenya Dec 3, 2024
fa2b345
chore: extract wait to a function
mdelapenya Dec 3, 2024
3ac88ef
chore: print local binary logs on error
mdelapenya Dec 3, 2024
9e63a7e
chore: remove debug logs
mdelapenya Dec 3, 2024
57ca76a
fix(ci): kill ollama before the tests
mdelapenya Dec 3, 2024
e4d2234
chore: stop ollama using systemctl
mdelapenya Dec 3, 2024
01134eb
chore: support setting log file from the env
mdelapenya Dec 4, 2024
5c1b404
chore: support running ollama commands, only
mdelapenya Dec 4, 2024
ce04a0e
fix: release lock on error
mdelapenya Dec 13, 2024
99e2655
chore: add more test coverage for the option
mdelapenya Dec 13, 2024
6c50334
chore: simplify useLocal checks
mdelapenya Dec 13, 2024
6a06b4d
chore: simpolify
mdelapenya Dec 13, 2024
bd85c0e
chore: pass context to runLocal
mdelapenya Dec 13, 2024
6239947
chore: move ctx to the right scope
mdelapenya Dec 13, 2024
811eb6d
chore: remove not needed
mdelapenya Dec 13, 2024
5556971
chore: use a container function
mdelapenya Dec 13, 2024
c68ff22
chore: support reading OLLAMA_HOST
mdelapenya Dec 13, 2024
b5e9874
chore: return error with copy APIs
mdelapenya Dec 13, 2024
c39e554
chore: simply execute the script
mdelapenya Dec 13, 2024
556c2f5
chore: simplify var initialisation
mdelapenya Dec 13, 2024
d3e7a49
chore: return nil
mdelapenya Dec 13, 2024
c38a640
fix: return errors on terminate
mdelapenya Dec 13, 2024
77a39f3
chore: remove options type
mdelapenya Dec 13, 2024
ffa0b2a
chore: use a map
mdelapenya Dec 13, 2024
6c39254
chor: simplify error on wait
mdelapenya Dec 13, 2024
91396ee
chore: wrap start logic around the localContext
mdelapenya Dec 16, 2024
7fa26ee
chor: fold
mdelapenya Dec 16, 2024
ebad12c
chore: merge wait into start
mdelapenya Dec 16, 2024
88f58af
fix: use proper ContainersState
mdelapenya Dec 16, 2024
80c76f9
fix: remove extra conversion
mdelapenya Dec 16, 2024
8c0ee3d
chore: handle remove log file errors properly
mdelapenya Dec 16, 2024
de1339a
chore: go back to string in env vars
mdelapenya Dec 16, 2024
8a18b3b
refactor(ollama): local process
stevenh Dec 17, 2024
2a3a30d
chore(ollama): refactor local to use log sub match.
stevenh Dec 18, 2024
5e77b27
feat(ollama): validate container request
stevenh Dec 18, 2024
5c0486a
chore(ollama): remove temporary test
stevenh Dec 18, 2024
bbd6242
feat(ollama): configurable local process binary
stevenh Dec 20, 2024
cb684b4
docs(ollama): detail local process supported fields
stevenh Dec 20, 2024
9be6309
docs(ollama): update local process site docs
stevenh Dec 20, 2024
4c3a06c
chore: refactor to support TerminateOption
stevenh Jan 2, 2025
39e7af4
fix: remove unused var
stevenh Jan 2, 2025
6 changes: 6 additions & 0 deletions .github/scripts/modules/ollama/install-dependencies.sh
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

curl -fsSL https://ollama.com/install.sh | sh

# kill any running ollama process so that the tests can start from a clean state
sudo systemctl stop ollama.service
10 changes: 10 additions & 0 deletions .github/workflows/ci-test-go.yml
@@ -107,6 +107,16 @@ jobs:
        working-directory: ./${{ inputs.project-directory }}
        run: go build

      - name: Install dependencies
        shell: bash
        run: |
          SCRIPT_PATH="./.github/scripts/${{ inputs.project-directory }}/install-dependencies.sh"
          if [ -f "$SCRIPT_PATH" ]; then
            $SCRIPT_PATH
          else
            echo "No dependencies script found at $SCRIPT_PATH - skipping installation"
          fi
      - name: go test
        # only run tests on linux, there are a number of things that won't allow the tests to run on anything else
        # many (maybe, all?) images used can only be built on Linux, they don't have Windows in their manifest, and
50 changes: 50 additions & 0 deletions docs/modules/ollama.md
@@ -16,10 +16,15 @@ go get github.com/testcontainers/testcontainers-go/modules/ollama

## Usage example

The module allows you to run the Ollama container or the local Ollama binary.

<!--codeinclude-->
[Creating an Ollama container](../../modules/ollama/examples_test.go) inside_block:runOllamaContainer
[Running the local Ollama binary](../../modules/ollama/examples_test.go) inside_block:localOllama
<!--/codeinclude-->

If the local Ollama binary fails to execute, the module falls back to the container version of Ollama.

## Module Reference

### Run function
@@ -48,6 +53,51 @@ When starting the Ollama container, you can pass options in a variadic way to configure it
If you need to set a different Ollama Docker image, you can set a valid Docker image as the second argument in the `Run` function.
E.g. `Run(context.Background(), "ollama/ollama:0.1.25")`.

#### Use Local

- Not available until the next release of testcontainers-go <a href="https://github.com/testcontainers/testcontainers-go"><span class="tc-version">:material-tag: main</span></a>

!!!warning
    Please make sure the local Ollama binary is not already running when using the local version of the module:
    Ollama can be started as a system service, or as part of the Ollama application,
    and interacting with the logs of a running Ollama process not managed by the module is not supported.

If you need to run the local Ollama binary, set the `UseLocal` option in the `Run` function.
This option accepts a list of environment variables, passed as `KEY=VALUE` strings, which are applied to the Ollama binary when executing commands.

E.g. `Run(context.Background(), "ollama/ollama:0.1.25", WithUseLocal("OLLAMA_DEBUG=true"))`.
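A minimal sketch of starting the local binary with more than one environment variable (assuming `WithUseLocal` is variadic over `KEY=VALUE` strings; `OLLAMA_KEEP_ALIVE` is just an illustrative Ollama setting, and the imports are the same as in the full example further below):

```go
ctx := context.Background()

// Start the local Ollama binary instead of the container image.
ollamaContainer, err := tcollama.Run(
	ctx,
	"ollama/ollama:0.3.13",
	tcollama.WithUseLocal("OLLAMA_DEBUG=true", "OLLAMA_KEEP_ALIVE=10m"),
)
if err != nil {
	log.Printf("failed to start local ollama: %s", err)
	return
}
defer func() {
	// TerminateContainer stops the local process and removes its log file.
	if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
		log.Printf("failed to terminate: %s", err)
	}
}()
```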

All the container methods are available when using the local Ollama binary, but will be executed locally instead of inside the container.
Please consider the following differences when using the local Ollama binary:

- The local Ollama binary creates a log file in the current working directory, named after the session ID, e.g. `local-ollama-<session-id>.log`. You can set the log file name with the `OLLAMA_LOGFILE` environment variable, so if you run Ollama yourself, from the Ollama app or the standalone binary, you can point the module at that same log file.
    - For the Ollama app, the default log file resides at `$HOME/.ollama/logs/server.log`.
    - For the standalone binary, start it with its logs redirected to a file, e.g. `ollama serve > /tmp/ollama.log 2>&1`.
- `ConnectionString` returns the connection string to connect to the local Ollama binary started by the module instead of the container.
- `ContainerIP` returns the bound host IP `127.0.0.1` by default.
- `ContainerIPs` returns the bound host IP `["127.0.0.1"]` by default.
- `CopyToContainer`, `CopyDirToContainer`, `CopyFileToContainer` and `CopyFileFromContainer` return an error if called.
- `GetLogProductionErrorChannel` returns a nil channel.
- `Endpoint` returns the endpoint to connect to the local Ollama binary started by the module instead of the container.
- `Exec` passes the command to the local Ollama binary started by the module instead of inside the container. The first element of the command slice must be the `ollama` binary and the remaining elements its arguments; otherwise an error is returned.
- `GetContainerID` returns the container ID of the local Ollama binary started by the module instead of the container, which maps to `local-ollama-<session-id>`.
- `Host` returns the bound host IP `127.0.0.1` by default.
- `Inspect` returns a `ContainerJSON` describing the state of the local Ollama binary started by the module.
- `IsRunning` returns true if the local Ollama binary process started by the module is running.
- `Logs` returns the logs from the local Ollama binary started by the module instead of the container.
- `MappedPort` returns the port mapping for the local Ollama binary started by the module instead of the container.
- `Start` starts the local Ollama binary process.
- `State` returns the current state of the local Ollama binary process, `stopped` or `running`.
- `Stop` stops the local Ollama binary process.
- `Terminate` calls the `Stop` method and then removes the log file.

The local Ollama binary creates a log file in the current working directory, and its contents are available through the container's `Logs` method.
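As a sketch of how a few of these methods behave against the local process (continuing from the snippet above; the `ollama list` subcommand is illustrative):

```go
// GetContainerID returns the synthetic ID of the local process.
id := ollamaContainer.GetContainerID() // e.g. "local-ollama-<session-id>"

// ConnectionString points at the locally bound host and port.
connectionStr, err := ollamaContainer.ConnectionString(ctx)
if err != nil {
	log.Printf("failed to get connection string: %s", err)
	return
}
log.Printf("local ollama %s listening at %s", id, connectionStr)

// Exec runs the ollama CLI against the local process.
if _, _, err := ollamaContainer.Exec(ctx, []string{"ollama", "list"}); err != nil {
	log.Printf("failed to list models: %s", err)
	return
}

// Logs streams the local log file instead of container logs.
rc, err := ollamaContainer.Logs(ctx)
if err != nil {
	log.Printf("failed to read logs: %s", err)
	return
}
defer rc.Close()
```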

!!!info
    The local Ollama binary uses the `OLLAMA_HOST` environment variable to set the host and port to listen on.
    If the environment variable is not set, it defaults to `localhost:0`,
    which binds to a loopback address on an ephemeral port to avoid port conflicts.
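For example, to pin the local process to a fixed port instead of an ephemeral one, `OLLAMA_HOST` can be set through the same option (a sketch; the port value is illustrative):

```go
ollamaContainer, err := tcollama.Run(
	ctx,
	"ollama/ollama:0.3.13",
	tcollama.WithUseLocal("OLLAMA_HOST=localhost:11434"),
)
```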

{% include "../features/common_functional_options.md" %}

### Container Methods
70 changes: 70 additions & 0 deletions modules/ollama/examples_test.go
@@ -173,3 +173,73 @@ func ExampleRun_withModel_llama2_langchain() {

// Intentionally not asserting the output, as we don't want to run this example in the tests.
}

func ExampleRun_withLocal() {
	ctx := context.Background()

	// localOllama {
	ollamaContainer, err := tcollama.Run(ctx, "ollama/ollama:0.3.13", tcollama.WithUseLocal("OLLAMA_DEBUG=true"))
	defer func() {
		if err := testcontainers.TerminateContainer(ollamaContainer); err != nil {
			log.Printf("failed to terminate container: %s", err)
		}
	}()
	if err != nil {
		log.Printf("failed to start container: %s", err)
		return
	}
	// }

	model := "llama3.2:1b"

	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", model})
	if err != nil {
		log.Printf("failed to pull model %s: %s", model, err)
		return
	}

	_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "run", model})
	if err != nil {
		log.Printf("failed to run model %s: %s", model, err)
		return
	}

	connectionStr, err := ollamaContainer.ConnectionString(ctx)
	if err != nil {
		log.Printf("failed to get connection string: %s", err)
		return
	}

	var llm *langchainollama.LLM
	if llm, err = langchainollama.New(
		langchainollama.WithModel(model),
		langchainollama.WithServerURL(connectionStr),
	); err != nil {
		log.Printf("failed to create langchain ollama: %s", err)
		return
	}

	completion, err := llm.Call(
		context.Background(),
		"how can Testcontainers help with testing?",
		llms.WithSeed(42),         // fix the seed so the completion is reproducible
		llms.WithTemperature(0.0), // the lower the temperature, the more deterministic the completion
	)
	if err != nil {
		log.Printf("failed to generate a completion: %s", err)
		return
	}

	words := []string{
		"easy", "isolation", "consistency",
	}
	lwCompletion := strings.ToLower(completion)

	for _, word := range words {
		if strings.Contains(lwCompletion, word) {
			fmt.Println(true)
		}
	}

	// Intentionally not asserting the output, as we don't want to run this example in the tests.
}
2 changes: 1 addition & 1 deletion modules/ollama/go.mod
@@ -4,6 +4,7 @@ go 1.22

require (
	github.com/docker/docker v27.1.1+incompatible
	github.com/docker/go-connections v0.5.0
	github.com/google/uuid v1.6.0
	github.com/stretchr/testify v1.9.0
	github.com/testcontainers/testcontainers-go v0.34.0
@@ -22,7 +23,6 @@ require (
	github.com/davecgh/go-spew v1.1.1 // indirect
	github.com/distribution/reference v0.6.0 // indirect
	github.com/dlclark/regexp2 v1.8.1 // indirect
	github.com/docker/go-connections v0.5.0 // indirect
	github.com/docker/go-units v0.5.0 // indirect
	github.com/felixge/httpsnoop v1.0.4 // indirect
	github.com/go-logr/logr v1.4.1 // indirect