
[Bug]: Compose module does not keep connection alive to Ryuk reaper #2754

Closed
aabmass opened this issue Aug 29, 2024 · 1 comment

Labels
bug An issue with the library

Comments

@aabmass

aabmass commented Aug 29, 2024

Testcontainers version

v0.33.0

Using the latest Testcontainers version?

Yes

Host OS

Linux

Host arch

amd64

Go version

1.23

Docker version

Client: Docker Engine - Community
 Version:           27.2.0
 API version:       1.47
 Go version:        go1.21.13
 Git commit:        3ab4256
 Built:             Tue Aug 27 14:15:27 2024
 OS/Arch:           linux/amd64
 Context:           default

Server: Docker Engine - Community
 Engine:
  Version:          27.2.0
  API version:      1.47 (minimum version 1.24)
  Go version:       go1.21.13
  Git commit:       3ab5c7d
  Built:            Tue Aug 27 14:15:27 2024
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.7.21
  GitCommit:        472731909fa34bd7bc9c087e4c27943f9835f111
 runc:
  Version:          1.1.13
  GitCommit:        v1.1.13-0-g58aa920
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0

Docker info

Client: Docker Engine - Community
 Version:    27.2.0
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.16.2
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.29.2
    Path:     /usr/libexec/docker/cli-plugins/docker-compose

Server:
 Containers: 0
  Running: 0
  Paused: 0
  Stopped: 0
 Images: 168
 Server Version: 27.2.0
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Using metacopy: false
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: systemd
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 472731909fa34bd7bc9c087e4c27943f9835f111
 runc version: v1.1.13-0-g58aa920
 init version: de40ad0
 Security Options:
  apparmor
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.7.12-1rodete1-amd64
 Operating System: Debian GNU/Linux rodete
 OSType: linux
 Architecture: x86_64
 CPUs: 48
 Total Memory: 188.7GiB
 Name: foo
 ID: 66492cf4-373d-4dbc-9db0-42677af4076f
 Docker Root Dir: /usr/local/google/docker
 Debug Mode: true
  File Descriptors: 23
  Goroutines: 41
  System Time: 2024-08-29T19:41:44.058143085Z
  EventsListeners: 0
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Registry Mirrors:
  https://mirror.gcr.io/
 Live Restore Enabled: false
 Default Address Pools:
   Base: 192.168.8.0/22, Size: 24

What happened?

When using the compose module, the reaper (Ryuk) stops containers in the middle of the test, after a successful call to Up(). I don't know much about Ryuk, but it seems like this library should hold open a connection to the reaper until Down() is called; otherwise the reaper eagerly stops containers. Here is a repro:

package main

import (
	"context"
	"strings"
	"testing"
	"time"

	"github.com/stretchr/testify/require"
	"github.com/testcontainers/testcontainers-go/modules/compose"
)

const composeText = `
services:
  sleep:
    image: busybox
    command: sleep 60
`

func Test(t *testing.T) {
	ctx := context.Background()
	composeStack, err := compose.NewDockerComposeWith(compose.WithStackReaders(strings.NewReader(composeText)))
	require.NoError(t, err)

	err = composeStack.Up(ctx)
	require.NoError(t, err)
	t.Cleanup(func() { composeStack.Down(ctx) })

	// Do some verifications or have to wait for something to happen in the compose services
	time.Sleep(time.Second * 20)

	// The container should still be running... but what actually happens is the reaper
	// stops my containers before the test is done.
	container, err := composeStack.ServiceContainer(ctx, "sleep")
	require.NoError(t, err)
	require.True(t, container.IsRunning())
}

Relevant log output

$ go test . -v  
=== RUN   Test
2024/08/29 19:49:06 github.com/testcontainers/testcontainers-go - Connected to docker: 
  Server Version: 27.2.0
  API Version: 1.46
  Operating System: Debian GNU/Linux rodete
  Total Memory: 193221 MB
  Testcontainers for Go Version: v0.33.0
  Resolved Docker Host: unix:///var/run/docker.sock
  Resolved Docker Socket Path: /var/run/docker.sock
  Test SessionID: b566dffb2a6503ebee2791158c77f048327f12dade40bf42c5578d49b6f388cf
  Test ProcessID: f39872aa-89db-41e1-b3d4-40cc0412bd1b
2024/08/29 19:49:06 🐳 Creating container for image testcontainers/ryuk:0.8.1
2024/08/29 19:49:06 ✅ Container created: 424e3d0e362c
2024/08/29 19:49:06 🐳 Starting container: 424e3d0e362c
2024/08/29 19:49:06 ✅ Container started: 424e3d0e362c
2024/08/29 19:49:06 ⏳ Waiting for container id 424e3d0e362c image: testcontainers/ryuk:0.8.1. Waiting for: &{Port:8080/tcp timeout:<nil> PollInterval:100ms skipInternalCheck:false}
2024/08/29 19:49:07 🔔 Container is ready: 424e3d0e362c
 Network ec575809-8848-4714-8b51-957d5312c9b7_default  Creating
 Network ec575809-8848-4714-8b51-957d5312c9b7_default  Created
 Container ec575809-8848-4714-8b51-957d5312c9b7-sleep-1  Creating
 Container ec575809-8848-4714-8b51-957d5312c9b7-sleep-1  Created
 Container ec575809-8848-4714-8b51-957d5312c9b7-sleep-1  Starting
 Container ec575809-8848-4714-8b51-957d5312c9b7-sleep-1  Started
    example_test.go:36: 
                Error Trace:    /usr/local/google/home/aaronabbott/tmp/testcontainersrepro/example_test.go:36
                Error:          Should be true
                Test:           Test
--- FAIL: Test (21.68s)
FAIL
FAIL    testcontainersrepro     22.111s
FAIL

Reaper logs:

$ docker logs -f reaper_d3407a5c6268e30af0971495c546de5eec4519c9579155f85294b22bbab513bb 
2024/08/29 19:46:09 Pinging Docker...
2024/08/29 19:46:09 Docker daemon is available!
2024/08/29 19:46:09 Starting on port 8080...
2024/08/29 19:46:09 Started!
2024/08/29 19:46:09 New client connected: 192.168.9.1:60638
2024/08/29 19:46:09 Adding {"label":{"org.testcontainers.lang=go":true,"org.testcontainers.sessionId=d3407a5c6268e30af0971495c546de5eec4519c9579155f85294b22bbab513bb":true,"org.testcontainers.version=0.33.0":true,"org.testcontainers=true":true}}
2024/08/29 19:46:09 New client connected: 192.168.9.1:60648
2024/08/29 19:46:09 Adding {"label":{"org.testcontainers.lang=go":true,"org.testcontainers.sessionId=d3407a5c6268e30af0971495c546de5eec4519c9579155f85294b22bbab513bb":true,"org.testcontainers.version=0.33.0":true,"org.testcontainers=true":true}}
2024/08/29 19:46:09 Client disconnected: 192.168.9.1:60648
2024/08/29 19:46:09 Client disconnected: 192.168.9.1:60638
2024/08/29 19:46:19 Timeout waiting for connection
2024/08/29 19:46:20 Removed 1 container(s), 1 network(s), 0 volume(s) 0 image(s)
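
Based on the log pattern above, my rough understanding of the Ryuk protocol (an assumption on my part, not taken from the reaper source) is: each client opens a TCP connection to the reaper's exposed port, sends a label filter describing the resources to watch, and the reaper only starts its prune countdown once every client has disconnected. A minimal sketch of the kind of keep-alive connection I would expect the compose module to hold, with placeholder address and session ID values:

package main

import (
	"bufio"
	"fmt"
	"net"
	"time"
)

// keepReaperAlive registers a label filter with the reaper and then simply
// holds the TCP connection open until done is closed. While at least one
// such connection exists, the reaper should not prune the matching resources.
func keepReaperAlive(addr, sessionID string, done <-chan struct{}) error {
	conn, err := net.Dial("tcp", addr) // e.g. the mapped 8080/tcp port of the ryuk container
	if err != nil {
		return err
	}
	defer conn.Close()

	// Tell the reaper which resources to watch: everything labelled with this session ID.
	filter := fmt.Sprintf("label=org.testcontainers.sessionId=%s", sessionID)
	if _, err := fmt.Fprintf(conn, "%s\n", filter); err != nil {
		return err
	}
	if ack, err := bufio.NewReader(conn).ReadString('\n'); err != nil || ack != "ACK\n" {
		return fmt.Errorf("unexpected reaper response %q: %w", ack, err)
	}

	// The important part: keep the connection open for the whole test run.
	// Once every client disconnects, the reaper starts its timeout and prunes.
	<-done
	return nil
}

func main() {
	done := make(chan struct{})
	go func() { time.Sleep(20 * time.Second); close(done) }()
	_ = keepReaperAlive("127.0.0.1:32768", "example-session-id", done) // placeholder values
}

In the failing run above, both clients disconnect at 19:46:09 and the reaper prunes ten seconds later ("Timeout waiting for connection"), which is consistent with nothing holding a connection open between Up() and Down().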

Additional information

I've seen the configuration options (https://golang.testcontainers.org/features/configuration/), but AFAICT none of them really solves the problem. The Go compose module doesn't try to reconnect to the reaper, and increasing timeouts is just a band-aid.
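
For reference, the kind of timeout increase I mean is roughly the following (assuming the RYUK_CONNECTION_TIMEOUT environment variable from the configuration page linked above; the exact key is an assumption on my part). It only postpones the reaping rather than restoring a keep-alive connection:

package main

import (
	"os"
	"testing"
)

// TestMain raises the reaper connection timeout before any containers are
// created. RYUK_CONNECTION_TIMEOUT is assumed from the configuration docs;
// this only delays the premature pruning, it does not fix the missing keep-alive.
func TestMain(m *testing.M) {
	os.Setenv("RYUK_CONNECTION_TIMEOUT", "5m")
	os.Exit(m.Run())
}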

This is different from #2621 because compose comes up successfully.

@aabmass aabmass added the bug An issue with the library label Aug 29, 2024
@stevenh
Collaborator

stevenh commented Oct 19, 2024

Thanks for the report and for providing a simple test to reproduce.

This isn't actually the reaper that's causing this test to fail; it's the fact that not all fields are initialised correctly by compose, see #2667.

It's possible the recent reaper work has corrected the original issue; if you can still reproduce the original symptoms, let us know and we can re-open.

@stevenh stevenh closed this as completed Oct 19, 2024