
Docker multi stage build ignoring intermediate stages #5294

Closed
htonkovac opened this issue Jan 25, 2021 · 10 comments
htonkovac commented Jan 25, 2021

  • [√] I have tried with the latest version of Docker Desktop
  • [√] I have tried disabling enabled experimental features
  • [√] I have uploaded Diagnostics
  • Diagnostics ID: 9BBA6AD3-318A-403A-B064-1001484A3145/20210125183936

Expected behavior

FROM golang as base
RUN echo base
FROM base as test
RUN echo test
FROM base as final
RUN echo final

The Dockerfile above should run all three stages; "RUN echo test" should be visible in the build output.

Actual behavior

✗ docker build -t testing .
[+] Building 0.8s (7/7) FINISHED                                          
 => [internal] load build definition from Dockerfile                 0.0s
 => => transferring dockerfile: 141B                                 0.0s
 => [internal] load .dockerignore                                    0.0s
 => => transferring context: 2B                                      0.0s
 => [internal] load metadata for docker.io/library/golang:latest     0.0s
 => CACHED [base 1/2] FROM docker.io/library/golang                  0.0s
 => [base 2/2] RUN echo base                                         0.3s
 => [final 1/1] RUN echo final                                       0.4s
 => exporting to image                                               0.0s
 => => exporting layers                                              0.0s
 => => writing image sha256:28250229eb6bd9546ef91e5b4fb39b1b3661e69  0.0s
 => => naming to docker.io/library/testing                           0.0s

The middle stage is skipped: the output should include "[test 1/1] RUN echo test", but it does not.

Information

  • macOS Version: macOS Big Sur 11.1 (20C69)
  • hardware: MacBook Pro (16-inch, 2019)

Diagnostic logs

{
  "id": "9BBA6AD3-318A-403A-B064-1001484A3145",
  "date": "2021-01-25 18:39:36.685445 +0000 UTC",
  "os": "macOS 11.1",
  "os_label": "osx/11.1.x",
  "app_version": "3.1.0",
  "app_channel": "stable",
  "engine_version": "20.10.2",
  "compose_version": "1.27.4",
  "kubernetes_version": "v1.19.3",
  "credhelper_version": "0.6.3",
  "notary_version": "0.6.1",
  "vpnkit_version": "ea9dbeaf887f5dad8391f4a34d127501fb6bbf64",
  "hyperkit_version": "v0.20200224-44-gb54460"
}

Steps to reproduce the behavior

  1. Run docker build
@stephen-turner
Contributor

I'm not a build expert (the builder is an upstream component for us), but my understanding is that this is the correct behaviour: stages that can't affect the final image are skipped.
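If you do want a skipped stage to run, BuildKit's `--target` flag builds a named stage explicitly. A minimal sketch using the stage names from the Dockerfile in the issue description (the `testing:test` tag is illustrative):

```shell
# Build only up to the "test" stage of the reporter's Dockerfile;
# this forces "RUN echo test" to execute even though the "final"
# stage does not depend on it.
docker build --target test -t testing:test .
```

This is typically how CI pipelines exercise test-only stages without disabling BuildKit.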

@htonkovac
Author

htonkovac commented Jan 26, 2021 via email

@stephen-turner
Contributor

You might be using the old Docker build on Linux, which I think is still the default, and which had fewer optimisations. Try invoking buildkit explicitly (with something like DOCKER_BUILDKIT=1 docker build .) and see if you get the same output.

@thaJeztah
Member

This is the expected behaviour when building with BuildKit: it only builds the stages that are needed for the final image. BuildKit constructs a "dependency tree" of the steps in the Dockerfile, which allows it to skip unneeded stages and also to build stages in parallel.

Building stages only when they are needed allows you to have a single Dockerfile for different scenarios, for example when different steps have to be executed depending on architecture (when building multi-arch images), or to have (e.g.) stages that build different artefacts.

With this, it's also possible to construct if/else-like Dockerfiles, where passing a build-arg allows you to customise which steps are executed; the example below illustrates how to have different paths for "production" and "debugging":

# build-arg to switch between build-types
ARG BUILD_TYPE=production

FROM busybox AS base
RUN mkdir -p /out

# Example build stage with "debug" enabled
FROM base AS build-debug
RUN echo "debugging enabled" > /out/foo.txt

# Example build stage for "production"
FROM base AS build-production
RUN echo "production build" > /out/foo.txt

# depending on $BUILD_TYPE, this will either be
# "build-debug" or "build-production" (production is the default)
FROM build-${BUILD_TYPE} AS build

# Common build-stage
FROM base AS common
RUN echo "common succeeded" > /out/bar.txt

FROM base AS final
COPY --from=common /out/bar.txt /out/bar.txt
COPY --from=build  /out/foo.txt /out/foo.txt
CMD cat /out/*

Building and running the Dockerfile twice: once without a build-arg (which uses the default, "production"), and once with a build-arg:

$ docker build -t prod .
$ docker build -t debug --build-arg BUILD_TYPE=debug .

$ docker run --rm prod
common succeeded
production build

$ docker run --rm debug
common succeeded
debugging enabled

More complete examples of these features can be found in the Dockerfile for the Docker documentation, and in the Dockerfile for the Docker daemon (which uses if/else constructs to enable stages for cross-compiling, and to control whether or not to include the systemd code used in CI).

@adamdavis40208

adamdavis40208 commented Mar 25, 2021

I noticed the new output in the recent release, but didn't realize "unneeded" multi-stage builds would be skipped by default. We use an intermediate layer to install test-only requirements/run tests, and it's completely skipping over that.

Setting

DOCKER_BUILDKIT=0

gets me back to matching my pipeline (which is also breaking), but it's a heck of a change to make the default.
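One way to keep BuildKit enabled while ensuring a test stage always runs is to make the final stage copy an artifact out of it, which adds the test stage to the dependency tree. A sketch based on the reproduction Dockerfile above (the `/test-ran` marker file is illustrative):

```dockerfile
FROM golang AS base
RUN echo base

FROM base AS test
# Run the tests, then leave a marker file behind on success.
RUN echo test && touch /test-ran

FROM base AS final
# Copying from "test" makes "final" depend on it,
# so BuildKit can no longer skip the stage.
COPY --from=test /test-ran /test-ran
RUN echo final
```

With this pattern a plain `docker build .` fails whenever the test stage fails, with no BuildKit flags required.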

@albertotn

Same problem on Windows 10; changing this default without a proper explanation is not exactly a nice move. Thanks @adamdavis40208 for pointing out the solution. On Windows, just type

set DOCKER_BUILDKIT=0

before launching your docker multi-stage build.

@docker-robott
Collaborator

Issues go stale after 90 days of inactivity.
Mark the issue as fresh with a /remove-lifecycle stale comment.
Stale issues will be closed after an additional 30 days of inactivity.

Prevent issues from auto-closing with a /lifecycle frozen comment.

If this issue is safe to close now please do so.

Send feedback to Docker Community Slack channels #docker-for-mac or #docker-for-windows.
/lifecycle stale

@albertotn

/lifecycle frozen

@albertotn

For me this problem is still present on Windows; I think the same applies on macOS?

@docker-robott
Collaborator

Closed issues are locked after 30 days of inactivity.
This helps our team focus on active issues.

If you have found a problem that seems similar to this, please open a new issue.

Send feedback to Docker Community Slack channels #docker-for-mac or #docker-for-windows.
/lifecycle locked

@docker docker locked and limited conversation to collaborators Nov 5, 2021