podman push twice results in two different digests #6496

Closed
cdjohnson opened this issue Jun 4, 2020 · 16 comments

@cdjohnson

/kind bug

Description

If I attempt to push the same tag to docker.io twice in a row, I get a different digest.

If I remove the blob cache /var/lib/containers/cache/blob-info-cache-v1.boltdb, then the correct digest is calculated.

Steps to reproduce the issue:

docker build... -t docker.io/cdjohnson/test:test
podman push docker.io/cdjohnson/test:test
skopeo inspect docker://docker.io/cdjohnson/test:test | jq .Digest
"sha256:8f69f5bcb72ac644ec82a5dd01f56620226e9dc5b0dcf3647a5a3b7b1993f002"

podman push docker.io/cdjohnson/test:test
skopeo inspect docker://docker.io/cdjohnson/test:test | jq .Digest
"sha256:29996e2fe0d7bb953724001f8fd23e00c9313a684c2516c12537a6d2cd9af2fa"

rm /var/lib/containers/cache/blob-info-cache-v1.boltdb
podman push docker.io/cdjohnson/test:test
skopeo inspect docker://docker.io/cdjohnson/test:test | jq .Digest
"sha256:8f69f5bcb72ac644ec82a5dd01f56620226e9dc5b0dcf3647a5a3b7b1993f002"

Describe the results you received:
The pushed digest differs when blob information for the image is already present in the blob info cache.

Describe the results you expected:
The digest should be the same regardless of its cache state.

Additional information you deem important (e.g. issue happens only occasionally):
Deleting the cache prior to any podman push resolves the problem.

Output of podman version:

Version:            1.9.1
RemoteAPI Version:  1
Go Version:         go1.13.9
Git Commit:         b5af022859f680013083f184628dff184dc86c7a
Built:              Wed Apr 29 12:44:40 2020
OS/Arch:            linux/amd64

Output of podman info --debug:

debug:
  compiler: gc
  gitCommit: b5af022859f680013083f184628dff184dc86c7a
  goVersion: go1.13.9
  podmanVersion: 1.9.1
host:
  arch: amd64
  buildahVersion: 1.14.8
  cgroupVersion: v1
  conmon:
    package: 'conmon: /usr/libexec/podman/conmon'
    path: /usr/libexec/podman/conmon
    version: 'conmon version 2.0.15, commit: '
  cpus: 2
  distribution:
    distribution: ubuntu
    version: "18.04"
  eventLogger: file
  hostname: cdjohnson-ubuntu-olmdev1.fyre.ibm.com
  idMappings:
    gidmap: null
    uidmap: null
  kernel: 4.15.0-76-generic
  memFree: 819040256
  memTotal: 2089746432
  ociRuntime:
    name: runc
    package: 'containerd.io: /usr/bin/runc'
    path: /usr/bin/runc
    version: |-
      runc version 1.0.0-rc10
      commit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
      spec: 1.0.1-dev
  os: linux
  rootless: false
  slirp4netns:
    executable: ""
    package: ""
    version: ""
  swapFree: 15991529472
  swapTotal: 15997071360
  uptime: 985h 37m 48.76s (Approximately 41.04 days)
registries:
  search:
  - docker.io
  - quay.io
store:
  configFile: /etc/containers/storage.conf
  containerStore:
    number: 0
    paused: 0
    running: 0
    stopped: 0
  graphDriverName: overlay
  graphOptions: {}
  graphRoot: /var/lib/containers/storage
  graphStatus:
    Backing Filesystem: xfs
    Native Overlay Diff: "true"
    Supports d_type: "true"
    Using metacopy: "false"
  imageStore:
    number: 28
  runRoot: /var/run/containers/storage
  volumePath: /var/lib/containers/storage/volumes

Package info (e.g. output of rpm -q podman or apt list podman):

podman/unknown 1.9.1~1 amd64 [upgradable from: 1.9.0~2]

Additional environment details (AWS, VirtualBox, physical, etc.):

This appears to be similar to the workaround for: containers/image#733

@github-actions

github-actions bot commented Jul 5, 2020

A friendly reminder that this issue had no activity for 30 days.

@cdjohnson
Author

bump

@rhatdan
Member

rhatdan commented Jul 6, 2020

@mtrmac PTAL

@mtrmac
Collaborator

mtrmac commented Jul 7, 2020

Thanks for your report. Yes, that is almost certainly containers/image#733. You can verify that by using skopeo inspect --raw on the destination and comparing the manifests.
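For example, something along these lines (a sketch reusing the tag from this report; the diff should show exactly which fields, such as a layer mediaType or the compressed layer digests, changed between the two pushes):

skopeo inspect --raw docker://docker.io/cdjohnson/test:test > manifest-1.json
podman push docker.io/cdjohnson/test:test
skopeo inspect --raw docker://docker.io/cdjohnson/test:test > manifest-2.json
diff manifest-1.json manifest-2.json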

@jdockter

jdockter commented Aug 3, 2020

Also seeing this behavior in podman 2.0.3

$ podman version
Version:      2.0.3
API Version:  1
Go Version:   go1.14.4
Built:        Wed Dec 31 16:00:00 1969
OS/Arch:      linux/amd64

$ podman build -t docker.io/jdockter/test:test -f test.Dockerfile
$ podman push docker.io/jdockter/test:test
$ skopeo inspect docker://docker.io/jdockter/test:test | jq .Digest
"sha256:8f143f04a126bfd6cdbe25cee9be9537fb674bc84fa17a2e2bc9c5d3344dae2e"

$ podman push docker.io/jdockter/test:test
$ skopeo inspect docker://docker.io/jdockter/test:test | jq .Digest
"sha256:5b8d98e5edc321c2e3f3f4d1c91a8e7060cc3bb77015427410c74ef5f559743d"

@rocketraman

I have this issue (I think) as well. I'm building with podman 2.0.3 using BUILDAH_FORMAT=docker and I see that all or most of the built images have a layer which incorrectly reports the mediatype as application/vnd.docker.image.rootfs.diff.tar.gzip, when in fact the correct media type is application/vnd.docker.image.rootfs.diff.tar.

Furthermore, when doing:

podman push gcr.io/image-with-incorrect-layer-mediatype:foo gcr.io/image-with-incorrect-layer-mediatype:bar

podman apparently "corrects" the layer mediatype and the server ends up with the tags on different images. I can see using skopeo inspect that the only difference between these images is the mediatype value as mentioned above: the bar tagged image has the correct (non gzip) mediaType, and everything else is the same (including the layer hash) as foo.
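The comparison can be reproduced with something like this (a sketch; the image names are the placeholders from above, and .layers[].mediaType assumes a Docker schema2 manifest):

skopeo inspect --raw docker://gcr.io/image-with-incorrect-layer-mediatype:foo | jq '.layers[].mediaType'
skopeo inspect --raw docker://gcr.io/image-with-incorrect-layer-mediatype:bar | jq '.layers[].mediaType'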

@rhatdan
Member

rhatdan commented Sep 11, 2020

@mtrmac Seems we are not making progress on this?

@rbo

rbo commented Oct 30, 2020

I'm seeing the same behaviour with podman 2.1.1. Here are my debugging details: https://gist.github.com/rbo/2bcae948fe5e278fc68d12c365d20af1 (I don't want to make too much noise here.)

It looks to me like it changes the manifest version: the first push yields "schemaVersion": 2, the second push "schemaVersion": 1. Registry: quay.io.
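For anyone who wants to check their own pushes, a sketch of the comparison, with <namespace>/<image>:<tag> as a placeholder; run it after each push and compare the values:

skopeo inspect --raw docker://quay.io/<namespace>/<image>:<tag> | jq .schemaVersion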

@rhatdan
Member

rhatdan commented Dec 24, 2020

@vrothberg This is blocked on a PR that you opened many months ago. Any progress?

@rhatdan rhatdan assigned vrothberg and unassigned mtrmac Dec 24, 2020
@vrothberg vrothberg removed their assignment Jan 4, 2021
@vrothberg
Member

> @vrothberg This is blocked on a PR that you opened many months ago. Any progress?

I guess you refer to containers/image#733? That's just an issue. I haven't been working on this but I believe @nalind is looking into this.

@rhatdan
Member

rhatdan commented Feb 3, 2021

@nalind Any thoughts on this?

@vrothberg
Member

This is fixed now.

@cdjohnson
Author

@vrothberg I can't seem to follow the breadcrumbs. Which version of podman includes the fix?

@vrothberg
Member

Apologies. Podman v3.0 should include the fix. At least, I couldn't reproduce locally and @nalind fixed a bug in the blob-info cache that went into v3.0.

@david-caro

I know this might be a long shot, but I've seen this issue with 3.0.1: I build an image locally with one hash, push it, and it gets a different hash in the repo, ending up with multiple digests:

root@node1:~# podman inspect f954c4ac912a
[
    {
        "Id": "f954c4ac912afb422fbc54ccd86229ce1e4d1685cdd28f8aba3c4edb93297046",
        "Digest": "sha256:144bc25b33c0e004cfa173cadb2964c5fc8d8cbb8422e6ff3bb4af02f0eb7747",
        "RepoTags": [
            "192.168.1.12:5000/ceph/ceph:latest",
            "localhost/v16.2.1_aarch64_rpi4:latest"
        ],
        "RepoDigests": [
            "192.168.1.12:5000/ceph/ceph@sha256:144bc25b33c0e004cfa173cadb2964c5fc8d8cbb8422e6ff3bb4af02f0eb7747",
            "192.168.1.12:5000/ceph/ceph@sha256:9bfe7a6455ddf93935aaa6fe9f521b8ccf40189a3c1447305884a7d647167e64",
            "localhost/v16.2.1_aarch64_rpi4@sha256:144bc25b33c0e004cfa173cadb2964c5fc8d8cbb8422e6ff3bb4af02f0eb7747",
            "localhost/v16.2.1_aarch64_rpi4@sha256:9bfe7a6455ddf93935aaa6fe9f521b8ccf40189a3c1447305884a7d647167e64"
        ],
    ...

My workaround so far is to delete the local image and pull it again; that pulls only one hash.
I'm not very familiar with podman, so if you need any more info let me know.
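In concrete commands, the workaround looks like this (a sketch using the image ID and tag from the inspect output above; a plain-HTTP local registry may also need --tls-verify=false on the pull):

podman rmi f954c4ac912a
podman pull 192.168.1.12:5000/ceph/ceph:latest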
Thanks!

@nalind
Member

nalind commented May 5, 2021

That's what I'd expect if we pulled an image using a named reference that corresponded to a manifest list -- in those cases we save both the manifest list and the manifest of the arch-specific image in the record for the local copy of the image. If that image is then tagged with another name, that name gets attached to the same record, which still has two manifests in it. Those manifests will have different digests because their contents are different.
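A quick way to tell whether a name was pulled from a manifest list, as a sketch (the registry and tag are taken from the example above, and a plain-HTTP local registry may need --tls-verify=false):

skopeo inspect --tls-verify=false --raw docker://192.168.1.12:5000/ceph/ceph:latest | jq -r .mediaType

A result of application/vnd.docker.distribution.manifest.list.v2+json means the tag points at a manifest list rather than a single arch-specific manifest.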

I guess it could also happen if we pushed that same image back to the registry without help from the blob info cache (which aims to let us reuse blobs that we know are already present in the registry), and we had to compress the image's layers again, and that yielded different results due to differences in the versions of the compression libraries used to compress them. That would produce a new manifest that described the same arch-specific image's configuration and layer blobs, and if you then re-pulled the image, you'd get another manifest added to the image's record.

If neither of those describes how you got there, please describe which commands you used, so that I can try to reproduce it here and figure out what's going on.

Thanks!

@github-actions github-actions bot added the locked - please file new issue/PR Assist humans wanting to comment on an old issue or PR with locked comments. label Sep 22, 2023
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Sep 22, 2023