"You are running out of disk space" Error. How can I fix this? #2840

Closed
1 of 7 tasks
jessehui opened this issue Mar 4, 2021 · 16 comments
Labels
OS: Ubuntu, question (Further information is requested)

Comments

@jessehui

jessehui commented Mar 4, 2021

Description
Previous executions of this action never gave an error like this.

[Screenshot, 2021-03-04 5:48 PM: "You are running out of disk space" error]

I think the reason is that the virtual environment has changed, but I am not sure. My question is whether there is a fix for this; I don't want to change the yaml file. Thank you. The action can be found here: https://github.com/occlum/occlum/actions/runs/619537350

Virtual environments affected

  • Ubuntu 16.04
  • Ubuntu 18.04
  • Ubuntu 20.04
  • macOS 10.15
  • macOS 11.0
  • Windows Server 2016 R2
  • Windows Server 2019

Image version

Image version where you are experiencing the issue.
Environment: ubuntu-18.04
Version: 20210219.1

@al-cheb al-cheb added the OS: Ubuntu and question (Further information is requested) labels and removed the needs triage label Mar 4, 2021
@al-cheb
Contributor

al-cheb commented Mar 4, 2021

Hello, @jessehui
Based on the documentation, we provide at least 14 GB of free space (https://docs.github.com/en/actions/reference/specifications-for-github-hosted-runners#supported-runners-and-hardware-resources).

As a workaround:

sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf "/usr/local/share/boost"
sudo rm -rf "$AGENT_TOOLSDIRECTORY"

@jessehui
Author

jessehui commented Mar 5, 2021

@al-cheb Thank you for your reply. I'll give it a try. I'm just curious whether this limitation was added recently, because we never encountered this error running the same workflow before.

@arroyc
Contributor

arroyc commented Mar 5, 2021

@al-cheb is there any way you can provide more disk space to the customer? We do multi-stage Docker builds and 14 GB is too small for us. We are using 'ubuntu-latest' on ADO / Azure Pipelines, not GitHub Actions.

Error processing tar file(exit status 1): write /opt/dotnet/3.1.405/packs/Microsoft.NETCore.App.Ref/3.1.0/ref/netcoreapp3.1/System.Resources.ResourceManager.dll: no space left on device

@maxim-lobanov
Contributor

@jessehui, this limitation has always existed, but previously we provided a bit more free space. Sometimes an image can have ~15-20 GB of free space, and this amount can change without notice. We only guarantee that images contain at least 14 GB of free space.

@arroyc, unfortunately, no. Only the workarounds that Aleks has shared above.

@arroyc
Contributor

arroyc commented Mar 5, 2021

@maxim-lobanov thanks for responding ... I just found the following information in one of our builds ... it's not even 14 GB, it's less than 10 GB. Can you please confirm that? I'm assuming we are only using /dev/sda1

2021-03-05T17:45:05.3501527Z Filesystem Size Used Avail Use% Mounted on
2021-03-05T17:45:05.3506978Z /dev/root 84G 34G 50G 40% /
2021-03-05T17:45:05.3508722Z devtmpfs 3.4G 0 3.4G 0% /dev
2021-03-05T17:45:05.3510122Z tmpfs 3.4G 12K 3.4G 1% /dev/shm
2021-03-05T17:45:05.3511897Z tmpfs 696M 1.1M 695M 1% /run
2021-03-05T17:45:05.3513269Z tmpfs 5.0M 0 5.0M 0% /run/lock
2021-03-05T17:45:05.3515129Z tmpfs 3.4G 0 3.4G 0% /sys/fs/cgroup
2021-03-05T17:45:05.3516728Z /dev/loop0 139M 139M 0 100% /snap/chromium/1506
2021-03-05T17:45:05.3517833Z /dev/loop1 138M 138M 0 100% /snap/chromium/1497
2021-03-05T17:45:05.3518877Z /dev/loop2 56M 56M 0 100% /snap/core18/1988
2021-03-05T17:45:05.3520991Z /dev/loop3 163M 163M 0 100% /snap/gnome-3-28-1804/145
2021-03-05T17:45:05.3522240Z /dev/loop4 32M 32M 0 100% /snap/snapd/11036
2021-03-05T17:45:05.3523353Z /dev/sdb15 105M 7.8M 97M 8% /boot/efi
2021-03-05T17:45:05.3524425Z /dev/loop5 70M 70M 0 100% /snap/lxd/19188
2021-03-05T17:45:05.3525988Z /dev/loop6 65M 65M 0 100% /snap/gtk-common-themes/1514
2021-03-05T17:45:05.3526844Z /dev/loop7 33M 33M 0 100% /snap/snapd/11107
2021-03-05T17:45:05.3527413Z /dev/sda1 14G 4.1G 9.0G 32% /mnt
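For what it's worth, in that listing /dev/root is the OS disk mounted at / and /dev/sda1 is the temporary disk mounted at /mnt; which one fills up depends on where your build writes (the work directory and Docker's default data root normally live under /). A minimal sketch of an early sanity check, where the 14 GB threshold is only illustrative and taken from the documented minimum mentioned above:

```sh
# Print free space on the OS disk and the temporary disk
df -h / /mnt

# Fail fast if the root filesystem has less free space than the build needs
required_gb=14   # illustrative threshold, adjust to your build
avail_kb=$(df --output=avail -k / | tail -1 | tr -d ' ')
if [ "$avail_kb" -lt $((required_gb * 1024 * 1024)) ]; then
  echo "Only $((avail_kb / 1024)) MB free on /" >&2
  exit 1
fi
```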

@jessehui
Author

jessehui commented Mar 8, 2021

@al-cheb @maxim-lobanov Thank you for your help. We've managed to work this out. You may close this issue.

@al-cheb al-cheb closed this as completed Mar 9, 2021
behlendorf added a commit to behlendorf/zfs that referenced this issue Apr 1, 2021
Recently we've been running out of free space in the ubuntu 20.04
environment resulting in test failures.  This appears to be caused
by a change in the default available free space and not because of
any change in OpenZFS. Try and avoid this failure by applying a
suggested workaround which removes some unnecessary files.

  actions/runner-images#2840

Signed-off-by: Brian Behlendorf <[email protected]>
behlendorf added a commit to openzfs/zfs that referenced this issue Apr 1, 2021
Recently we've been running out of free space in the ubuntu 20.04
environment resulting in test failures.  This appears to be caused
by a change in the default available free space and not because of
any change in OpenZFS. Try and avoid this failure by applying a
suggested workaround which removes some unnecessary files.

actions/runner-images#2840

Reviewed-by: George Melikov <[email protected]>
Signed-off-by: Brian Behlendorf <[email protected]>
Closes #11826
behlendorf added a commit to openzfs/zfs that referenced this issue Apr 7, 2021
adamdmoss pushed a commit to adamdmoss/zfs that referenced this issue Apr 10, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 6, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 6, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 6, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 7, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 10, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 10, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 10, 2021
ghost pushed a commit to truenas/zfs that referenced this issue May 13, 2021
behlendorf added a commit to openzfs/zfs that referenced this issue May 20, 2021
sayboras added a commit to cilium/cilium that referenced this issue Aug 15, 2024
[ upstream commit e553bd2 ]

We are having the below failure due to no disk space. It seems like we
can remove pre-installed software and language runtimes, which are not
used in Cilium, to reclaim more disk space.

An alternative option is to bump the runner, but it might not be the best
use of resources and cost.

Relates: https://github.com/cilium/cilium/actions/runs/10300396788
Relates: actions/runner-images#2840 (comment)
Signed-off-by: Tam Mach <[email protected]>
Signed-off-by: gray <[email protected]>
youngnick pushed a commit to cilium/cilium that referenced this issue Aug 16, 2024
wangyinz added a commit to mspass-team/mspass that referenced this issue Aug 17, 2024
XrXr added a commit to XrXr/ruby that referenced this issue Aug 30, 2024
Lately we've seen frequent failures on macOS GitHub Action runs due to
disk space issues. Poking with du(1) revealed that
/Library/Developer/CoreSimulator/Caches/dyld was growing to be multiple
gigabytes.

Deleting unused stuff is a known workaround to space issues.

actions/runner-images#2840 (comment)
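For anyone hitting the same growth on macOS runners, a minimal sketch of that cleanup, using the cache path identified in the commit message above (only do this if your job does not rely on the simulator caches):

```sh
# Remove the CoreSimulator dyld cache that was observed to grow to multiple gigabytes
sudo rm -rf /Library/Developer/CoreSimulator/Caches/dyld
# Show remaining free space
df -h /
```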
XrXr added a commit to ruby/ruby that referenced this issue Aug 30, 2024
@BrentMifsud

BrentMifsud commented Sep 3, 2024

so it seems the suggestion above was not clearing up enough space for me.

What I ended up doing was just deleting all the unused versions of Xcode from my Mac runner as one of the initial steps:

  - name: Delete Unused Xcode Versions
    shell: bash
    run: |
      # when changing xcode version below, be sure to update this script!!!
      echo "Deleting unused Xcode Versions..."
      cd /Applications/
      mv Xcode_15.4.app Ycode_15.4.app
      sudo rm -rf Xcode*
      mv Ycode_15.4.app Xcode_15.4.app

I'm no bash expert, so I'm sure there's a more elegant way of doing this. Still have no idea why this randomly started on Friday for us, though; our builds were working fine.

@BrentMifsud

BrentMifsud commented Sep 3, 2024

sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf "/usr/local/share/boost"
sudo rm -rf "$AGENT_TOOLSDIRECTORY"

For the benefit of those who come across this several years later, as I did, here's a more up-to-date list of big things that you could remove at the start of a build:

# Remove software and language runtimes we're not using
sudo rm -rf \
  "$AGENT_TOOLSDIRECTORY" \
  /opt/google/chrome \
  /opt/microsoft/msedge \
  /opt/microsoft/powershell \
  /opt/pipx \
  /usr/lib/mono \
  /usr/local/julia* \
  /usr/local/lib/android \
  /usr/local/lib/node_modules \
  /usr/local/share/chromium \
  /usr/local/share/powershell \
  /usr/share/dotnet \
  /usr/share/swift
df -h /

If you are using a Mac runner, one thing you might want to add to this list is all the versions of Xcode you are not using.

We only needed Xcode 15.4, but these runners have half a dozen versions of Xcode installed. That's a few GB of storage space for each.

@Hoverbear

@BrentMifsud thanks for the breadcrumb here.

You can remove all but the latest Xcode with:

find /Applications/ -name "Xcode*" | sort -r | tail --lines=+2 | xargs rm -rf

@BrentMifsud

@BrentMifsud thanks for the breadcrumb here.

You can remove all but the latest Xcode with:

find /Applications/ -name "Xcode*" | sort -r | tail --lines=+2 | xargs rm -rf

Thanks for this. Definitely much better than my solution

@Aaron-Ritter

Aaron-Ritter commented Sep 7, 2024

find /Applications/ -name "Xcode*" | sort -r | tail --lines=+2 | xargs rm -rf

great hint, thanks!

I implemented it slightly differently: you only have to remove the top-level version folders, not each file individually:

  • this way the find is much quicker
  • I can choose what I want to keep
  • I can see what's left/deleted

In addition, I coupled it with a matrix.xcode variable to initialize Xcode at a specific version, so whatever I initialize is what I want to keep, and if I like I can run it against different versions of Xcode.

      - name: Initialize latest xcode
        uses: maxim-lobanov/[email protected]
        with:
          xcode-version: ${{ matrix.xcode }}

      - name: Remove old xcode versions
        run: |
          echo "Searching for Xcode versions:"
          find /Applications -name "Xcode_*" -maxdepth 1 -mindepth 1
          echo "Removing old Xcode versions..."
          find /Applications -name "Xcode_*" -maxdepth 1 -mindepth 1 | grep -v ${{ matrix.xcode }} | xargs rm -rf
          echo "Available Xcode versions after removal:"
          find /Applications -name "Xcode_*" -maxdepth 1 -mindepth 1

martinszuc added a commit to martinszuc/intellij-openshift-connector that referenced this issue Oct 1, 2024
martinszuc added a commit to martinszuc/intellij-openshift-connector that referenced this issue Oct 2, 2024
essobedo added a commit to apache/camel-karaf that referenced this issue Oct 23, 2024
## Motivation

The build runs out of disk space which prevents adding new integration tests

## Modifications:

* Free up some disk space by:
  * following the workaround described [in this ticket](actions/runner-images#2840 (comment))
  * removing pre-pulled docker images
* Add a new System property `camel.karaf.itest.keep.docker.images` to indicate whether the docker images should be removed after the test. Locally, you can leverage the Maven property `keep.docker.images.on.exit` to keep them or not. By default, the docker images are cleaned up to prevent the disk space leak
* Extend the forked process exit timeout to ensure that pax-exam can stop properly
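The modifications above also mention removing pre-pulled Docker images; a minimal sketch of that cleanup using standard Docker CLI commands (only safe if the build does not rely on the pre-pulled images or any existing containers):

```sh
# Remove all local images not used by an existing container, plus the build cache
docker image prune --all --force
docker builder prune --all --force
# Show remaining free space
df -h /
```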
dparnell added a commit to dparnell/intellij-wgsl that referenced this issue Oct 25, 2024
singiamtel added a commit to alisw/docks that referenced this issue Nov 13, 2024
Some images (like slc9-gpu-builder) are too big for the free runner, so
we need to remove some stuff beforehand, mostly unused language packages.

Reference for the free disk space:
actions/runner-images#2840

Builder error message:
```
==> docker:    At least 11799MB more space needed on the / filesystem.
```
singiamtel added a commit to alisw/docks that referenced this issue Nov 13, 2024
singiamtel added a commit to alisw/docks that referenced this issue Nov 13, 2024
ktf pushed a commit to alisw/docks that referenced this issue Nov 13, 2024