Connection errors with v2 #71

Closed · dortort opened this issue Apr 28, 2020 · 19 comments · Fixed by #82

Labels: bug (Something isn't working), priority very important, v2 (Related to the v2 version of the action)

@dortort commented Apr 28, 2020

[screenshot attached showing the connection errors]

@konradpabjan (Collaborator)

Hmm... I stumbled on this issue early on during the v2-preview, and it seemed to be fixed, since no one reported it after some major changes. However, it looks like it can still happen. If anyone is hitting this and it's a consistent problem, I would suggest pinning to actions/upload-artifact@v1.

A large chunk of users are pinned to master and as a result have automatically switched over to v2. I warned about this a long time ago: #41
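
For anyone unsure what pinning looks like, a minimal sketch of the step (the artifact name and path are placeholders):

  - name: Upload build output
    uses: actions/upload-artifact@v1   # pinned tag; master currently resolves to v2
    with:
      name: my-artifact                # placeholder
      path: dist/                      # placeholder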

konradpabjan self-assigned this Apr 29, 2020
konradpabjan added the bug (Something isn't working) label Apr 29, 2020
@zhangyoufu

This issue affects not only large files; I had a 2 KB file fail to upload.

Log:
2020-04-29T09:51:30.3872622Z ##[group]Run actions/upload-artifact@v2
2020-04-29T09:51:30.3873321Z with:
2020-04-29T09:51:30.3873766Z   name: repo-v3.10-x86_64
2020-04-29T09:51:30.3874031Z   path: repo/
2020-04-29T09:51:30.3874294Z env:
2020-04-29T09:51:30.3874559Z   PACKAGER: <redacted>
2020-04-29T09:51:30.3874872Z   PRIVATE_KEY_FILENAME: <redacted>
2020-04-29T09:51:30.3875153Z   ALPINE_ARCH: x86_64
2020-04-29T09:51:30.3875420Z ##[endgroup]
2020-04-29T09:51:30.4564412Z With the provided path, there will be 10 files uploaded
2020-04-29T09:51:40.5163683Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:51:50.5176989Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:00.5186333Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:01.9944092Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:52:01.9966055Z Error: connect ETIMEDOUT 13.107.42.16:443
2020-04-29T09:52:01.9967525Z     at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1129:14) {
2020-04-29T09:52:01.9968149Z   errno: 'ETIMEDOUT',
2020-04-29T09:52:01.9968726Z   code: 'ETIMEDOUT',
2020-04-29T09:52:01.9969700Z   syscall: 'connect',
2020-04-29T09:52:01.9970225Z   address: '13.107.42.16',
2020-04-29T09:52:01.9970547Z   port: 443
2020-04-29T09:52:01.9970856Z }
2020-04-29T09:52:01.9971292Z Exponential backoff for retry #1. Waiting for 5296.003694819559 milliseconds before continuing the upload at offset 0
2020-04-29T09:52:07.2937924Z Finished backoff for retry #1, continuing with upload
2020-04-29T09:52:10.5197675Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:20.5200369Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:30.5209754Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:40.5221278Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:52:50.5239814Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:00.5257723Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:10.5267189Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:11.1009698Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:53:11.1012367Z Error: read ECONNRESET
2020-04-29T09:53:11.1012882Z     at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
2020-04-29T09:53:11.1013481Z   errno: 'ECONNRESET',
2020-04-29T09:53:11.1014215Z   code: 'ECONNRESET',
2020-04-29T09:53:11.1014905Z   syscall: 'read'
2020-04-29T09:53:11.1015262Z }
2020-04-29T09:53:11.1015663Z Exponential backoff for retry #2. Waiting for 10022.78797842833 milliseconds before continuing the upload at offset 0
2020-04-29T09:53:20.5271434Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:21.1240154Z Finished backoff for retry #2, continuing with upload
2020-04-29T09:53:30.5278457Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:40.5296525Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:53:50.5307434Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:00.5328395Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:10.5336782Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:20.5346772Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:28.2221574Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:54:28.2222322Z Error: read ECONNRESET
2020-04-29T09:54:28.2222742Z     at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
2020-04-29T09:54:28.2224143Z   errno: 'ECONNRESET',
2020-04-29T09:54:28.2225255Z   code: 'ECONNRESET',
2020-04-29T09:54:28.2225732Z   syscall: 'read'
2020-04-29T09:54:28.2226032Z }
2020-04-29T09:54:28.2226445Z Exponential backoff for retry #3. Waiting for 19342.55671356728 milliseconds before continuing the upload at offset 0
2020-04-29T09:54:30.5346210Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:40.5356824Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:54:47.5649679Z Finished backoff for retry #3, continuing with upload
2020-04-29T09:54:50.5361204Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:00.5368024Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:10.5385694Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:20.5392804Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:30.5400266Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:40.5408737Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:50.5414566Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:55:53.6119501Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:55:53.6120328Z Error: read ECONNRESET
2020-04-29T09:55:53.6121103Z     at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
2020-04-29T09:55:53.6121717Z   errno: 'ECONNRESET',
2020-04-29T09:55:53.6122259Z   code: 'ECONNRESET',
2020-04-29T09:55:53.6123214Z   syscall: 'read'
2020-04-29T09:55:53.6123513Z }
2020-04-29T09:55:53.6123868Z Exponential backoff for retry #4. Waiting for 25885.27396668191 milliseconds before continuing the upload at offset 0
2020-04-29T09:56:00.5426199Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:56:11.1837838Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:56:19.4980462Z Finished backoff for retry #4, continuing with upload
2020-04-29T09:56:20.5446425Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:56:30.5455668Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:56:40.5465810Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:56:50.5472290Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:00.5479632Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:10.7009579Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:20.5488767Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:29.1251093Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:57:29.1251908Z Error: read ECONNRESET
2020-04-29T09:57:29.1252367Z     at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
2020-04-29T09:57:29.1253032Z   errno: 'ECONNRESET',
2020-04-29T09:57:29.1253654Z   code: 'ECONNRESET',
2020-04-29T09:57:29.1254348Z   syscall: 'read'
2020-04-29T09:57:29.1254913Z }
2020-04-29T09:57:29.1255280Z Exponential backoff for retry #5. Waiting for 29909.119046919102 milliseconds before continuing the upload at offset 0
2020-04-29T09:57:30.5503810Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:40.5511448Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:50.5522066Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:57:59.0341599Z Finished backoff for retry #5, continuing with upload
2020-04-29T09:58:00.5531583Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:58:10.5538163Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:58:20.5546900Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:58:30.5545867Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:58:40.5552348Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:58:50.5560924Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:59:00.5568820Z Total file count: 10 ---- Processed file #9 (90.0%)
2020-04-29T09:59:04.5865660Z An error has been caught http-client index 0, retrying the upload
2020-04-29T09:59:04.5867138Z Error: read ECONNRESET
2020-04-29T09:59:04.5867551Z     at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
2020-04-29T09:59:04.5868193Z   errno: 'ECONNRESET',
2020-04-29T09:59:04.5868739Z   code: 'ECONNRESET',
2020-04-29T09:59:04.5869263Z   syscall: 'read'
2020-04-29T09:59:04.5869991Z }
2020-04-29T09:59:04.5871532Z Retry limit has been reached for chunk at offset 0 to https://pipelines.actions.githubusercontent.com/0t0JAJWM9suQiNosbQh5B1xDbgmZcCzCAi4NasTzit1V1ninOo/_apis/resources/Containers/1935682?itemPath=repo-v3.10-x86_64%2Fv3.10%2Fmain%2Fx86_64%2FAPKINDEX.tar.gz
2020-04-29T09:59:04.5874495Z ##[warning]Aborting upload for /home/runner/work/alpine/alpine/repo/v3.10/main/x86_64/APKINDEX.tar.gz due to failure
2020-04-29T09:59:04.5878594Z Total size of all the files uploaded is 631439 bytes
2020-04-29T09:59:04.6148983Z Finished uploading artifact repo-v3.10-x86_64. Reported size is 631439 bytes. There were 1 items that failed to upload
2020-04-29T09:59:04.6149623Z Artifact upload has finished successfully!

Also, upload-artifact should exit with a non-zero exit code; its current behavior is to ignore the failure and continue.

konradpabjan added the priority very important label Apr 29, 2020
@Bo98 commented Apr 30, 2020

> upload-artifact should exit with a non-zero exit code; its current behavior is to ignore the failure and continue.

That happened in v1 too, but I agree. I always have to manually check the logs to make sure the upload was successful, and it would be great not to have to do that.

@konradpabjan (Collaborator)

For those users who hit this issue, could you please provide a link to the faulty run (if your repo is public, of course)?

I'm basically trying to see whether there is a pattern between when the upload starts and when the first ECONNRESET is hit. There might be some default timeout or something else going on behind the scenes.

I don't think the issue is related to file size (large or small uploads), but rather to how long the upload takes.

@zhangyoufu

@konradpabjan My run log is available at https://github.com/zhangyoufu/alpine/runs/629165891.

Is there any simultaneous-connection limit or QPS limit on the Azure Artifacts side?

@murgatroid99

I just hit this bug. The log is available at https://github.com/grpc/grpc-node/runs/654464810

konradpabjan added the v2 (Related to the v2 version of the action) label May 11, 2020
konradpabjan changed the title from "Connection errors when uploading large files with v2" to "Connection errors with v2" May 11, 2020
@konradpabjan (Collaborator) commented May 11, 2020

I think it might be related to this... actions/cache#298

The retry should work; however, all 4 retries fail with the same error, so I suspect this could be an issue with the read stream.

An ECONNRESET or TIMEOUT can happen from time to time (the internet isn't perfect), but our retries should handle that. Our uploadChunk logic is very similar. Seen here

@rreynier

Just bumped into this myself (in a private repo, though). In addition to this being a really odd error, it would be ideal if this failed the job too. Downloading the artifact does fail, so at least my whole deploy process didn't try to push something out to production!
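
Since downloading a broken artifact does fail (as noted above), one stopgap until the action itself reports upload failures is to verify the artifact in a follow-up job before deploying. A rough sketch, where the job and artifact names are placeholders:

  verify_artifact:
    runs-on: ubuntu-latest
    needs: [build]                     # placeholder: the job that uploads
    steps:
      # per the report above, this step fails if the artifact is missing or broken
      - uses: actions/download-artifact@v2
        with:
          name: my-artifact            # placeholder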

@konradpabjan (Collaborator)

This should 🤞 be fixed with the PR that was just merged. I'll be updating the v2 tag with a new release in a bit.

@agrobbin

@konradpabjan I'm not sure if this is the same issue, but we've been seeing a lot of 500 errors when upload-artifact attempts to upload our artifacts. We had seen the connection reset issue every so often, but recently it switched to a 500.

Unfortunately it's in a private repo, so I don't know how best to share an example. Any help would be great! I can also create a separate issue for this if you don't think it's related. For reference, we are depending on actions/upload-artifact@v2, and are getting this for all 3 files we are attempting to upload:

##### Begin Diagnostic HTTP information #####
Status Code: 500
Status Message: Internal Server Error

@konradpabjan (Collaborator)

Thanks for the notice @agrobbin

This looks to be another problem. Could you open up another issue? We can track it there.

@agrobbin

@konradpabjan done in #84!

@nostadt commented May 21, 2021

I hit it:

An error has been caught http-client index 1, retrying the upload
Error: read ECONNRESET
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
  errno: 'ECONNRESET',
  code: 'ECONNRESET',
  syscall: 'read'
}
Exponential backoff for retry #1. Waiting for 5077.766121471924 milliseconds before continuing the upload at offset 0
Finished backoff for retry #1, continuing with upload
Total file count: 8402 ---- Processed file #5062 (60.2%)
Total file count: 8402 ---- Processed file #5215 (62.0%)
Total file count: 8402 ---- Processed file #5353 (63.7%)
Total file count: 8402 ---- Processed file #5499 (65.4%)
Total file count: 8402 ---- Processed file #5640 (67.1%)
Total file count: 8402 ---- Processed file #5789 (68.9%)
Total file count: 8402 ---- Processed file #5933 (70.6%)
An error has been caught http-client index 0, retrying the upload
Error: read ECONNRESET
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
  errno: 'ECONNRESET',
  code: 'ECONNRESET',
  syscall: 'read'
}
Exponential backoff for retry #1. Waiting for 6224.009953806136 milliseconds before continuing the upload at offset 0
Finished backoff for retry #1, continuing with upload
Total file count: 8402 ---- Processed file #5986 (71.2%)
Total file count: 8402 ---- Processed file #6126 (72.9%)
Total file count: 8402 ---- Processed file #6268 (74.6%)
Total file count: 8402 ---- Processed file #6412 (76.3%)
Total file count: 8402 ---- Processed file #6549 (77.9%)

The repo is private but this is the job:

  create_artifact:
    name: Create Artifact
    runs-on: ubuntu-20.04
    needs: [tests_phpunit]
    steps:
      - uses: actions/checkout@v2

      - name: Switch default PHP Version to 7.4
        run: sudo update-alternatives --set php /usr/bin/php7.4

      - name: composer install
        run: composer install --no-interaction --no-progress

      - name: yarn install
        run: yarn install

      - name: yarn build
        run: yarn build

      - name: Create development artifact
        uses: actions/upload-artifact@v2
        with:
          name: development-code
          path: |
            assets
            bin
            config
            public
            src
            templates
            translations
            vendor
            .env
            .env.test
            composer.json
            composer.lock

During a second run I noticed this one:

An error has been caught http-client index 1, retrying the upload
Error: read ECONNRESET
    at TLSWrap.onStreamRead (internal/stream_base_commons.js:201:27) {
  errno: 'ECONNRESET',
  code: 'ECONNRESET',
  syscall: 'read'
}
Exponential backoff for retry #1. Waiting for 4936.179816716808 milliseconds before continuing the upload at offset 0
An error has been caught http-client index 0, retrying the upload
Error: Client has already been disposed.
    at HttpClient.request (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:5694:19)
    at HttpClient.sendStream (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:5655:21)
    at UploadHttpClient.<anonymous> (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:7104:37)
    at Generator.next (<anonymous>)
    at /home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:6834:71
    at new Promise (<anonymous>)
    at module.exports.608.__awaiter (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:6830:12)
    at uploadChunkRequest (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:7102:46)
    at UploadHttpClient.<anonymous> (/home/runner/work/_actions/actions/upload-artifact/v2/dist/index.js:7139:38)
    at Generator.next (<anonymous>)
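
One thing that stands out in jobs like this is the file count: 8402 files means thousands of chunked HTTP uploads, each one a chance to hit a reset. A possible mitigation, shown here as an untested sketch that reuses the paths from the job above, is to archive everything first so only a single file is uploaded:

      - name: Archive development code
        # untested sketch: one archive instead of 8402 individual files
        run: tar -czf development-code.tar.gz assets bin config public src templates translations vendor .env .env.test composer.json composer.lock

      - name: Create development artifact
        uses: actions/upload-artifact@v2
        with:
          name: development-code
          path: development-code.tar.gz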

@nitish166 commented May 23, 2021

Hi @AMartinNo1, please change to uses: actions/upload-artifact@v1. I believe it would work for you as well. I was getting the same issue, but after changing to actions/upload-artifact@v1 it is now working fine for me. Let us know if there are any other issues.

@nostadt commented May 23, 2021

> Hi @AMartinNo1, please change to uses: actions/upload-artifact@v1. I believe it would work for you as well. I was getting the same issue, but after changing to actions/upload-artifact@v1 it is now working fine for me.

I will keep that in mind. For now, I have stopped using the artifact.

@Monte-Christo

Just ran into this. Here are my logs:

logs_35.zip

@woter1832 commented Nov 13, 2021

This error doesn't appear to have been fixed; it eventually works after about 10 retries.
This is on a self-hosted runner, version 2.284.0.

Task:

- name: 1.5 - Upload artifact for deployment job
  uses: actions/upload-artifact@v2
  with:
    name: .net-app
    path: ${{env.DOTNET_ROOT}}/${{env.APP_NAME}}

Error:

Exponential backoff for retry #1. Waiting for 5745.648822939182 milliseconds before continuing the upload at offset 0
Total file count: 1952 ---- Processed file #1749 (89.6%)
Finished backoff for retry #1, continuing with upload
Total file count: 1952 ---- Processed file #1796 (92.0%)
Total file count: 1952 ---- Processed file #1823 (93.3%)
An error has been caught http-client index 0, retrying the upload
Error: connect ECONNREFUSED 13.107.42.16:443
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1129:14) {
  errno: 'ECONNREFUSED',
  code: 'ECONNREFUSED',
  syscall: 'connect',
  address: '13.107.42.16',
  port: 443
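
An ECONNREFUSED on a self-hosted runner can also point at local network policy (a firewall or proxy) rather than at the service. If the runner has to go through a proxy, the actions HTTP client honors the standard proxy variables; a speculative sketch with placeholder values:

env:
  https_proxy: http://proxy.example.internal:3128   # placeholder URL; assumes a proxy is in play
  no_proxy: localhost,127.0.0.1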

@citkane commented Jan 19, 2023

EDIT: It is GitHub itself experiencing an issue. Apologies for raising any alarms...

I have just bumped into this, so while the retry count ticks up to 5 and the minutes tick past 15, I thought I would post it here for posterity.

This is occurring after a successful documentation build to my gh-pages branch, when the automatic github-pages "pages build and deployment" action runs:

https://github.com/citkane/typedoc-theme-yaf/actions/runs/3960780756/jobs/6785378185
