Scale down not getting all runners for org #1151
30 results is standard behaviour according to the pagination docs.

```typescript
async function run(): Promise<void> {
  // console.log(await getRunners('org-name'));
  await getRunners('org-name').then((runners) => console.log(runners));
}
```
wait... I just saw your open PR #1164. Is your question here still relevant? :-)
The 30 results per page is expected if just calling the endpoint without pagination. I would have expected this call to also be properly awaited, but it didn't appear to be in my local testing. The PR is for a different issue, where subsequent Lambda runs can retain data in memory; because of that, the class still holds data from previous runs.
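For reference, the difference between a single call (capped at 30 items per page by default) and full pagination can be sketched with a generic helper. `fetchPage` here is a hypothetical stand-in for one REST call, not anything from this repo:

```typescript
// Hypothetical page fetcher: returns at most `perPage` items for a 1-based page.
type FetchPage<T> = (page: number, perPage: number) => Promise<T[]>;

// Keep requesting pages until one comes back short, then return everything.
async function paginateAll<T>(fetchPage: FetchPage<T>, perPage = 30): Promise<T[]> {
  const all: T[] = [];
  for (let page = 1; ; page++) {
    const batch = await fetchPage(page, perPage);
    all.push(...batch);
    if (batch.length < perPage) break; // last (possibly empty) page reached
  }
  return all;
}
```

With, say, 73 registered runners and the default `per_page` of 30, a single call yields 30 items while `paginateAll` yields all 73.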
@mcaulifn So scale down is not working with 30+ runners? We haven't seen an issue over the last month with a larger set of runners.
I am not seeing this consistently. I'll close this for now and re-open if we see it again. |
We're seeing a similar issue. We run 500+ runners routinely, so if the pagination were broken it would cause chaos obvious enough to spot; for that reason I'm fairly confident the pagination code works just fine. However, it seems we're still hitting situations where runners are missing from the data GitHub provides. Since it was inconsistent for @mcaulifn too, I wonder if there's a chance this is actually an API bug on GitHub's side? We have cases where we can see a job clearly mid-execution on a runner while the scale-down logs show a hard termination (i.e. terminating without bothering to de-register first). In that unhappy path of termination without de-registering, I wonder if we should re-query GitHub to double-check the runner really isn't there?
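The re-query idea above could be sketched as follows. `listRunners` is a hypothetical function that fetches the org's currently registered runners fresh from the API each time it is called:

```typescript
interface Runner {
  name: string;
}

// Before hard-terminating an instance whose runner appears orphaned,
// re-fetch the registered runners and confirm it is really absent.
async function confirmOrphaned(
  listRunners: () => Promise<Runner[]>,
  runnerName: string,
): Promise<boolean> {
  const fresh = await listRunners();
  return !fresh.some((r) => r.name === runnerName);
}
```

Scale down would then only proceed with the hard termination when `confirmOrphaned` resolves to true, at the cost of one extra API call per suspect instance.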
Should this be re-opened? We hit this issue often, and we regularly run hundreds of runners.
We're seeing this issue fairly frequently as well.
Can we get this issue re-opened? We are seeing it very consistently. |
The pagination here is not working correctly when getting all runners for an org. If there are more than 30 runners, the full list is not fetched. I confirmed that this works with this code:
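The snippet referenced here did not survive extraction. A hedged sketch of fetching the complete list with Octokit's `paginate` helper, which follows `Link` headers and concatenates every page, might look like this (the `octokit` instance and function name are assumptions, not code from this repo):

```typescript
// Assumes an authenticated Octokit instance (e.g. from @octokit/rest).
// `octokit.paginate` walks every page, so more than 30 runners come back
// in a single array rather than being capped at the default page size.
async function listAllOrgRunners(
  octokit: { paginate: (route: string, params: object) => Promise<unknown[]> },
  org: string,
): Promise<unknown[]> {
  return octokit.paginate('GET /orgs/{org}/actions/runners', { org, per_page: 100 });
}
```

The narrow structural type on the `octokit` parameter is only there to keep the sketch self-contained; in real code you would pass the full Octokit client.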