Remove ws-scheduler #7430

Merged
merged 1 commit into from Jan 24, 2022

Conversation

@aledbf (Member) commented Jan 4, 2022

Description

Remove custom ws-scheduler component

How to test

  • Workspaces should start without issues (see the verification sketch below)
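A rough way to check this after a deployment, sketched here with an assumed namespace, label selector, and object names (none of these are taken from the PR; adjust for your installation):

```sh
# Verification sketch: workspaces should schedule and run via the default scheduler,
# and no ws-scheduler deployment should remain.
# The "gitpod" namespace and "component=workspace" label are assumptions.
kubectl -n gitpod get pods -l component=workspace -o wide    # workspace pods should reach Running
kubectl -n gitpod get deployment ws-scheduler                # expected to return "NotFound"
```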

Release Notes

Remove ws-scheduler component

@roboquat roboquat added the release-note, do-not-merge/work-in-progress, team: delivery (Issue belongs to the self-hosted team), team: workspace (Issue belongs to the Workspace team), and size/XXL labels on Jan 4, 2022
codecov bot commented Jan 4, 2022

Codecov Report

Merging #7430 (1f516a3) into main (43198ce) will increase coverage by 18.21%.
The diff coverage is 90.00%.

@@             Coverage Diff             @@
##             main    #7430       +/-   ##
===========================================
+ Coverage   11.46%   29.68%   +18.21%     
===========================================
  Files          20      117       +97     
  Lines        1177    19290    +18113     
===========================================
+ Hits          135     5727     +5592     
- Misses       1039    13074    +12035     
- Partials        3      489      +486     
Flag Coverage Δ
components-content-service-app 13.92% <ø> (?)
components-content-service-lib 13.92% <ø> (?)
components-ee-agent-smith-app 40.05% <ø> (?)
components-ee-agent-smith-lib 40.32% <ø> (?)
components-gitpod-cli-app 10.20% <ø> (ø)
components-image-builder-mk3-app 35.26% <ø> (?)
components-installation-telemetry-app ∅ <ø> (?)
components-local-app-app-darwin-amd64 ?
components-local-app-app-darwin-arm64 ?
components-local-app-app-linux-amd64 ?
components-local-app-app-linux-arm64 ?
components-local-app-app-windows-386 ?
components-local-app-app-windows-amd64 ?
components-local-app-app-windows-arm64 ?
components-supervisor-app 35.41% <ø> (?)
components-workspacekit-app 7.19% <ø> (?)
components-ws-daemon-app 21.85% <ø> (?)
components-ws-daemon-lib 21.85% <ø> (?)
components-ws-manager-app 40.74% <100.00%> (?)
components-ws-proxy-app 68.32% <ø> (?)
installer-raw-app 5.76% <0.00%> (?)

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
installer/pkg/components/ws-manager/configmap.go 30.05% <0.00%> (ø)
installer/pkg/components/ws-manager/tlssecret.go 0.00% <ø> (ø)
components/ws-manager/pkg/manager/create.go 83.17% <100.00%> (ø)
components/local-app/pkg/auth/auth.go
components/local-app/pkg/auth/pkce.go
installer/pkg/components/ws-manager/rolebinding.go 0.00% <0.00%> (ø)
components/ws-daemon/pkg/resources/controller.go 31.06% <0.00%> (ø)
components/workspacekit/cmd/nsenter.go 25.00% <0.00%> (ø)
components/supervisor/pkg/supervisor/ssh.go 0.00% <0.00%> (ø)
components/ws-daemon/pkg/content/archive.go 57.95% <0.00%> (ø)
... and 94 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 43198ce...1f516a3.

stale bot commented Jan 15, 2022

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the meta: stale (This issue/PR is stale and will be closed soon) label on Jan 15, 2022
@aledbf aledbf removed the meta: stale (This issue/PR is stale and will be closed soon) label on Jan 15, 2022
@roboquat roboquat added the team: webapp (Issue belongs to the WebApp team) label on Jan 18, 2022
@aledbf aledbf changed the title from "Remove ws-scheduler and custom schedulerName" to "Remove ws-scheduler" on Jan 18, 2022
@aledbf aledbf marked this pull request as ready for review on January 18, 2022 10:02
@aledbf (Member, Author) commented Jan 18, 2022

/werft run

👍 started the job as gitpod-build-aledbf-scheduler.4

@mrsimonemms (Contributor) left a comment:

/lgtm
/approve

This does raise an interesting question with regard to the Installer: unlike Helm, the Installer doesn't remove Kubernetes objects that are no longer used. That means an installation that deploys this change will still have its ws-scheduler resources in the cluster.

@csweichel should we work out a way for a deployment to remove unused resources when running kubectl apply -f gitpod.yaml? (For the record, I don't have any thoughts on how we might achieve that within the existing Installer workflow.)
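Until such a mechanism exists, cleanup would be manual. A minimal sketch, assuming the leftover objects are named ws-scheduler and live in the gitpod namespace (both assumptions; list first, then delete only what actually exists):

```sh
# Find leftover ws-scheduler objects (names and namespace are assumptions).
kubectl -n gitpod get deployment,configmap,serviceaccount | grep ws-scheduler
kubectl get clusterrole,clusterrolebinding | grep ws-scheduler

# Delete whatever the listing shows, for example:
kubectl -n gitpod delete deployment/ws-scheduler configmap/ws-scheduler serviceaccount/ws-scheduler
kubectl delete clusterrolebinding/ws-scheduler clusterrole/ws-scheduler
```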

@roboquat (Contributor):

LGTM label has been added.

Git tree hash: acabba8e62748faa1d2e0b8e65b6f6ca8b4494f1

@iQQBot (Contributor) commented Jan 18, 2022

/approve

@csweichel (Contributor) replied, quoting @mrsimonemms:

> /lgtm
> /approve
>
> This does raise an interesting question with regard to the Installer: unlike Helm, the Installer doesn't remove Kubernetes objects that are no longer used. That means an installation that deploys this change will still have its ws-scheduler resources in the cluster.
>
> @csweichel should we work out a way for a deployment to remove unused resources when running kubectl apply -f gitpod.yaml? (For the record, I don't have any thoughts on how we might achieve that within the existing Installer workflow.)

We'll want to eventually implement installer apply, which would do that based on the ConfigMap of Kubernetes objects that we put in place.
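As an illustration of the general idea only (this uses kubectl's built-in pruning, not the Installer's planned mechanism; the app=gitpod label is an assumption):

```sh
# kubectl's --prune (alpha) deletes previously applied objects that match the label
# selector but are absent from the new manifest. An "installer apply" could achieve
# a similar effect by diffing against the ConfigMap of objects it records.
kubectl apply --prune -l app=gitpod -f gitpod.yaml
```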

@csweichel (Contributor):

/lgtm

@aledbf (Member, Author) commented Jan 18, 2022

/assign @jankeromnes

@geropl (Member) commented Jan 18, 2022

/assign @geropl

@aledbf Awesome that we moved beyond this... crutch. 🎉 (also: farewell 🥲)

Without looking at the changes in detail (looks like it's a plain removal) I have two questions:

  • (more out of curiosity, a link to a discussion/PR/doc is fine): How do we solve the pre-scale-up problem in environments where we do not control the scheduler/cluster configuration? 🤔 (relevant for bigger self-hosted installations)
  • How do we roll this out? Especially for meta clusters, which currently still have ws-scheduler installed and execute prebuilds/image builds?

@aledbf (Member, Author) commented Jan 18, 2022

> How do we solve the pre-scale-up problem in environments where we do not control the scheduler/cluster configuration?

The idea here is to rely on the standard Kubernetes scheduler or use one of the scheduler-plugins. For the majority of scenarios, the standard scheduler is the right choice.
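As a quick sanity check that workspace pods now go through the default scheduler (a sketch; the namespace and label selector are assumptions):

```sh
# Inspect the schedulerName of running workspace pods; with ws-scheduler removed,
# this should show "default-scheduler". Namespace and label are assumptions.
kubectl -n gitpod get pods -l component=workspace \
  -o custom-columns=NAME:.metadata.name,SCHEDULER:.spec.schedulerName
```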

> How do we roll this out?

Since last week, the gen27 workspace clusters have been running without ghost and ws-scheduler.
For meta, I suggest we simply remove the schedulerName setting from the ws-manager configmap and execute a kubectl rollout restart as a first step, then run for one or two days to compare the behavior.
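That first step might look roughly like this, assuming the ConfigMap is called ws-manager-config and lives in the gitpod namespace (both assumptions; the actual names may differ per installation):

```sh
# Sketch of the proposed meta rollout step; resource names are assumptions.
# 1. Remove the schedulerName entry from the ws-manager configuration.
kubectl -n gitpod edit configmap ws-manager-config

# 2. Restart ws-manager so the new configuration is picked up, and wait for it.
kubectl -n gitpod rollout restart deployment/ws-manager
kubectl -n gitpod rollout status deployment/ws-manager
```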

@geropl (Member) commented Jan 18, 2022

> The idea here is to rely on the standard Kubernetes scheduler or use one of the scheduler-plugins. For the majority of scenarios, the standard scheduler is the right choice.

Thx, will read up on this. 🙂

> For meta, I suggest we simply remove the schedulerName setting from the ws-manager configmap and execute a kubectl rollout restart.

@aledbf Alright, that'll work. Will add a notice to the next deployment (tomorrow morning).

/lgtm

@geropl (Member) commented Jan 18, 2022

/hold

Holding to ensure we do not merge before tomorrow's webapp deployment. Not strictly necessary, but it keeps things simple in case something else goes haywire. Will remove the hold once the deployment has gone smoothly.

@geropl (Member) commented Jan 18, 2022

/approve no-issue

@csweichel (Contributor):

/lgtm

@roboquat roboquat added the lgtm label Jan 24, 2022
@roboquat (Contributor):

LGTM label has been added.

Git tree hash: 5507bb3a1677dbfacaa56f64e1e0c5485f98301c

@roboquat (Contributor):

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: csweichel, geropl, iQQBot, MrSimonEmms

Associated issue requirement bypassed by: geropl

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@csweichel (Contributor):

webapp deployment happened - let's do this

/hold cancel

@roboquat roboquat merged commit 82d786e into main Jan 24, 2022
@roboquat roboquat deleted the aledbf/scheduler branch January 24, 2022 19:08
@roboquat roboquat added the deployed: workspace (Workspace team change is running in production) label on Jan 25, 2022
Labels
approved, deployed: workspace (Workspace team change is running in production), release-note, size/XXL, team: delivery (Issue belongs to the self-hosted team), team: IDE, team: webapp (Issue belongs to the WebApp team), team: workspace (Issue belongs to the Workspace team)

7 participants