Manager panics when executed without a valid kube context #2054

Closed
mdbooth opened this issue May 3, 2024 · 0 comments · Fixed by #2057
Labels
kind/bug Categorizes issue or PR as related to a bug.

Comments

mdbooth (Contributor) commented May 3, 2024

/kind bug

podman run --rm -ti gcr.io/k8s-staging-capi-openstack/capi-openstack-controller:v0.10.1 -- --version    
E0503 16:03:01.692120       1 config.go:133] "unable to load in-cluster config" err="unable to load in-cluster configuration, KUBERNETES_SERVICE_HOST and KUBERNETES_SERVICE_PORT must be defined" logger="controller-runtime.client.config"
E0503 16:03:01.693959       1 main.go:235] "unable to get kubeconfig" err="invalid configuration: no configuration has been provided, try setting KUBERNETES_MASTER environment variable" logger="setup"
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x1d0 pc=0x1ba6cc8]

goroutine 1 [running]:
main.main()
        /workspace/main.go:237 +0x3c8

Looks like we're missing an exit here:

cfg, err := config.GetConfigWithContext(os.Getenv("KUBECONTEXT"))
if err != nil {
	setupLog.Error(err, "unable to get kubeconfig")
}
cfg.QPS = restConfigQPS
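
A minimal sketch of what the missing exit might look like (assuming the intent is simply to abort when no kubeconfig can be loaded; os.Exit(1) is one option, and the actual fix in #2057 may differ):

cfg, err := config.GetConfigWithContext(os.Getenv("KUBECONTEXT"))
if err != nil {
	setupLog.Error(err, "unable to get kubeconfig")
	os.Exit(1) // bail out here; otherwise cfg is nil and the next line panics
}
cfg.QPS = restConfigQPS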

Note that it didn't crash in v0.9.0, but I suspect that was due to luck. I'm guessing this broke when we bumped controller-runtime.
