
Debian Bullseye installation issues #1654

Closed
rcdeoliveira opened this issue Oct 16, 2020 · 9 comments
rcdeoliveira commented Oct 16, 2020

If I try to install the latest version on Debian Bullseye, I get:

```
/snap/microk8s/1710/kubectl --kubeconfig=/var/snap/microk8s/1710/credentials/client.config apply -f /var/snap/microk8s/1710/args/cni-network/cni.yaml
The connection to the server 127.0.0.1:16443 was refused - did you specify the right host or port?
```

I tried the solution from #156, but it only works when I use an older release (like 1.12.9).

  • snapd version: 2.45.2-1
  • core version: 16-2.47
  • microk8s version: 1.19/stable
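As a first triage step (not from the thread, just a minimal sketch), the "connection refused" error can be narrowed down by checking whether anything is listening on 16443, MicroK8s' default API server port:

```shell
# Minimal sketch: check whether anything is listening on the MicroK8s
# API server port before blaming the CNI apply step.
ss -tln 2>/dev/null | grep 16443 || echo "nothing listening on 16443"
```

If nothing is listening, the apply step is a symptom and the apiserver service itself is the thing to debug.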
@ktsakalozos
Member

ktsakalozos commented Oct 16, 2020

Why is the apiserver failing? Could you attach the microk8s inspect tarball, or at least the logs of the apiserver (`journalctl -n 3000 -u snap.microk8s.daemon-apiserver`)?

@rcdeoliveira
Author

Attached is the output of `journalctl -n 3000 -u snap.microk8s.daemon-apiserver`:

snap.microk8s.daemon-apiserver.log

@balchua
Collaborator

balchua commented Oct 17, 2020

Looks like it is being shut down gracefully.

```
oct 16 10:33:30 lusitania microk8s.daemon-apiserver[13139]: time="2020-10-16T10:33:30-04:00" level=error msg="failed to list /registry/priorityclasses/ for revision 50"
oct 16 10:33:30 lusitania systemd[1]: snap.microk8s.daemon-apiserver.service: Succeeded.
oct 16 10:33:30 lusitania systemd[1]: Stopped Service for snap application microk8s.daemon-apiserver.
```

Does it have a stable IP? I'm guessing it's the apiserver kicker force-restarting the apiserver.
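For context, the "kicker" referred to above restarts MicroK8s services when it detects configuration changes such as the node's IP changing. A rough sketch of that kind of check (this is not MicroK8s' actual code; the state file path and logic are illustrative assumptions):

```shell
# Hypothetical sketch: remember the node's primary IP and flag a restart
# when it changes between runs. Not the real kicker implementation.
STATE=/tmp/microk8s-last-ip.example
CURRENT_IP=$(hostname -I 2>/dev/null | awk '{print $1}')
CURRENT_IP=${CURRENT_IP:-127.0.0.1}          # fallback for this illustration
LAST_IP=$(cat "$STATE" 2>/dev/null || true)
if [ "$CURRENT_IP" != "$LAST_IP" ]; then
    echo "IP changed (${LAST_IP:-none} -> ${CURRENT_IP}); would restart apiserver"
    echo "$CURRENT_IP" > "$STATE"
else
    echo "IP stable: ${CURRENT_IP}"
fi
```

A node whose address keeps changing would trip a check like this repeatedly, which would explain the repeated graceful shutdowns in the log.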

@rcdeoliveira
Author

> Looks like it is being shut down gracefully.
>
> Does it have a stable IP? I'm guessing it's the apiserver kicker force-restarting the apiserver.

It's an internal IP, @balchua. Also, it works with older versions.

@ktsakalozos
Member

Would you be able to attach the microk8s inspect tarball? We will have to look at more logs.

@rcdeoliveira
Author

> Would you be able to attach the microk8s inspect tarball? We will have to look at more logs.

I couldn't get it after the failure because everything is removed afterwards, so I paused the installation process and grabbed it at that moment.

inspection-report-20201020_205335.tar.gz
inspection-report-20201020_210000.tar.gz

@ktsakalozos
Member

@rcdeoliveira I see you are on btrfs, can you try one of the fixes suggested in #1587 (comment) ?

@rcdeoliveira
Author

rcdeoliveira commented Oct 21, 2020

> @rcdeoliveira I see you are on btrfs, can you try one of the fixes suggested in #1587 (comment)?

@ktsakalozos It was a little tricky, but I finally got it. I couldn't do it with the regular procedure; I had to download and unpack the snap and modify a couple of files before restarting microk8s (otherwise it was blocked by the snap-install status):

```
mkdir /tmp/microk8s-test
cd /tmp/microk8s-test
snap download microk8s
unsquashfs microk8s_1769.snap
echo '--feature-gates=LocalStorageCapacityIsolation=false' >> squashfs-root/default-args/kubelet
echo '--feature-gates=LocalStorageCapacityIsolation=false' >> squashfs-root/microk8s-resources/default-args/kubelet
sudo snap try --classic squashfs-root
```
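Before running `snap try`, it may be worth confirming the flag actually landed in the kubelet args file. A minimal sketch (the path mirrors the unpacked snap layout from the snippet above; the `mkdir` stand-in just makes the sketch self-contained):

```shell
# Sanity check (sketch): confirm the feature-gate line is present in the
# unpacked kubelet args file before running `snap try`.
ARGS_FILE="squashfs-root/default-args/kubelet"   # path from the unpacked snap
mkdir -p "$(dirname "$ARGS_FILE")"               # stand-in dir for this sketch
echo '--feature-gates=LocalStorageCapacityIsolation=false' >> "$ARGS_FILE"
grep -- 'LocalStorageCapacityIsolation=false' "$ARGS_FILE"
```

Note that kubelet args files take one flag per line, so the quoting matters: the whole `--feature-gates=...` string should be a single unquoted-on-disk line.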

Thanks

@stale

stale bot commented Sep 17, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the inactive label Sep 17, 2021
@stale stale bot closed this as completed Oct 17, 2021