core: endless "cannot reuse client connection" warnings in log #22951

Closed
solongordon opened this issue Feb 22, 2018 · 2 comments · Fixed by #23298
@solongordon (Contributor)

I was running a local single-node cluster and at a certain point it started spitting out this warning once per second and never stopped:

W180222 18:28:19.495889 166 vendor/google.golang.org/grpc/clientconn.go:1158  grpc: addrConn.createTransport failed to connect to {Solons-MBP:26257 0  <nil>}. Err :connection error: desc = "transport: Error while dialing cannot reuse client connection".

As far as I can tell this didn't affect the functionality of the server. I could still run queries against it with no issues. I'll update with reproduction steps if I can figure out how to make this happen consistently.

Seems related to #22658 (cc @bdarnell).
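
For anyone who wants to watch this from the outside, here is a minimal Go sketch (not from the report; the target address and the use of the raw gRPC client API are my own assumptions) that dials a gRPC target and logs connectivity state transitions. A connection whose address no longer resolves keeps cycling between CONNECTING and TRANSIENT_FAILURE, which is what drives the once-per-second warning above.

```go
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
)

func main() {
	// Hypothetical target; in the report it was the laptop's hostname, Solons-MBP:26257.
	const target = "localhost:26257"

	ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
	defer cancel()

	// grpc.Dial returns immediately; the connection is established lazily in the background.
	conn, err := grpc.Dial(target, grpc.WithInsecure())
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	// Log every connectivity state transition. An address that no longer
	// resolves keeps cycling CONNECTING -> TRANSIENT_FAILURE.
	for {
		state := conn.GetState()
		log.Printf("state: %v", state)
		if !conn.WaitForStateChange(ctx, state) {
			return // context expired
		}
	}
}
```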

@solongordon (Contributor, issue author)

I can consistently reproduce this by starting a node and then turning off my Wi-Fi connection for a few seconds (presumably because my hostname doesn't resolve anymore). The warnings keep coming after I turn Wi-Fi back on.
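
A quick way to check the hostname-resolution theory (my own sketch, not part of the original report) is to poll the machine's own hostname with Go's resolver while toggling Wi-Fi:

```go
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	host, err := os.Hostname()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// Resolve the machine's own hostname once per second; with Wi-Fi off the
	// lookup is expected to fail, matching the "Error while dialing" in the warning.
	for {
		addrs, err := net.LookupHost(host)
		now := time.Now().Format(time.RFC3339)
		if err != nil {
			fmt.Printf("%s lookup %s failed: %v\n", now, host, err)
		} else {
			fmt.Printf("%s %s -> %v\n", now, host, addrs)
		}
		time.Sleep(time.Second)
	}
}
```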

@knz knz added this to the 2.0 milestone Feb 26, 2018
@knz knz added the S-3-ux-surprise Issue leaves users wondering whether CRDB is behaving properly. Likely to hurt reputation/adoption. label Feb 26, 2018
@bdarnell bdarnell self-assigned this Mar 1, 2018
bdarnell added a commit to bdarnell/cockroach that referenced this issue Mar 1, 2018
Local connections can fail if the machine's network configuration
changes (for example, if you turn Wi-Fi off on a laptop). We disabled
heartbeating of local connections in cockroachdb#16526 to work around a UI issue
(in which the UI would fail to recover after a local connection
failed). This relied on gRPC's internal reconnections, which were
disabled in cockroachdb#22658. This commit re-enables local heartbeats so that
our reconnection logic works as usual. (In the meantime, the UI has
become better about error recovery, so cockroachdb#16526 is no longer necessary.)

Fixes cockroachdb#22951

Release note (bug fix): The admin UI no longer hangs after a node's
network configuration has changed.
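
To illustrate the pattern the commit message describes, here is a rough sketch under my own assumptions (it uses the stock gRPC health service as the heartbeat RPC and made-up names; it is not CockroachDB's actual rpc package): a heartbeat loop that pings the connection periodically and redials it when pings fail.

```go
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// heartbeatLoop periodically pings the connection to target and redials it
// when the ping fails. Illustrative only; not CockroachDB's rpc package.
func heartbeatLoop(ctx context.Context, target string) {
	var conn *grpc.ClientConn
	redial := func() {
		if conn != nil {
			conn.Close()
			conn = nil
		}
		c, err := grpc.Dial(target, grpc.WithInsecure())
		if err != nil {
			log.Printf("redial %s: %v", target, err)
			return
		}
		conn = c
	}
	redial()

	ticker := time.NewTicker(time.Second)
	defer ticker.Stop()
	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			if conn == nil {
				redial()
				continue
			}
			// Use the standard gRPC health service as the "heartbeat" RPC.
			pingCtx, cancel := context.WithTimeout(ctx, 500*time.Millisecond)
			_, err := healthpb.NewHealthClient(conn).Check(pingCtx, &healthpb.HealthCheckRequest{})
			cancel()
			if err != nil {
				log.Printf("heartbeat to %s failed: %v; redialing", target, err)
				redial()
			}
		}
	}
}

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()
	heartbeatLoop(ctx, "localhost:26257")
}
```

The point of the fix is that once heartbeats run against local connections again, a failure surfaces in a loop like this and triggers a fresh dial, instead of relying on gRPC's internal reconnection, which cockroachdb#22658 disabled.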
@bdarnell (Contributor) commented Mar 1, 2018

This isn't just log spam: parts of the UI would be broken by this too (at least the debug pages; the main UI seems OK). Fix in #23298.

bdarnell added a commit to bdarnell/cockroach that referenced this issue Mar 5, 2018