etcd3 seems to have a grpc-gateway issue when the gateway is connecting to etcd via https (--listen-client-urls https://172.20.0.2:2379,https://127.0.0.1:2379) #17969

Closed
ntap-jbo opened this issue May 8, 2024 · 0 comments

ntap-jbo commented May 8, 2024

Bug report criteria

What happened?

I get the following error when the grpc-gateway connects to etcd on https://127.0.0.1:2379 (full logs for 3.4.32 and 3.5.13 are in the "Relevant log output" section below). The key lines are:

May 08 10:30:19 etcd1 etcd[602]: {"level":"warn","ts":"2024-05-08T10:30:19.415787Z","caller":"embed/config_logging.go:287","msg":"rejected connection","remote-addr":"127.0.0.1:40384","server-name":"","error":"tls: failed to verify certificate: x509: certificate specifies an incompatible key usage"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"warn","ts":"2024-05-08T10:30:19.416078Z","caller":"grpclog/grpclog.go:46","msg":"[core][Channel #4 SubChannel #6] grpc: addrConn.createTransport failed to connect to {Addr: \"127.0.0.1:2379\", ServerName: \"127.0.0.1:2379\", }. Err: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}

What did you expect to happen?

That the grpc-gateway can connect to etcd via https.

How can we reproduce it (as minimally and precisely as possible)?

I've created the certificates following this guide:
https://medium.com/nirman-tech-blog/setting-up-etcd-cluster-with-tls-authentication-enabled-49c44e4151bb
My instances are called etcd1 etc. instead of member-1 etc.
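
The "incompatible key usage" error in the logs below suggests looking at the extended key usages the generated certificates actually carry. A quick way to inspect them (the paths are the ones from the unit file below; adjust if yours differ):

# Show the X509v3 Extended Key Usage of the serving certificate on 2379
openssl x509 -in /etc/etcd/server.pem -noout -text | grep -A1 "Extended Key Usage"

# Same check for the client certificate used with etcdctl further down
openssl x509 -in /etc/etcd/client.pem -noout -text | grep -A1 "Extended Key Usage"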

My configuration in /etc/systemd/system/etcd.service is:
[Unit]
Description=etcd service
Documentation=https://github.com/coreos/etcd

[Service]
User=root
Type=notify
ExecStart=/usr/bin/etcd \
--name etcd1 \
--data-dir /var/lib/etcd \
--initial-advertise-peer-urls https://172.20.0.2:2380 \
--listen-peer-urls https://172.20.0.2:2380 \
--listen-client-urls https://172.20.0.2:2379,https://127.0.0.1:2379 \
--advertise-client-urls https://172.20.0.2:2379 \
--initial-cluster-token etcd-cluster-1 \
--initial-cluster etcd1=https://172.20.0.2:2380,etcd2=https://172.20.0.3:2380,etcd3=https://172.20.0.4:2380 \
--client-cert-auth --trusted-ca-file=/etc/etcd/ca.pem \
--cert-file=/etc/etcd/server.pem --key-file=/etc/etcd/server-key.pem \
--peer-client-cert-auth --peer-trusted-ca-file=/etc/etcd/ca.pem \
--peer-cert-file=/etc/etcd/etcd1.pem --peer-key-file=/etc/etcd/etcd1-key.pem \
--initial-cluster-state new \
--log-outputs="systemd/journal" \
--logger="zap" \
--log-level=debug
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
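
With --client-cert-auth enabled, anything dialing https://127.0.0.1:2379 has to present a client certificate signed by ca.pem. My assumption is that the embedded grpc-gateway reuses the serving certificate (server.pem) for that dial, so the rejection can probably be imitated by hand like this (a sketch, not verified):

etcdctl --cacert /etc/etcd/ca.pem \
  --cert /etc/etcd/server.pem --key /etc/etcd/server-key.pem \
  --endpoints=https://127.0.0.1:2379 endpoint health

If server.pem only carries "TLS Web Server Authentication", this should fail with the same "certificate specifies an incompatible key usage" rejection as in the logs.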

Anything else we need to know?

When I change this line:
--listen-client-urls https://172.20.0.2:2379,https://127.0.0.1:2379 \
to
--listen-client-urls https://172.20.0.2:2379,http://127.0.0.1:2379 \

I don't have the issue anymore. (The setup is an etcd3 Patroni setup.) With the above setting (http on the loopback listener) it works flawlessly.
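
If the cause is a server certificate that lacks the "client auth" extended key usage, regenerating server.pem with a cfssl signing profile that lists both usages should let the https://127.0.0.1:2379 listener work as well. A minimal sketch, assuming the cfssl layout from the guide (file and profile names here are illustrative, not taken from it):

cat > ca-config.json <<'EOF'
{
  "signing": {
    "default": { "expiry": "87600h" },
    "profiles": {
      "etcd": {
        "usages": ["signing", "key encipherment", "server auth", "client auth"],
        "expiry": "87600h"
      }
    }
  }
}
EOF

cfssl gencert -ca=ca.pem -ca-key=ca-key.pem \
  -config=ca-config.json -profile=etcd \
  server-csr.json | cfssljson -bare server

After restarting etcd with the regenerated server.pem/server-key.pem, the original --listen-client-urls line with https on the loopback address could be tried again.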

Etcd version (please run commands below)

$ etcd --version
# paste output here

$ etcdctl version
etcdctl version: 3.4.32
API version: 3.4

and

etcdctl --cacert /etc/etcd/ca.pem --cert /etc/etcd/client.pem --key /etc/etcd/client-key.pem --endpoints=https://127.0.0.1:2379 version
etcdctl version: 3.5.13
API version: 3.5

Etcd configuration (command line flags or environment variables)

Same systemd unit as shown under "How can we reproduce it" above.

Etcd debug information (please run commands below, feel free to obfuscate the IP address or FQDN in the output)

$ etcdctl --endpoints=<member list> endpoint status -w table
# paste output here

root@etcd1:# etcdctl --cacert /etc/etcd/ca.pem --cert /etc/etcd/client.pem --key /etc/etcd/client-key.pem --endpoints=https://127.0.0.1:2379 member list -w table
+------------------+---------+-------+-------------------------+-------------------------+------------+
|        ID        | STATUS  | NAME  |       PEER ADDRS        |      CLIENT ADDRS       | IS LEARNER |
+------------------+---------+-------+-------------------------+-------------------------+------------+
| 1ece3cda8fd346db | started | etcd1 | https://172.20.0.2:2380 | https://172.20.0.2:2379 |      false |
| 4f614e32162d6bdb | started | etcd3 | https://172.20.0.4:2380 | https://172.20.0.4:2379 |      false |
| d2878f3fa0c02172 | started | etcd2 | https://172.20.0.3:2380 | https://172.20.0.3:2379 |      false |
+------------------+---------+-------+-------------------------+-------------------------+------------+

Relevant log output

3.4.32

May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.412619Z","caller":"grpclog/grpclog.go:37","msg":"[core][Channel #4 SubChannel #6] Subchannel Connectivity change to IDLE, last error: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.412821Z","caller":"grpclog/grpclog.go:37","msg":"[core][pick-first-lb 0xc001414d80] Received SubConn state update: 0xc001414f60, {ConnectivityState:IDLE ConnectionError:connection error: desc = \"error reading server preface: remote error: tls: bad certificate\"}"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.41287Z","caller":"grpclog/grpclog.go:37","msg":"[core][Channel #4 SubChannel #6] Subchannel Connectivity change to CONNECTING"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.412931Z","caller":"grpclog/grpclog.go:37","msg":"[core][Channel #4 SubChannel #6] Subchannel picks a new address \"127.0.0.1:2379\" to connect"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.41311Z","caller":"grpclog/grpclog.go:37","msg":"[core][pick-first-lb 0xc001414d80] Received SubConn state update: 0xc001414f60, {ConnectivityState:CONNECTING ConnectionError:<nil>}"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"warn","ts":"2024-05-08T10:30:19.415787Z","caller":"embed/config_logging.go:287","msg":"rejected connection","remote-addr":"127.0.0.1:40384","server-name":"","error":"tls: failed to verify certificate: x509: certificate specifies an incompatible key usage"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.415902Z","caller":"grpclog/grpclog.go:37","msg":"[transport][client-transport 0xc000153b00] Closing: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.416043Z","caller":"grpclog/grpclog.go:37","msg":"[core]Creating new client transport to \"{Addr: \\\"127.0.0.1:2379\\\", ServerName: \\\"127.0.0.1:2379\\\", }\": connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:30:19 etcd1 etcd[602]: {"level":"warn","ts":"2024-05-08T10:30:19.416078Z","caller":"grpclog/grpclog.go:46","msg":"[core][Channel #4 SubChannel #6] grpc: addrConn.createTransport failed to connect to {Addr: \"127.0.0.1:2379\", ServerName: \"127.0.0.1:2379\", }. Err: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.416107Z","caller":"grpclog/grpclog.go:37","msg":"[core][Channel #4 SubChannel #6] Subchannel Connectivity change to TRANSIENT_FAILURE, last error: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.416152Z","caller":"grpclog/grpclog.go:37","msg":"[core][pick-first-lb 0xc001414d80] Received SubConn state update: 0xc001414f60, {ConnectivityState:TRANSIENT_FAILURE ConnectionError:connection error: desc = \"error reading server preface: remote error: tls: bad certificate\"}"}
May 08 10:30:19 etcd1 etcd[602]: {"level":"info","ts":"2024-05-08T10:30:19.416204Z","caller":"grpclog/grpclog.go:37","msg":"[transport][client-transport 0xc000153b00] loopyWriter exiting with error: transport closed by client"}



3.5.13

May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.979952Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [Channel #3 SubChannel #4] Subchannel Connectivity change to IDLE, last error: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.980293Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [pick-first-lb 0xc000fc0ba0] Received SubConn state update: 0xc000fc0e70, {ConnectivityState:IDLE ConnectionError:connection error: desc = \"error reading server preface: remote error: tls: bad certificate\"}"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.980383Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [Channel #3 SubChannel #4] Subchannel Connectivity change to CONNECTING"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.980487Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [Channel #3 SubChannel #4] Subchannel picks a new address \"127.0.0.1:2379\" to connect"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.980927Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [pick-first-lb 0xc000fc0ba0] Received SubConn state update: 0xc000fc0e70, {ConnectivityState:CONNECTING ConnectionError:<nil>}"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.985105Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[transport] [client-transport 0xc000169d40] Closing: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.985325Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] Creating new client transport to \"{Addr: \\\"127.0.0.1:2379\\\", ServerName: \\\"127.0.0.1:2379\\\", }\": connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"warn","ts":"2024-05-08T10:45:57.985374Z","caller":"zapgrpc/zapgrpc.go:191","msg":"[core] [Channel #3 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: \"127.0.0.1:2379\", ServerName: \"127.0.0.1:2379\", }. Err: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.98541Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [Channel #3 SubChannel #4] Subchannel Connectivity change to TRANSIENT_FAILURE, last error: connection error: desc = \"error reading server preface: remote error: tls: bad certificate\""}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.985451Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[core] [pick-first-lb 0xc000fc0ba0] Received SubConn state update: 0xc000fc0e70, {ConnectivityState:TRANSIENT_FAILURE ConnectionError:connection error: desc = \"error reading server preface: remote error: tls: bad certificate\"}"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"info","ts":"2024-05-08T10:45:57.985483Z","caller":"zapgrpc/zapgrpc.go:174","msg":"[transport] [client-transport 0xc000169d40] loopyWriter exiting with error: transport closed by client"}
May 08 10:45:57 etcd1 etcd[1213]: {"level":"warn","ts":"2024-05-08T10:45:57.98556Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:43624","server-name":"","error":"tls: failed to verify certificate: x509: certificate specifies an incompatible key usage"}
etcd-io locked and limited conversation to collaborators May 9, 2024
jmhbnz converted this issue into discussion #17972 May 9, 2024
