expiration: shutting down, not attempting further revocation of lease #5000

Closed
atomd opened this issue Jul 26, 2018 · 21 comments
@atomd

atomd commented Jul 26, 2018

I have encountered this error and would like to ask in what scenario it occurs:

2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f4f875e7a41ca2b1842ac1d83b8e5467b055d97f
2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f108353c701ec4f12e21ea6614e9793d4e44c12a
2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f3a6b888ed9ab8ee62c740eaa38126ead2beb764
2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f61169378c7d2c10d68bc8c8cac70b8d045a705a
2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f606afefe1e9d07bba2cc976328fe271656209c6
2018-07-27T01:11:53.560+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/login/f284a5b45235591f7b9205be057aa1dacf495429

When this error occurs, Vault does not resume normal operation by any means.

Thanks!

@jefferai
Member

What version of Vault?

@atomd
Author

atomd commented Jul 27, 2018

@jefferai
The version is 0.10.1.

@jefferai
Member

I believe this should be fixed in 0.10.4.

@atomd
Author

atomd commented Aug 1, 2018

Can you give me more context about this error and how it is produced?

Thanks. @jefferai

@jefferai
Member

jefferai commented Aug 1, 2018

I can't really as I believe those messages are symptoms, not the cause. We have cleaned up several ways that shutdown could deadlock, however.

@atomd
Author

atomd commented Aug 1, 2018

To fill in the details:
The client error is "local node not active but active cluster node not found", and all the nodes then keep trying to elect an active node forever, so the cluster does not work properly.

Has this error been fixed in 0.10.4?

Thanks. @jefferai

@jefferai
Member

jefferai commented Aug 1, 2018

That sounds like some other issue, possibly with your underlying storage. Can you provide full logs?

@atomd
Author

atomd commented Aug 7, 2018

In what case does the error message "local node not active but active cluster node not found" appear?

@atomd
Author

atomd commented Aug 7, 2018

Here are the detailed logs. Could you help me see what the problem is? Thanks. @jefferai

==> Vault server configuration:

              HA Storage: etcd
             Api Address: https://10.15.50.208:9332
                     Cgo: disabled
         Cluster Address: https://10.15.50.208:9333
              Listener 1: tcp (addr: "0.0.0.0:9332", cluster address: "0.0.0.0:9333", tls: "enabled")
               Log Level: trace
                   Mlock: supported: true, enabled: false
                 Storage: mysql
                 Version: Vault v0.10.3
             Version Sha: c69ae68faf2bf7fc1d78e3ec62655696a07454c7

==> Vault server started! Log data will stream in below:

2018-08-08T02:40:32.136+0800 [DEBUG] storage.mysql: max_parallel set: max_parallel=99
2018-08-08T02:40:32.144+0800 [DEBUG] storage.cache: creating LRU cache: size=0
2018-08-08T02:40:32.162+0800 [DEBUG] cluster listener addresses synthesized: cluster_addresses=[0.0.0.0:9333]
2018-08-08T02:40:37.392+0800 [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=9c9832d8-3a94-0a79-508d-95818fe245c1
2018-08-08T02:40:43.770+0800 [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=9c9832d8-3a94-0a79-508d-95818fe245c1
2018-08-08T02:40:48.659+0800 [INFO ] core: vault is unsealed
2018-08-08T02:40:48.659+0800 [INFO ] core: entering standby mode
2018-08-08T02:40:48.688+0800 [INFO ] core: acquired lock, enabling active operation
2018-08-08T02:40:48.689+0800 [TRACE] core: generating cluster private key
2018-08-08T02:40:48.702+0800 [DEBUG] core: generating local cluster certificate
2018-08-08T02:40:48.766+0800 [INFO ] core: post-unseal setup starting
2018-08-08T02:40:48.766+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:48.766+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:48.766+0800 [INFO ] core: loaded wrapping token key
2018-08-08T02:40:48.766+0800 [INFO ] core: successfully setup plugin catalog: plugin-directory=/opt/vault_plugins/catalog
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=kv path=secret/
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=system path=sys/
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=identity path=identity/
2018-08-08T02:40:48.768+0800 [INFO ] core: successfully mounted backend: type=kv path=secrets/
2018-08-08T02:40:48.768+0800 [INFO ] core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2018-08-08T02:40:48.810+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0]
2018-08-08T02:40:48.810+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:48.815+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin.vault-plugin-auth-plt-linux-amd64-0.2.0: plugin address: metadata=true address=/tmp/plugin046398181 network=unix timestamp=2018-08-08T02:40:48.815+0800
2018-08-08T02:40:48.815+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:48.816+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=718.727µs

...skipping...

2018-08-08T02:41:03.162+0800 [DEBUG] identity: entities collected: num_existing=256
2018-08-08T02:41:03.162+0800 [DEBUG] identity: entities loading: progress=0
2018-08-08T02:41:03.212+0800 [DEBUG] expiration: leases collected: num_existing=5735
2018-08-08T02:41:03.237+0800 [DEBUG] expiration: leases loading: progress=500
2018-08-08T02:41:03.265+0800 [DEBUG] expiration: leases loading: progress=1000
2018-08-08T02:41:03.291+0800 [DEBUG] expiration: leases loading: progress=1500
2018-08-08T02:41:03.319+0800 [DEBUG] expiration: leases loading: progress=2000
2018-08-08T02:41:03.349+0800 [DEBUG] expiration: leases loading: progress=2500
2018-08-08T02:41:03.370+0800 [DEBUG] expiration: leases loading: progress=3000
2018-08-08T02:41:03.399+0800 [DEBUG] expiration: leases loading: progress=3500
2018-08-08T02:41:03.429+0800 [DEBUG] expiration: leases loading: progress=4000
2018-08-08T02:41:03.435+0800 [INFO ] core: pre-seal teardown starting
2018-08-08T02:41:03.435+0800 [INFO ] core: cluster listeners not running
2018-08-08T02:41:03.435+0800 [DEBUG] expiration: stop triggered
2018-08-08T02:41:03.435+0800 [DEBUG] expiration: finished stopping
2018-08-08T02:41:03.435+0800 [INFO ] rollback: stopping rollback manager
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b858aa53aafdb127e8de2933d680327a5ae8f14b
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b74e4857a325538be471d9364fca8649a0b67039
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b7cb71478f2bdd5b9e2c03b7772714128f7b246f
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b78823f8e704b6828e40cef4fbe21d606c2c941a
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b77a0ada5f5c1b6a587334888ed9cf26d80de5aa
2018-08-08T02:41:03.435+0800 [INFO ] core: pre-seal teardown complete
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b497c845a761e788ee75af44a5bb73c1b3911214
2018-08-08T02:41:03.435+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/userpass/login/lark/3349f2196f467eaa2bd26cf4eb45643169809cfe
2018-08-08T02:41:03.435+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/76478e102a3825c188990fe45ce4a2061e184a26

...skipping...

2018-08-08T02:40:51.567+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/6da39a488aefd934a8433ecb29648da799e6a0af
2018-08-08T02:40:51.567+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
2018-08-08T02:40:51.567+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/7d1c1a8a1f956d3d77a581ac0b484fa38e38524f

...skipping...

2018-08-08T02:40:51.569+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/ff8e4c86994a92a1e757310a1315a51cc3a0a950
2018-08-08T02:40:51.569+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/faf9ccf84b239c039c66ece245e6c6e4d27d0cf1
2018-08-08T02:40:51.573+0800 [TRACE] core: found new active node information, refreshing
2018-08-08T02:40:51.611+0800 [DEBUG] core: parsing information for new active node: active_cluster_addr=https://10.15.50.208:9333 active_redirect_addr=https://10.15.50.208:9332
2018-08-08T02:40:51.611+0800 [DEBUG] core: refreshing forwarding connection
2018-08-08T02:40:51.611+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:51.611+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:51.611+0800 [DEBUG] core: done refreshing forwarding connection
2018-08-08T02:40:51.612+0800 [DEBUG] core: creating rpc dialer: host=fw-0b8475a0-0606-6488-5608-7bc75778bdc8
2018-08-08T02:40:51.612+0800 [DEBUG] core: forwarding: error sending echo request to active node: error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure"
2018-08-08T02:40:51.651+0800 [INFO ] core: acquired lock, enabling active operation
2018-08-08T02:40:51.652+0800 [TRACE] core: generating cluster private key
2018-08-08T02:40:51.663+0800 [DEBUG] core: generating local cluster certificate
2018-08-08T02:40:51.707+0800 [INFO ] core: post-unseal setup starting
2018-08-08T02:40:51.707+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:51.707+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:51.707+0800 [DEBUG] core: forwarding: stopping heartbeating
2018-08-08T02:40:51.707+0800 [INFO ] core: loaded wrapping token key
2018-08-08T02:40:51.707+0800 [INFO ] core: successfully setup plugin catalog: plugin-directory=/opt/vault_plugins/catalog
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=kv path=secret/
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=system path=sys/
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=identity path=identity/
2018-08-08T02:40:51.709+0800 [INFO ] core: successfully mounted backend: type=kv path=secrets/
2018-08-08T02:40:51.709+0800 [INFO ] core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2018-08-08T02:40:51.751+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0]
2018-08-08T02:40:51.751+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:51.756+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin.vault-plugin-auth-plt-linux-amd64-0.2.0: plugin address: metadata=true address=/tmp/plugin885669146 network=unix timestamp=2018-08-08T02:40:51.756+0800
2018-08-08T02:40:51.756+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=779.173µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: special paths: transport=gRPC status=finished took=213.349µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: type: transport=gRPC status=finished took=191.959µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: cleanup: transport=gRPC status=finished took=182.704µs
2018-08-08T02:40:51.757+0800 [WARN ] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.758+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:51.799+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0]
2018-08-08T02:40:51.799+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0
2018-08-08T02:40:51.804+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin.vault-plugin-auth-user-linux-amd64-0.1.0: plugin address: metadata=true network=unix address=/tmp/plugin015885452 timestamp=2018-08-08T02:40:51.804+0800
2018-08-08T02:40:51.804+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=701.998µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: special paths: transport=gRPC status=finished took=274.052µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: type: transport=gRPC status=finished took=136.139µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: cleanup: transport=gRPC status=finished took=203.882µs
2018-08-08T02:40:51.805+0800 [WARN ] auth.plugin.auth_plugin_95820632.user-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.806+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0
2018-08-08T02:40:51.861+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2 args=[/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2]
2018-08-08T02:40:51.861+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2
2018-08-08T02:40:51.868+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: plugin address: metadata=true address=/tmp/plugin005325653 network=unix timestamp=2018-08-08T02:40:51.868+0800
2018-08-08T02:40:51.868+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=1.919106ms
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: special paths: transport=gRPC status=finished took=224.657µs
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: type: transport=gRPC status=finished took=168.546µs
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: time="2018-08-08T02:40:51+08:00" level=info msg="dns unwatch, the ticked update is over": metadata=true
2018-08-08T02:40:51.870+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: time="2018-08-08T02:40:51+08:00" level=info msg="dns close, update running terminated": metadata=true
2018-08-08T02:40:51.871+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: cleanup: transport=gRPC status=finished took=308.573µs
2018-08-08T02:40:51.871+0800 [WARN ] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.871+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2
2018-08-08T02:40:51.871+0800 [INFO ] core: restoring leases
2018-08-08T02:40:51.871+0800 [INFO ] rollback: starting rollback manager
2018-08-08T02:40:51.871+0800 [DEBUG] expiration: collecting leases
2018-08-08T02:40:51.872+0800 [DEBUG] audit: socket backend options: path=socket/ address=127.0.0.1:6048 socket type=tcp
2018-08-08T02:40:51.872+0800 [DEBUG] audit: syslog backend options: path=syslog/ facility=local0 tag=vault
2018-08-08T02:40:51.872+0800 [DEBUG] identity: loading entities
2018-08-08T02:40:51.872+0800 [DEBUG] identity: entities collected: num_existing=256
2018-08-08T02:40:51.873+0800 [DEBUG] identity: entities loading: progress=0
2018-08-08T02:40:51.931+0800 [DEBUG] expiration: leases collected: num_existing=5735
2018-08-08T02:40:51.957+0800 [DEBUG] expiration: leases loading: progress=500
2018-08-08T02:40:51.987+0800 [DEBUG] expiration: leases loading: progress=1000
2018-08-08T02:40:52.012+0800 [DEBUG] expiration: leases loading: progress=1500
2018-08-08T02:40:52.040+0800 [DEBUG] expiration: leases loading: progress=2000
2018-08-08T02:40:52.062+0800 [DEBUG] expiration: leases loading: progress=2500
2018-08-08T02:40:52.089+0800 [DEBUG] expiration: leases loading: progress=3000
2018-08-08T02:40:52.117+0800 [DEBUG] expiration: leases loading: progress=3500
2018-08-08T02:40:52.138+0800 [DEBUG] expiration: leases loading: progress=4000
2018-08-08T02:40:52.168+0800 [DEBUG] expiration: leases loading: progress=4500
2018-08-08T02:40:52.197+0800 [DEBUG] expiration: leases loading: progress=5000
2018-08-08T02:40:52.223+0800 [DEBUG] expiration: leases loading: progress=5500
2018-08-08T02:40:52.242+0800 [INFO ] expiration: lease restore complete
2018-08-08T02:40:52.287+0800 [INFO ] core: pre-seal teardown starting
2018-08-08T02:40:52.287+0800 [INFO ] core: cluster listeners not running
2018-08-08T02:40:52.287+0800 [DEBUG] expiration: stop triggered
2018-08-08T02:40:52.287+0800 [DEBUG] expiration: finished stopping
2018-08-08T02:40:52.287+0800 [INFO ] rollback: stopping rollback manager
2018-08-08T02:40:52.287+0800 [INFO ] core: pre-seal teardown complete
2018-08-08T02:40:52.287+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/57edd7cd4a102bf954336404795e951dd4517789
2018-08-08T02:40:52.287+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/2783a1b612da9d861a575d0b424bffde88362790

@atomd

This comment has been minimized.

@jefferai
Member

jefferai commented Aug 8, 2018

@atomd I'm hiding your comment as it's an entirely different issue, not related to this one.

@atomd
Author

atomd commented Aug 8, 2018

@jefferai Thanks. I raised a new issue.

@atomd
Author

atomd commented Aug 8, 2018

@jefferai Can you unhide my comment? I think it is helpful for this issue.

Because of this problem, Vault does not work properly and secrets are lost. This is very serious for me; please help me figure out how it can be solved.

@jefferai
Member

jefferai commented Aug 8, 2018

@atomd are you able to build from source?

@atomd
Author

atomd commented Aug 8, 2018

@jefferai Yes, I can try. Could you provide some more detailed guidance?

@jefferai
Member

jefferai commented Aug 8, 2018

See https://github.com/hashicorp/vault#developing-vault -- the branch you want to build is merge-on-conflict.

This was caused by race conditions that have since been fixed, but there isn't currently a way to reverse those changes. We have an idea to prevent some of these problems in the first place (in case there are other bugs we don't know about), but it's a much bigger change that we won't be able to get to in this dev cycle, so I'm coding a reasonable workaround: if two entities claim the same alias, we merge the entities. Since this situation is (so far as we know) only caused by a race condition in the first place, it is reasonable to merge them, since they were always meant to be a single entity with a single alias.
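
To illustrate the idea, here is a rough Go sketch of "merge on conflict", assuming it behaves roughly as described above. The Entity/Alias types, the upsertAlias/mergeEntities helpers, and the in-memory index are all made up for illustration and are not Vault's actual identity-store code:

```go
package main

import "fmt"

type Alias struct {
	ID string
}

type Entity struct {
	ID      string
	Aliases []*Alias
}

// upsertAlias ties an alias to an entity. If the alias is already tied to a
// different entity, the two entities are merged instead of failing the way
// post-unseal setup does with "alias ... already tied to a different entity".
func upsertAlias(byAliasID map[string]*Entity, alias *Alias, entity *Entity) *Entity {
	if existing, ok := byAliasID[alias.ID]; ok {
		if existing.ID == entity.ID {
			return entity // alias already tied to this entity, nothing to do
		}
		// Conflict: fold the incoming entity into the one that already owns the alias.
		return mergeEntities(existing, entity)
	}
	entity.Aliases = append(entity.Aliases, alias)
	byAliasID[alias.ID] = entity
	return entity
}

// mergeEntities keeps the first entity and absorbs the aliases of the duplicate.
func mergeEntities(survivor, duplicate *Entity) *Entity {
	survivor.Aliases = append(survivor.Aliases, duplicate.Aliases...)
	return survivor
}

func main() {
	index := map[string]*Entity{}
	alias := &Alias{ID: "06bab3ca-af25-11a5-452c-177d6ebf7676"}

	e1 := upsertAlias(index, alias, &Entity{ID: "entity-1"})
	// A second entity claiming the same alias is merged rather than rejected.
	merged := upsertAlias(index, alias, &Entity{ID: "entity-2"})

	fmt.Println(e1 == merged, merged.ID, len(merged.Aliases)) // true entity-1 1
}
```

The point is that a duplicated alias no longer aborts post-unseal setup with `failed to update entity in MemDB: alias ... already tied to a different entity`; the duplicate entities are folded into one.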

@atomd
Author

atomd commented Aug 9, 2018

Hi @jefferai

The merge-on-conflict branch fixes the race condition for me.

Has this issue been fixed in 0.10.4?
How can I ensure that the same problem will not happen again?

Thanks.

@jefferai
Member

jefferai commented Aug 9, 2018

No, the patch hasn't been merged so it will be fixed in the next release.

@atomd
Author

atomd commented Aug 11, 2018

What sequence of operations leads to the race condition?
Is it related to leader election?

@jefferai Thanks! This is a great help to me.

@jefferai
Member

It was caused by a bug that was fixed in 0.10.4.

@cmukhopadhyay-rms

@jefferai: We're encountering this error in production. Since we can't immediately upgrade Vault without testing, is there a workaround we could apply on v0.10.1? We're running Vault HA with Consul and Azure backend storage.
