[ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB #5070

Closed
atomd opened this issue Aug 8, 2018 · 2 comments

atomd commented Aug 8, 2018

Environment:

  • Vault Server Version: 0.10.3
  • MySQL Storage Backend
  • Etcd HA Backend

I found that this issue (#4326) is similar to what I have encountered.

When I redeployed Vault against the previous data (everything is new except the MySQL storage backend), these errors always appear. The cluster keeps looping through the behavior described in the logs, and no leader is ever elected.
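To confirm that no node ever stays active, the leader status can be polled from outside the server. The following is a minimal sketch (not part of the original report) using the official Go API client, github.com/hashicorp/vault/api; it assumes VAULT_ADDR points at the node being watched. The repeated errors from the server log follow below.

// watch_leader.go: poll sys/leader and print which node (if any) is active.
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/hashicorp/vault/api"
)

func main() {
	// DefaultConfig honors VAULT_ADDR; sys/leader does not require a token.
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}
	for {
		leader, err := client.Sys().Leader() // GET /v1/sys/leader
		if err != nil {
			fmt.Printf("leader check failed: %v\n", err)
		} else {
			fmt.Printf("ha_enabled=%t is_self=%t leader_address=%q\n",
				leader.HAEnabled, leader.IsSelf, leader.LeaderAddress)
		}
		time.Sleep(2 * time.Second)
	}
}

In this failing cluster one would expect is_self to flip to true briefly when a node acquires the lock and back to false as soon as post-unseal setup fails and the node steps down.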

2018-08-09T01:33:49.281+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
2018-08-09T01:33:49.826+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
2018-08-09T01:33:50.230+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "11a1eeed-ef89-3620-a62b-263cd6dc2366" in already tied to a different entity "b8c25651-2d67-fb4b-8db7-0c14ddb87bb5""
2018-08-09T01:33:50.644+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
2018-08-09T01:33:51.191+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "11a1eeed-ef89-3620-a62b-263cd6dc2366" in already tied to a different entity "b8c25651-2d67-fb4b-8db7-0c14ddb87bb5""
2018-08-09T01:33:51.699+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
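The error itself comes from rebuilding the in-memory identity index (MemDB): an alias may be tied to at most one entity, and the load step during post-unseal aborts when the stored identity data contains the same alias under two different entities. Below is a simplified model of that invariant; it is not Vault's actual identity-store code, just an illustration of why every unseal attempt hits the same failure.

package main

import "fmt"

type entity struct {
	ID      string
	Aliases []string // alias IDs owned by this entity
}

// rebuildAliasIndex mimics the post-unseal load step: each alias ID must map
// to exactly one entity, so an alias already tied to a different entity is
// rejected and the whole load fails.
func rebuildAliasIndex(entities []entity) (map[string]string, error) {
	index := make(map[string]string) // alias ID -> owning entity ID
	for _, e := range entities {
		for _, a := range e.Aliases {
			if owner, ok := index[a]; ok && owner != e.ID {
				return nil, fmt.Errorf("alias %q is already tied to a different entity %q", a, owner)
			}
			index[a] = e.ID
		}
	}
	return index, nil
}

func main() {
	// Two persisted entities claim the same alias, as in the log above, so the
	// index can never be rebuilt no matter how many times the node unseals.
	_, err := rebuildAliasIndex([]entity{
		{ID: "98412413-afe0-7968-9eba-60eaf1903391", Aliases: []string{"06bab3ca-af25-11a5-452c-177d6ebf7676"}},
		{ID: "second-entity-claiming-the-same-alias", Aliases: []string{"06bab3ca-af25-11a5-452c-177d6ebf7676"}},
	})
	fmt.Println(err)
}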

It looks as if the database was written with dirty data (leader info?) from which it can never recover. The critical problem is that the secrets are lost, which is unacceptable. I want to know:

  • How can we completely prevent this situation?
  • How can we recover from this situation? (One possible approach is sketched below.)
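One possible recovery approach, sketched here as an assumption rather than a verified procedure: if a node can be made active again (for example by running a Vault build that tolerates the duplicate aliases during load), the conflicting entities could be merged through the identity/entity/merge endpoint so that each alias ends up tied to a single entity. The entity IDs below are placeholders that would have to be taken from the actual error messages, and the call needs a token with permission on the identity mount.

package main

import (
	"log"

	"github.com/hashicorp/vault/api"
)

func main() {
	// Uses VAULT_ADDR and VAULT_TOKEN from the environment.
	client, err := api.NewClient(api.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}

	// Merge the duplicate entity into the surviving one; its aliases move with it.
	// Both IDs are placeholders for illustration only.
	_, err = client.Logical().Write("identity/entity/merge", map[string]interface{}{
		"to_entity_id":    "<surviving-entity-id>",
		"from_entity_ids": []string{"<duplicate-entity-id>"},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("entities merged; the alias should now be tied to a single entity")
}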

atomd commented Aug 8, 2018

Here are the detailed logs.

==> Vault server configuration:

              HA Storage: etcd
             Api Address: https://10.15.50.208:9332
                     Cgo: disabled
         Cluster Address: https://10.15.50.208:9333
              Listener 1: tcp (addr: "0.0.0.0:9332", cluster address: "0.0.0.0:9333", tls: "enabled")
               Log Level: trace
                   Mlock: supported: true, enabled: false
                 Storage: mysql
                 Version: Vault v0.10.3
             Version Sha: c69ae68faf2bf7fc1d78e3ec62655696a07454c7

==> Vault server started! Log data will stream in below:

2018-08-08T02:40:32.136+0800 [DEBUG] storage.mysql: max_parallel set: max_parallel=99
2018-08-08T02:40:32.144+0800 [DEBUG] storage.cache: creating LRU cache: size=0
2018-08-08T02:40:32.162+0800 [DEBUG] cluster listener addresses synthesized: cluster_addresses=[0.0.0.0:9333]
2018-08-08T02:40:37.392+0800 [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=9c9832d8-3a94-0a79-508d-95818fe245c1
2018-08-08T02:40:43.770+0800 [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=9c9832d8-3a94-0a79-508d-95818fe245c1
2018-08-08T02:40:48.659+0800 [INFO ] core: vault is unsealed
2018-08-08T02:40:48.659+0800 [INFO ] core: entering standby mode
2018-08-08T02:40:48.688+0800 [INFO ] core: acquired lock, enabling active operation
2018-08-08T02:40:48.689+0800 [TRACE] core: generating cluster private key
2018-08-08T02:40:48.702+0800 [DEBUG] core: generating local cluster certificate
2018-08-08T02:40:48.766+0800 [INFO ] core: post-unseal setup starting
2018-08-08T02:40:48.766+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:48.766+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:48.766+0800 [INFO ] core: loaded wrapping token key
2018-08-08T02:40:48.766+0800 [INFO ] core: successfully setup plugin catalog: plugin-directory=/opt/vault_plugins/catalog
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=kv path=secret/
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=system path=sys/
2018-08-08T02:40:48.767+0800 [INFO ] core: successfully mounted backend: type=identity path=identity/
2018-08-08T02:40:48.768+0800 [INFO ] core: successfully mounted backend: type=kv path=secrets/
2018-08-08T02:40:48.768+0800 [INFO ] core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2018-08-08T02:40:48.810+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0]
2018-08-08T02:40:48.810+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:48.815+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin.vault-plugin-auth-plt-linux-amd64-0.2.0: plugin address: metadata=true address=/tmp/plugin046398181 network=unix timestamp=2018-08-08T02:40:48.815+0800
2018-08-08T02:40:48.815+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:48.816+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=718.727µs

...skipping...

2018-08-08T02:41:03.162+0800 [DEBUG] identity: entities collected: num_existing=256
2018-08-08T02:41:03.162+0800 [DEBUG] identity: entities loading: progress=0
2018-08-08T02:41:03.212+0800 [DEBUG] expiration: leases collected: num_existing=5735
2018-08-08T02:41:03.237+0800 [DEBUG] expiration: leases loading: progress=500
2018-08-08T02:41:03.265+0800 [DEBUG] expiration: leases loading: progress=1000
2018-08-08T02:41:03.291+0800 [DEBUG] expiration: leases loading: progress=1500
2018-08-08T02:41:03.319+0800 [DEBUG] expiration: leases loading: progress=2000
2018-08-08T02:41:03.349+0800 [DEBUG] expiration: leases loading: progress=2500
2018-08-08T02:41:03.370+0800 [DEBUG] expiration: leases loading: progress=3000
2018-08-08T02:41:03.399+0800 [DEBUG] expiration: leases loading: progress=3500
2018-08-08T02:41:03.429+0800 [DEBUG] expiration: leases loading: progress=4000
2018-08-08T02:41:03.435+0800 [INFO ] core: pre-seal teardown starting
2018-08-08T02:41:03.435+0800 [INFO ] core: cluster listeners not running
2018-08-08T02:41:03.435+0800 [DEBUG] expiration: stop triggered
2018-08-08T02:41:03.435+0800 [DEBUG] expiration: finished stopping
2018-08-08T02:41:03.435+0800 [INFO ] rollback: stopping rollback manager
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b858aa53aafdb127e8de2933d680327a5ae8f14b
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b74e4857a325538be471d9364fca8649a0b67039
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b7cb71478f2bdd5b9e2c03b7772714128f7b246f
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b78823f8e704b6828e40cef4fbe21d606c2c941a
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b77a0ada5f5c1b6a587334888ed9cf26d80de5aa
2018-08-08T02:41:03.435+0800 [INFO ] core: pre-seal teardown complete
2018-08-08T02:41:03.435+0800 [ERROR] expiration: shutting down, not attempting further revocation of lease: lease_id=auth/svc/plt/login/b497c845a761e788ee75af44a5bb73c1b3911214
2018-08-08T02:41:03.435+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/userpass/login/lark/3349f2196f467eaa2bd26cf4eb45643169809cfe
2018-08-08T02:41:03.435+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/76478e102a3825c188990fe45ce4a2061e184a26

...skipping...

2018-08-08T02:40:51.567+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/6da39a488aefd934a8433ecb29648da799e6a0af
2018-08-08T02:40:51.567+0800 [ERROR] core: post-unseal setup failed: error="failed to update entity in MemDB: alias "06bab3ca-af25-11a5-452c-177d6ebf7676" in already tied to a different entity "98412413-afe0-7968-9eba-60eaf1903391""
2018-08-08T02:40:51.567+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/7d1c1a8a1f956d3d77a581ac0b484fa38e38524f

...skipping...

2018-08-08T02:40:51.569+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/ff8e4c86994a92a1e757310a1315a51cc3a0a950
2018-08-08T02:40:51.569+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/svc/plt/login/faf9ccf84b239c039c66ece245e6c6e4d27d0cf1
2018-08-08T02:40:51.573+0800 [TRACE] core: found new active node information, refreshing
2018-08-08T02:40:51.611+0800 [DEBUG] core: parsing information for new active node: active_cluster_addr=https://10.15.50.208:9333 active_redirect_addr=https://10.15.50.208:9332
2018-08-08T02:40:51.611+0800 [DEBUG] core: refreshing forwarding connection
2018-08-08T02:40:51.611+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:51.611+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:51.611+0800 [DEBUG] core: done refreshing forwarding connection
2018-08-08T02:40:51.612+0800 [DEBUG] core: creating rpc dialer: host=fw-0b8475a0-0606-6488-5608-7bc75778bdc8
2018-08-08T02:40:51.612+0800 [DEBUG] core: forwarding: error sending echo request to active node: error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure"
2018-08-08T02:40:51.651+0800 [INFO ] core: acquired lock, enabling active operation
2018-08-08T02:40:51.652+0800 [TRACE] core: generating cluster private key
2018-08-08T02:40:51.663+0800 [DEBUG] core: generating local cluster certificate
2018-08-08T02:40:51.707+0800 [INFO ] core: post-unseal setup starting
2018-08-08T02:40:51.707+0800 [DEBUG] core: clearing forwarding clients
2018-08-08T02:40:51.707+0800 [DEBUG] core: done clearing forwarding clients
2018-08-08T02:40:51.707+0800 [DEBUG] core: forwarding: stopping heartbeating
2018-08-08T02:40:51.707+0800 [INFO ] core: loaded wrapping token key
2018-08-08T02:40:51.707+0800 [INFO ] core: successfully setup plugin catalog: plugin-directory=/opt/vault_plugins/catalog
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=kv path=secret/
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=system path=sys/
2018-08-08T02:40:51.708+0800 [INFO ] core: successfully mounted backend: type=identity path=identity/
2018-08-08T02:40:51.709+0800 [INFO ] core: successfully mounted backend: type=kv path=secrets/
2018-08-08T02:40:51.709+0800 [INFO ] core: successfully mounted backend: type=cubbyhole path=cubbyhole/
2018-08-08T02:40:51.751+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0]
2018-08-08T02:40:51.751+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:51.756+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin.vault-plugin-auth-plt-linux-amd64-0.2.0: plugin address: metadata=true address=/tmp/plugin885669146 network=unix timestamp=2018-08-08T02:40:51.756+0800
2018-08-08T02:40:51.756+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=779.173µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: special paths: transport=gRPC status=finished took=213.349µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: type: transport=gRPC status=finished took=191.959µs
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.757+0800 [TRACE] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: cleanup: transport=gRPC status=finished took=182.704µs
2018-08-08T02:40:51.757+0800 [WARN ] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.758+0800 [DEBUG] auth.plugin.auth_plugin_82ec2a71.plt-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-plt-linux-amd64-0.2.0
2018-08-08T02:40:51.799+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0 args=[/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0]
2018-08-08T02:40:51.799+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0
2018-08-08T02:40:51.804+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin.vault-plugin-auth-user-linux-amd64-0.1.0: plugin address: metadata=true network=unix address=/tmp/plugin015885452 timestamp=2018-08-08T02:40:51.804+0800
2018-08-08T02:40:51.804+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=701.998µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: special paths: transport=gRPC status=finished took=274.052µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: type: transport=gRPC status=finished took=136.139µs
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.805+0800 [TRACE] auth.plugin.auth_plugin_95820632.user-auth-plugin: cleanup: transport=gRPC status=finished took=203.882µs
2018-08-08T02:40:51.805+0800 [WARN ] auth.plugin.auth_plugin_95820632.user-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.806+0800 [DEBUG] auth.plugin.auth_plugin_95820632.user-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-user-linux-amd64-0.1.0
2018-08-08T02:40:51.861+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: starting plugin: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2 args=[/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2]
2018-08-08T02:40:51.861+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: waiting for RPC address: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2
2018-08-08T02:40:51.868+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: plugin address: metadata=true address=/tmp/plugin005325653 network=unix timestamp=2018-08-08T02:40:51.868+0800
2018-08-08T02:40:51.868+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: setup: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: setup: transport=gRPC status=finished err=<nil> took=1.919106ms
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: special paths: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: special paths: transport=gRPC status=finished took=224.657µs
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: type: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: type: transport=gRPC status=finished took=168.546µs
2018-08-08T02:40:51.870+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: cleanup: transport=gRPC status=started
2018-08-08T02:40:51.870+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: time="2018-08-08T02:40:51+08:00" level=info msg="dns unwatch, the ticked update is over": metadata=true
2018-08-08T02:40:51.870+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin.vault-plugin-auth-svc-linux-amd64-0.8.2: time="2018-08-08T02:40:51+08:00" level=info msg="dns close, update running terminated": metadata=true
2018-08-08T02:40:51.871+0800 [TRACE] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: cleanup: transport=gRPC status=finished took=308.573µs
2018-08-08T02:40:51.871+0800 [WARN ] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: error closing client during Kill: metadata=true err="rpc error: code = Canceled desc = grpc: the client connection is closing"
2018-08-08T02:40:51.871+0800 [DEBUG] auth.plugin.auth_plugin_dfbba7f0.svc-auth-plugin: plugin process exited: metadata=true path=/opt/vault_plugins/catalog/vault-plugin-auth-svc-linux-amd64-0.8.2
2018-08-08T02:40:51.871+0800 [INFO ] core: restoring leases
2018-08-08T02:40:51.871+0800 [INFO ] rollback: starting rollback manager
2018-08-08T02:40:51.871+0800 [DEBUG] expiration: collecting leases
2018-08-08T02:40:51.872+0800 [DEBUG] audit: socket backend options: path=socket/ address=127.0.0.1:6048 socket type=tcp
2018-08-08T02:40:51.872+0800 [DEBUG] audit: syslog backend options: path=syslog/ facility=local0 tag=vault
2018-08-08T02:40:51.872+0800 [DEBUG] identity: loading entities
2018-08-08T02:40:51.872+0800 [DEBUG] identity: entities collected: num_existing=256
2018-08-08T02:40:51.873+0800 [DEBUG] identity: entities loading: progress=0
2018-08-08T02:40:51.931+0800 [DEBUG] expiration: leases collected: num_existing=5735
2018-08-08T02:40:51.957+0800 [DEBUG] expiration: leases loading: progress=500
2018-08-08T02:40:51.987+0800 [DEBUG] expiration: leases loading: progress=1000
2018-08-08T02:40:52.012+0800 [DEBUG] expiration: leases loading: progress=1500
2018-08-08T02:40:52.040+0800 [DEBUG] expiration: leases loading: progress=2000
2018-08-08T02:40:52.062+0800 [DEBUG] expiration: leases loading: progress=2500
2018-08-08T02:40:52.089+0800 [DEBUG] expiration: leases loading: progress=3000
2018-08-08T02:40:52.117+0800 [DEBUG] expiration: leases loading: progress=3500
2018-08-08T02:40:52.138+0800 [DEBUG] expiration: leases loading: progress=4000
2018-08-08T02:40:52.168+0800 [DEBUG] expiration: leases loading: progress=4500
2018-08-08T02:40:52.197+0800 [DEBUG] expiration: leases loading: progress=5000
2018-08-08T02:40:52.223+0800 [DEBUG] expiration: leases loading: progress=5500
2018-08-08T02:40:52.242+0800 [INFO ] expiration: lease restore complete
2018-08-08T02:40:52.287+0800 [INFO ] core: pre-seal teardown starting
2018-08-08T02:40:52.287+0800 [INFO ] core: cluster listeners not running
2018-08-08T02:40:52.287+0800 [DEBUG] expiration: stop triggered
2018-08-08T02:40:52.287+0800 [DEBUG] expiration: finished stopping
2018-08-08T02:40:52.287+0800 [INFO ] rollback: stopping rollback manager
2018-08-08T02:40:52.287+0800 [INFO ] core: pre-seal teardown complete
2018-08-08T02:40:52.287+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/57edd7cd4a102bf954336404795e951dd4517789
2018-08-08T02:40:52.287+0800 [ERROR] expiration: core context canceled, not attempting further revocation of lease: lease_id=auth/usercred/login/2783a1b612da9d861a575d0b424bffde88362790

jefferai commented Aug 8, 2018

Sorry, I was wrong -- duplicate of #5000

jefferai closed this as completed Aug 8, 2018