
Elasticsearch: some nodes report errors like this during a full cluster restart #30919

Closed
johnyannj opened this issue May 29, 2018 · 4 comments
Labels: :Distributed Indexing/Engine Anything around managing Lucene and the Translog in an open shard. · feedback_needed


@johnyannj

version: 6.2.2
OS: CentOS
jdk: 1.8
nodes: 6

We are testing a full restart of our Elasticsearch cluster, and we found that some nodes log the following errors when they start recovery.

Does this indicate data loss, or is it expected behavior?

The error log is as follows:

[2018-05-28T17:12:50,350][WARN ][o.e.g.GatewayAllocator$InternalReplicaShardAllocator] [es_node1] [app20142][23]: failed to list shard for shard_store on node [Duy8P4B7SpKpJPWux9FaaA]
org.elasticsearch.action.FailedNodeException: Failed node [Duy8P4B7SpKpJPWux9FaaA]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$AsyncAction.onFailure(TransportNodesAction.java:239) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$AsyncAction.access$200(TransportNodesAction.java:153) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$AsyncAction$1.handleException(TransportNodesAction.java:211) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TransportService$ContextRestoreResponseHandler.handleException(TransportService.java:1098) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport.lambda$handleException$33(TcpTransport.java:1478) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.common.util.concurrent.EsExecutors$1.execute(EsExecutors.java:135) [elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport.handleException(TcpTransport.java:1476) [elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport.handlerResponseError(TcpTransport.java:1468) [elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport.messageReceived(TcpTransport.java:1398) [elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:64) [transport-netty4-6.2.2.jar:6.2.2]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:297) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:413) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265) [netty-codec-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:241) [netty-handler-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:134) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:545) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:499) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) [netty-common-4.1.16.Final.jar:4.1.16.Final]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
Caused by: org.elasticsearch.transport.RemoteTransportException: [es_node6][192.168.64.58:9300][internal:cluster/nodes/indices/shard/store[n]]
Caused by: org.elasticsearch.ElasticsearchException: Failed to list store metadata for shard [[app20142][23]]
	at org.elasticsearch.indices.store.TransportNodesListShardStoreMetaData.nodeOperation(TransportNodesListShardStoreMetaData.java:111) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.indices.store.TransportNodesListShardStoreMetaData.nodeOperation(TransportNodesListShardStoreMetaData.java:61) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction.nodeOperation(TransportNodesAction.java:140) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$NodeTransportHandler.messageReceived(TransportNodesAction.java:262) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$NodeTransportHandler.messageReceived(TransportNodesAction.java:258) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport$RequestHandler.doRun(TcpTransport.java:1555) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:672) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-6.2.2.jar:6.2.2]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_171]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_171]
	... 1 more
Caused by: java.io.FileNotFoundException: no segments* file found in store(MMapDirectory@/disk/data1/data/nodes/0/indices/XqfshaweRp-5GAXbK7C_qA/23/index lockFactory=org.apache.lucene.store.NativeFSLockFactory@39cb082a): files: [recovery.l2K98nCnS_CG69RJlGHtZA._10d7.dii, recovery.l2K98nCnS_CG69RJlGHtZA._10d7.dim, recovery.l2K98nCnS_CG69RJlGHtZA._10d7.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._10d7.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._10d7.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._10d7.si, recovery.l2K98nCnS_CG69RJlGHtZA._10d7_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._10d7_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._10d7_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.dii, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.dim, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._11gk.si, recovery.l2K98nCnS_CG69RJlGHtZA._11gk_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._11gk_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._11gk_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._12as.dii, recovery.l2K98nCnS_CG69RJlGHtZA._12as.dim, recovery.l2K98nCnS_CG69RJlGHtZA._12as.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._12as.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._12as.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._12as.si, recovery.l2K98nCnS_CG69RJlGHtZA._12as_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._12as_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._12as_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.dii, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.dim, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._13cw.si, recovery.l2K98nCnS_CG69RJlGHtZA._13cw_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._13cw_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._13cw_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._149n.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._149n.si, recovery.l2K98nCnS_CG69RJlGHtZA._15f1.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._15f1.si, recovery.l2K98nCnS_CG69RJlGHtZA._16co.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._16co.si, recovery.l2K98nCnS_CG69RJlGHtZA._17lo.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._17lo.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._17lo.si, recovery.l2K98nCnS_CG69RJlGHtZA._17oi.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._17oi.si, recovery.l2K98nCnS_CG69RJlGHtZA._17r2.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._17r2.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._17r2.si, recovery.l2K98nCnS_CG69RJlGHtZA._17wd.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._17wd.si, recovery.l2K98nCnS_CG69RJlGHtZA._17yv.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._17yv.si, recovery.l2K98nCnS_CG69RJlGHtZA._1833.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._1833.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._1833.si, recovery.l2K98nCnS_CG69RJlGHtZA._185w.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._185w.si, recovery.l2K98nCnS_CG69RJlGHtZA._18b8.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18b8.si, recovery.l2K98nCnS_CG69RJlGHtZA._18h0.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18h0.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ja.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ja.si, recovery.l2K98nCnS_CG69RJlGHtZA._18jz.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18jz.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18jz.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ku.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ku.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ku.si, recovery.l2K98nCnS_CG69RJlGHtZA._18mo.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18mo.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18mo.si, 
recovery.l2K98nCnS_CG69RJlGHtZA._18op.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18op.si, recovery.l2K98nCnS_CG69RJlGHtZA._18pr.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18pr.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18pr.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ql.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ql.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ql.si, recovery.l2K98nCnS_CG69RJlGHtZA._18rg.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18rg.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18rg.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ri.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ri.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ri.si, recovery.l2K98nCnS_CG69RJlGHtZA._18rz.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18rz.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18rz.si, recovery.l2K98nCnS_CG69RJlGHtZA._18s1.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18s1.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18s1.si, recovery.l2K98nCnS_CG69RJlGHtZA._18si.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18si.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18si.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sj.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sj.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sj.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sk.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sk.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sk.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sm.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sm.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sm.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sn.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sn.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sn.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sp.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sp.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sp.si, recovery.l2K98nCnS_CG69RJlGHtZA._18sq.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18sq.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18sq.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ss.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ss.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ss.si, recovery.l2K98nCnS_CG69RJlGHtZA._18st.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18st.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18st.si, recovery.l2K98nCnS_CG69RJlGHtZA._18su.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18su.si, recovery.l2K98nCnS_CG69RJlGHtZA._18t2.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18t2.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18t2.si, recovery.l2K98nCnS_CG69RJlGHtZA._18tc.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18tc.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18tc.si, recovery.l2K98nCnS_CG69RJlGHtZA._18tn.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18tn.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18tn.si, recovery.l2K98nCnS_CG69RJlGHtZA._18tt.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18tt.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18tt.si, recovery.l2K98nCnS_CG69RJlGHtZA._18tv.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18tv.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18tv.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ty.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ty.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ty.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u0.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u0.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u0.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u1.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u1.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u1.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u3.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u3.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u3.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u4.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u4.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u4.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u5.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u5.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u5.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u6.cfe, 
recovery.l2K98nCnS_CG69RJlGHtZA._18u6.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u6.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u7.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u7.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u7.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u8.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u8.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u8.si, recovery.l2K98nCnS_CG69RJlGHtZA._18u9.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18u9.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18u9.si, recovery.l2K98nCnS_CG69RJlGHtZA._18ua.cfe, recovery.l2K98nCnS_CG69RJlGHtZA._18ua.cfs, recovery.l2K98nCnS_CG69RJlGHtZA._18ua.si, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.dii, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.dim, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._n8t.si, recovery.l2K98nCnS_CG69RJlGHtZA._n8t_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._n8t_Lucene50_0.tim, recovery.l2K98nCnS_CG69RJlGHtZA._n8t_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._n8t_Lucene70_0.dvd, recovery.l2K98nCnS_CG69RJlGHtZA._n8t_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.dii, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.dim, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._vo5.si, recovery.l2K98nCnS_CG69RJlGHtZA._vo5_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._vo5_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._vo5_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.dii, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.dim, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._x0t.si, recovery.l2K98nCnS_CG69RJlGHtZA._x0t_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._x0t_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._x0t_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.dii, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.dim, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._y3g.si, recovery.l2K98nCnS_CG69RJlGHtZA._y3g_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._y3g_Lucene50_0.tim, recovery.l2K98nCnS_CG69RJlGHtZA._y3g_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._y3g_Lucene70_0.dvd, recovery.l2K98nCnS_CG69RJlGHtZA._y3g_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.dii, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.dim, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._yo6.si, recovery.l2K98nCnS_CG69RJlGHtZA._yo6_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._yo6_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._yo6_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.dii, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.dim, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.fdt, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.fdx, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.fnm, recovery.l2K98nCnS_CG69RJlGHtZA._zf8.si, recovery.l2K98nCnS_CG69RJlGHtZA._zf8_Lucene50_0.doc, recovery.l2K98nCnS_CG69RJlGHtZA._zf8_Lucene50_0.tip, recovery.l2K98nCnS_CG69RJlGHtZA._zf8_Lucene70_0.dvm, recovery.l2K98nCnS_CG69RJlGHtZA.segments_74, write.lock]
	at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:670) ~[lucene-core-7.2.1.jar:7.2.1 b2b6438b37073bee1fca40374e85bf91aa457c0b - ubuntu - 2018-01-10 00:48:43]
	at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:627) ~[lucene-core-7.2.1.jar:7.2.1 b2b6438b37073bee1fca40374e85bf91aa457c0b - ubuntu - 2018-01-10 00:48:43]
	at org.apache.lucene.index.SegmentInfos.readLatestCommit(SegmentInfos.java:434) ~[lucene-core-7.2.1.jar:7.2.1 b2b6438b37073bee1fca40374e85bf91aa457c0b - ubuntu - 2018-01-10 00:48:43]
	at org.elasticsearch.common.lucene.Lucene.readSegmentInfos(Lucene.java:123) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.store.Store.readSegmentsInfo(Store.java:202) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.store.Store.access$200(Store.java:130) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.store.Store$MetadataSnapshot.loadMetadata(Store.java:859) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.store.Store$MetadataSnapshot.<init>(Store.java:792) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.store.Store.getMetadata(Store.java:288) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.index.shard.IndexShard.snapshotStoreMetadata(IndexShard.java:1143) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.indices.store.TransportNodesListShardStoreMetaData.listStoreMetaData(TransportNodesListShardStoreMetaData.java:125) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.indices.store.TransportNodesListShardStoreMetaData.nodeOperation(TransportNodesListShardStoreMetaData.java:109) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.indices.store.TransportNodesListShardStoreMetaData.nodeOperation(TransportNodesListShardStoreMetaData.java:61) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction.nodeOperation(TransportNodesAction.java:140) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$NodeTransportHandler.messageReceived(TransportNodesAction.java:262) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.action.support.nodes.TransportNodesAction$NodeTransportHandler.messageReceived(TransportNodesAction.java:258) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.transport.TcpTransport$RequestHandler.doRun(TcpTransport.java:1555) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:672) ~[elasticsearch-6.2.2.jar:6.2.2]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) ~[elasticsearch-6.2.2.jar:6.2.2]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_171]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_171]
	... 1 more
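
To check which nodes still report a usable copy of an unassigned shard, the indices shard stores API can be queried. A minimal sketch, assuming the cluster listens on localhost:9200; the index name app20142 comes from the log above:

```python
# Hedged sketch: ask the indices shard stores API which nodes hold a store for
# each shard of the affected index, and print any reported store exceptions.
# The host/port is an assumption; the index name is taken from the log above.
import json
import urllib.request

url = "http://localhost:9200/app20142/_shard_stores?status=all"
with urllib.request.urlopen(url) as resp:
    stores = json.load(resp)

for shard_id, shard_info in stores["indices"]["app20142"]["shards"].items():
    print("shard", shard_id)
    for store in shard_info["stores"]:
        # Each entry names the node holding the copy, its allocation id, and,
        # if present, the store exception that made the copy unusable.
        print("  ", json.dumps(store))
```
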
@markharwood
Contributor

Does this indicate data loss, or is it expected behavior?

Looking at the stack trace, a Lucene segments file is missing, which is not good. Did this index/shard have any docs in it before the shutdown? Can you list the current files in the data1/data/nodes/0/indices/XqfshaweRp-5GAXbK7C_qA/23/index directory?
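
For gathering that listing, a minimal sketch; the path below is the one from the stack trace above, so adjust it to your own data path:

```python
# Hedged sketch: list the shard's index directory and report whether any
# Lucene segments_* file is present. The path comes from the stack trace
# in this issue; adjust it to your own data path.
import os

shard_index_dir = "/disk/data1/data/nodes/0/indices/XqfshaweRp-5GAXbK7C_qA/23/index"

files = sorted(os.listdir(shard_index_dir))
for name in files:
    print(name)

print("segments_* file present:", any(n.startswith("segments_") for n in files))
```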

@colings86 added the :Distributed Indexing/Engine Anything around managing Lucene and the Translog in an open shard. label on May 30, 2018
@elasticmachine
Collaborator

Pinging @elastic/es-distributed

@javanna
Member

javanna commented Jul 3, 2018

Closing for lack of feedback; feel free to reopen and provide the requested info.

@adarsh-nair


How can we resolve this issue when the shard store contains a write.lock but no segments file?

Caused by: java.io.FileNotFoundException: no segments* file found in store(mmapfs(E:\nice_systems\RTServer\ElasticSearch\data\nodes\0\indices\AgGE3T0CQqmNUbTa7mgxog\0\index)): files: [recovery.AXzg-JDdOOIh9tEF6n7G._f7t.dii, recovery.AXzg-JDdOOIh9tEF6n7G._f7t.fdx, recovery.AXzg-JDdOOIh9tEF6n7G._f7t_2.liv, recovery.AXzg-JDdOOIh9tEF6n7G._f8p.si, recovery.AXzg-JDdOOIh9tEF6n7G._f8q.cfe, write.lock]
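
If no node holds an intact copy of the shard (no segments_* file anywhere, only recovery.* files and write.lock), the write.lock itself is not the blocker: that copy simply cannot be recovered from. The usual options then are to restore the index from a snapshot or, if losing that shard's data is acceptable, to force-allocate an empty primary with the cluster reroute API. A hedged sketch of the latter; the index name, shard id, node name, and host below are placeholders, and accept_data_loss makes the data loss explicit:

```python
# Hedged sketch, last-resort only: allocate an empty primary for a shard whose
# store can no longer be recovered. This discards whatever data the shard held.
# Index name, shard id, node name, and host are placeholders -- adjust them.
import json
import urllib.request

body = {
    "commands": [
        {
            "allocate_empty_primary": {
                "index": "my_index",       # placeholder: the affected index
                "shard": 0,                # placeholder: the affected shard id
                "node": "node_name",       # placeholder: node to host the new empty primary
                "accept_data_loss": True,  # explicit acknowledgement required by this command
            }
        }
    ]
}

req = urllib.request.Request(
    "http://localhost:9200/_cluster/reroute",
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))
```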
