JVM Version
$ java -version
java version "1.8.0_131"
Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
Description of the problem including expected versus actual behavior
Using this pipeline, I occasionally see Elasticsearch successfully ingest an event containing a field that is mapped as a long but whose value exceeds the maximum long value. I would expect Elasticsearch to reject such documents, since the data is invalid for the datatype. The mappings appear in Elasticsearch as expected. When elasticdump is run it exports the erroneous value; however, when importing with elasticdump the document is correctly rejected due to the invalid value.
Steps to reproduce
Install Metricbeat on CentOS and configure it to send to Logstash (the default Metricbeat mappings can be used).
Configure Logstash to send to the Elasticsearch index log-systeminfo-{date}.
Mappings and a sample document that illustrate the issue are attached.
The field system.process.cgroup.memory.mem.limit.bytes is clearly defined as a long but has the value 9223372036854776000, which exceeds the maximum long value of 9223372036854775807 by 193.
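For what it's worth, 9223372036854776000 is exactly the decimal rendering of the double nearest to Long.MAX_VALUE (2^63), which suggests the value passed through a 64-bit float somewhere in the pipeline (e.g. during JSON serialization) rather than being a random corruption. A minimal Java sketch checking this (the literal value is taken from the sample document; the interpretation is my assumption):

```java
import java.math.BigDecimal;

public class LongOverflowCheck {
    public static void main(String[] args) {
        // The out-of-range value observed in the ingested document.
        BigDecimal observed = new BigDecimal("9223372036854776000");
        BigDecimal maxLong  = BigDecimal.valueOf(Long.MAX_VALUE);

        // It exceeds Long.MAX_VALUE (9223372036854775807) by exactly 193.
        System.out.println(observed.subtract(maxLong)); // prints 193

        // Parsing the observed value as a double yields exactly
        // (double) Long.MAX_VALUE, i.e. 2^63 -- consistent with the value
        // having been round-tripped through a 64-bit float at some point.
        double d = Double.parseDouble("9223372036854776000");
        System.out.println(d == (double) Long.MAX_VALUE); // prints true
    }
}
```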
Ingest Pipeline (all versions are 5.5.0)
Metricbeat(CentOS 6.5) > Logstash(Ubuntu 16.04) > Elasticsearch(Ubuntu 16.04)
mapping.txt
sample.txt