Feeding non-numeric data into a long field may consume excessive CPU #40323
Labels: >bug, :Search Foundations/Mapping, Team:Search Foundations, v6.3.1, v6.4.0, v6.6.2
DaveCTurner added the >bug, :Search Foundations/Mapping, v6.4.0, v6.3.1, and v6.6.2 labels on Mar 21, 2019.
Pinging @elastic/es-search
DaveCTurner added a commit to DaveCTurner/elasticsearch that referenced this issue on Mar 21, 2019:

Today if you try and insert a very large number like `1e9999999` into a long field we first construct this number as a `BigDecimal`, convert this to a `BigInteger` and then reject it because it is out of range. Unfortunately making such a large `BigInteger` is rather expensive. We can avoid this expense by performing a (weaker) range check on the `BigDecimal` representation of incoming `long`s too. Relates elastic#26137 Closes elastic#40323
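The fix the commit describes can be sketched roughly as below. This is a minimal illustration, not the actual Elasticsearch code: the class and method names are hypothetical, and it assumes the parser already holds the incoming value as a `BigDecimal`. The key point is that comparing `BigDecimal`s can short-circuit on the exponents without expanding them, so a value like `1e9999999` is rejected cheaply before any `BigInteger` is built.

```java
import java.math.BigDecimal;

// Hypothetical sketch of the "weaker" range check described in the commit
// message above (names are illustrative, not the real Elasticsearch code).
final class LongParseSketch {
    private static final BigDecimal LONG_MIN = BigDecimal.valueOf(Long.MIN_VALUE);
    private static final BigDecimal LONG_MAX = BigDecimal.valueOf(Long.MAX_VALUE);

    static long toLongExact(String text) {
        // Cheap: a BigDecimal stores the digits plus an exponent, so parsing
        // "1e9999999" does not materialise ten million decimal digits.
        BigDecimal decimal = new BigDecimal(text);
        // Weaker range check: compareTo can decide on the adjusted exponents
        // alone, so wildly out-of-range values are rejected near-instantly,
        // before any expensive BigInteger is constructed.
        if (decimal.compareTo(LONG_MIN) < 0 || decimal.compareTo(LONG_MAX) > 0) {
            throw new IllegalArgumentException("value [" + text + "] is out of range for a long");
        }
        // Safe now: the exponent is small, so the exact conversion is cheap.
        // toBigIntegerExact() still rejects fractional values such as "1.5".
        return decimal.toBigIntegerExact().longValueExact();
    }
}
```

The check is "weaker" in the sense the commit message uses: it only bounds the magnitude, and the exact conversion afterwards still does the fine-grained rejection of fractional values.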
DaveCTurner added three more commits referencing this issue on Mar 28, 2019, and one on Apr 5, 2019, each carrying the same commit message as above (Relates #26137, Closes #40323).
javanna added the Team:Search Foundations label on Jul 16, 2024.
I've seen a couple of cases where all the write threads are heroically trying to parse some very large numbers on the way into a long field, consuming far too much CPU in the process. The hot threads output reported many threads stuck doing `BigInteger` shenanigans. One example was at https://discuss.elastic.co/t/high-cpu-usage-in-elasticsearch-nodes/161504 and another was with a customer.

We fall back to `BigInteger` if a naive conversion to `long` fails, for instance if the value uses scientific notation, but then reject the result if it doesn't fit into a `long`:

elasticsearch/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/support/AbstractXContentParser.java, lines 157 to 180 in 92b2e1a

This means that if you try and insert a short string such as `"1e99999999"` into a long field then we spend a lot of resources converting this into a very large `BigInteger` before deciding it's too big. I think we should not try so hard to parse values like `"1e99999999"` as a `long`.
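To see why this is so expensive, here is a small standalone demonstration (an assumed setup, not the Elasticsearch parsing code itself): parsing the short string is instant, but expanding its exponent into a real `BigInteger` has to materialise roughly 10^8 decimal digits before the out-of-range check can ever fail.

```java
import java.math.BigDecimal;
import java.math.BigInteger;

// Standalone demonstration (not Elasticsearch code) of the cost described
// above: the BigDecimal parse is instant, but converting it to a BigInteger
// forces ~1e8 decimal digits to be materialised before any range check.
public class ExpensiveBigIntegerDemo {
    public static void main(String[] args) {
        BigDecimal parsed = new BigDecimal("1e99999999"); // fast: digits + exponent
        long start = System.nanoTime();
        BigInteger expanded = parsed.toBigInteger();      // slow: expands the exponent
        System.out.printf("bitLength=%d, took %.1fs%n",
                expanded.bitLength(), (System.nanoTime() - start) / 1e9);
    }
}
```

The conversion can run for a long time and allocates tens of megabytes for a value that was always going to be rejected, which is exactly the work the write threads in the linked reports were burning CPU on.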