Kibana failed to export or filter saved objects if result window is too large #116106
You need to find the related index name in the Elasticsearch error logs and change the setting with curl, using a command like the one below. (For scroll API calls the index name need not be specified.) Try it on the single affected index first.
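The inline command was not captured in this copy of the thread; a minimal sketch of a per-index settings update, assuming a cluster on `localhost:9200` (the index name and the 20000 limit are placeholders, not values from the thread):

```shell
# Raise the result window limit for a single index.
# "my-index-000001" and 20000 are placeholders.
curl -X PUT "localhost:9200/my-index-000001/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index": { "max_result_window": 20000 } }'
```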
Or update all existing indices from the API.
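A sketch of the same settings change applied to every existing index at once, using Elasticsearch's built-in `_all` target (host and limit are placeholders):

```shell
# Apply the setting to all existing indices in one call.
curl -X PUT "localhost:9200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index": { "max_result_window": 20000 } }'
```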
You can also create a template that targets specific index patterns (or just `*` for a global match) so the setting takes effect on future indices.
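A sketch using the composable index template API (available since Elasticsearch 7.8); the template name and the `*` pattern are placeholders of my choosing, not from the thread:

```shell
# Future indices matching the pattern inherit the larger result window.
curl -X PUT "localhost:9200/_index_template/max-result-window" \
  -H 'Content-Type: application/json' \
  -d '{
        "index_patterns": ["*"],
        "template": {
          "settings": { "index.max_result_window": 20000 }
        }
      }'
```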
Pinging @elastic/kibana-core (Team:Core)
I'm surprised to see this reported for 7.14, as we introduced support for exporting >10k objects in 7.12: #89915. Are you certain this is a 7.14 deployment? Note that you will need to increase the
The HTTP API supported it, but the UI didn't in
Ah of course 🤦 Thanks for updating.
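The truncated note above about increasing a limit likely refers to Kibana's saved-objects export cap; a sketch assuming the `savedObjects.maxImportExportSize` setting (the setting name and value are my assumption, as the original sentence was cut off):

```yaml
# kibana.yml — raise the saved-objects export limit (value is a placeholder)
savedObjects.maxImportExportSize: 20000
```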
Kibana version:
7.14.1
Elasticsearch version:
7.14.1
Server OS version:
Official Docker images.
Describe the bug:
Kibana fails to export or filter saved objects if their number exceeds 10,000.
Steps to reproduce:
Expected behavior:
Screenshots (if relevant):
Errors in browser console (if relevant):
Provide logs and/or server output (if relevant):
{
"statusCode": 400,
"error": "Bad Request",
"message": "all shards failed: search_phase_execution_exception: [illegal_argument_exception] Reason: Result window is too large, from + size must be less than or equal to: [10000] but was [11000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
}
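Since the error message points at the scroll API as the more efficient alternative for large result sets, here is a minimal sketch of its use (index name, page size, and keep-alive time are placeholders):

```shell
# Open a scroll context and fetch the first page of results.
curl -X GET "localhost:9200/my-index-000001/_search?scroll=1m" \
  -H 'Content-Type: application/json' \
  -d '{ "size": 1000, "query": { "match_all": {} } }'

# Fetch subsequent pages with the _scroll_id returned by the previous call.
curl -X GET "localhost:9200/_search/scroll" \
  -H 'Content-Type: application/json' \
  -d '{ "scroll": "1m", "scroll_id": "<_scroll_id from the previous response>" }'
```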