
Painless / module failing with: "Too many dynamic script compilations within, max" #9600

Open
ph opened this issue Dec 17, 2018 · 34 comments

Labels
bug, Filebeat, flaky-test (Unstable or unreliable test cases), module, Team:Integrations (Label for the Integrations team), :Testing, [zube]: Backlog

Comments

@ph
Contributor

ph commented Dec 17, 2018

Discovered in #9599

This looks like a change in 7.0?

TransportError(500, u'general_script_exception', u'[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.max_compilations_rate] setting')
-------------------- >> begin captured stdout << ---------------------
Using elasticsearch: http://elasticsearch:9200

--------------------- >> end captured stdout << ----------------------

Stacktrace

  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "/go/src/github.com/elastic/beats/filebeat/tests/system/test_pipeline.py", line 70, in test_input_pipeline_config
    "value": "test-pipeline",
  File "/go/src/github.com/elastic/beats/filebeat/build/python-env/local/lib/python2.7/site-packages/elasticsearch/transport.py", line 318, in perform_request
    status, headers_response, data = connection.perform_request(method, url, params, body, headers=headers, ignore=ignore, timeout=timeout)
  File "/go/src/github.com/elastic/beats/filebeat/build/python-env/local/lib/python2.7/site-packages/elasticsearch/connection/http_urllib3.py", line 186, in perform_request
    self._raise_error(response.status, raw_data)
  File "/go/src/github.com/elastic/beats/filebeat/build/python-env/local/lib/python2.7/site-packages/elasticsearch/connection/base.py", line 125, in _raise_error
    raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
TransportError(500, u'general_script_exception', u'[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.max_compilations_rate] setting')

@ph
Contributor Author

ph commented Dec 17, 2018

@ruflin or @jsoriano do you mind taking a look at this?

@ph added the Team:Integrations label Dec 17, 2018
@elasticmachine
Collaborator

Pinging @elastic/infrastructure

@jsoriano
Member

jsoriano commented Dec 17, 2018

This looks like a change in 7.0?

At least in the documentation this limit hasn't changed; maybe we have reached it.

I can take a look at increasing this limit.

@webmat
Contributor

webmat commented Dec 18, 2018

@jsoriano The best solution is actually not to increase the limit. If a test suite exceeds the allowed number of compilations, it will absolutely blow up in any serious environment.

The best solution is to figure out which Painless script(s) are always recompiling, and parameterize them instead. I've had that happen on a few occasions, and I just needed to move some literal values out of the script and into params. Check out this PR: https://github.com/elastic/beats/pull/9308/files#diff-759f580883147ab049f76cd3501ec965R32
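A minimal sketch of that parameterization (hypothetical field names and constant, not the actual pipeline from the PR): instead of embedding a literal in the script source, pass it via `params` so the cached compilation of the source string can be reused:

```
{
  "script": {
    "lang": "painless",
    "source": "ctx.event.duration = (long) (ctx.temp.duration_ms * params.scale)",
    "params": {
      "scale": 1000000
    }
  }
}
```

One plausible reason this helps is that the script cache keys on the source text, so sources that differ only in embedded literals each count as a distinct compilation, while a parameterized source compiles once regardless of the parameter values.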

@andrewkroh
Member

Duplicate of #9587.

@ruflin
Contributor

ruflin commented Dec 18, 2018

I assume we hit the limit as we added more filebeat modules. +1 on fixing the actual scripts instead of increasing the limit.

@jsoriano
Member

Agree on fixing the scripts, if that helps, before increasing the limit. Should we close this issue as a duplicate of #9587?

@jsoriano
Member

jsoriano commented Dec 18, 2018

@webmat on the PR you mention, what was the script before?

@jsoriano
Member

I assume we hit the limit as we added more filebeat modules.

But do all filebeat modules trigger recompilations? I have seen that we only have scripts in the osquery, auditd and redis modules, and they are not new.

Btw, #5339 would help with things like this 😄

@ruflin
Contributor

ruflin commented Dec 18, 2018

I opened a PR to quickly increase the limit for now to get CI to green: #9613

@ruflin
Contributor

ruflin commented Dec 18, 2018

@jsoriano +1 on pushing #5339 forward to make debugging such issues easier.

@ruflin
Contributor

ruflin commented Dec 18, 2018

@jsoriano Now that we have all the conversation here already, should we close the other one and label this one correctly with flaky-test and remove bug?

@jsoriano added the flaky-test label Dec 18, 2018
@jsoriano
Member

I actually wonder why this error is happening in test_pipeline.py; the pipeline supposed to be used there is quite small, it just sets a field and doesn't contain any script, does it?

@webmat
Contributor

webmat commented Dec 18, 2018

@jsoriano It's a script I was introducing, to adjust a nanosecond duration (ECS) vs the previous format (in ms, iirc). I tried to do this with a simple multiplication with the literal value right in the script. That would reliably trigger the "too many compilations" error.

Moving this to a param made the error go away reliably as well.

I can't say I understand why having this literal in the script would cause that, though. I talked with Jake from Ingest Node about it, and he wasn't 100% sure why either.

Note that the error is being raised by an ES instance that's being used for all the tests, however. It's very likely that it's actually a few scripts together that cause too many compilations to happen on that instance. It's not necessarily just one script. So it's probably worth finding all the places where we have some Painless scripts, and reviewing those as a whole.
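As a starting point for that review (a sketch; the exact response shape varies across Elasticsearch versions), the node stats API exposes per-node script counters, which can help confirm how quickly compilations accumulate across the shared test instance:

```
GET /_nodes/stats/script
```

The `script` section of each node's stats reports cumulative `compilations` and `cache_evictions`; a steadily climbing compilation count during a test run points at scripts whose source text keeps changing.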

@MakoWish
Contributor

MakoWish commented Oct 8, 2020

This is an old thread, but I am now seeing this on Filebeat 7.9.2.

Oct 08 12:11:36 LS1 logstash[14876]: [2020-10-08T12:11:36,956][INFO ][logstash.outputs.elasticsearch][main][eabcf14f5c700799b4ebb4fd74921d5ae99feba209b54837a505ad6f3ee77e88] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-suricata-eve-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-suricata-eve-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{suricata.eve.alert.signature_id}}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{suricata.eve.alert.signature_id}}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:18:31 LS1 logstash[8404]: [2020-10-08T12:18:31,230][INFO ][logstash.outputs.elasticsearch][main][54d7488adfe75d95358bda6620c65771bfb94bdc3856d77e0b8acd4693af6ad2] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})

I did try doubling the limit to 150/5m (below) to test, but the setting does not seem to be taking effect. The logs still show the max as 75/5m (the log messages above are from after making the change). Does the cluster need to be restarted for the setting to take effect?

PUT /_cluster/settings
{
  "transient": {
    "script.context.template.max_compilations_rate": "150/5m"
  }
}

Response:

{
  "acknowledged" : true,
  "persistent" : { },
  "transient" : {
    "script" : {
      "context" : {
        "template" : {
          "max_compilations_rate" : "150/5m"
        }
      }
    }
  }
}
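If a raised limit needs to survive restarts, the same call can be made with `persistent` instead of `transient`, since transient cluster settings do not survive a full cluster restart. A sketch using the template-context setting from the errors above:

```
PUT /_cluster/settings
{
  "persistent": {
    "script.context.template.max_compilations_rate": "150/5m"
  }
}
```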

UPDATE: I did a rolling restart on the cluster, and it appears the new setting has taken effect. I am no longer seeing any errors relating to the Suricata pipeline, but the errors unfortunately remain for the Cisco pipeline.

Oct 08 12:34:43 LS1 logstash[15201]: [2020-10-08T12:34:43,582][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:43 LS1 logstash[15201]: [2020-10-08T12:34:43,582][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:43 LS1 logstash[15201]: [2020-10-08T12:34:43,582][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:43 LS1 logstash[15201]: [2020-10-08T12:34:43,583][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,397][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,398][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,398][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,399][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,399][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,399][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,399][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,399][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,400][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})
Oct 08 12:34:44 LS1 logstash[15201]: [2020-10-08T12:34:44,400][INFO ][logstash.outputs.elasticsearch][main][f33b923cea970591aab72cce1491a85aaf561bc780c4fa87f870edd36e46640c] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.2-cisco-ftd-asa-ftd-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{ cisco.ftd.source_interface }}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [150/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})

UPDATE 2: Quadrupled the setting to 300/5m, and the errors have stopped. I do understand from the above conversation, and others, that increasing the limit is not the solution, but it has at least temporarily resolved the errors. Any assistance with properly fixing these errors would be greatly appreciated!
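For anyone applying the same workaround, a sketch of how that setting can be raised from Kibana Dev Tools (the 300/5m value is the one from the update above; persistent is used here so the change survives a full cluster restart, unlike transient):

```
PUT _cluster/settings
{
  "persistent": {
    "script.context.template.max_compilations_rate": "300/5m"
  }
}
```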

@hueyg

hueyg commented Oct 13, 2020

I have this same issue with the ASA module. It is simply non-functional for me, with the same "too many dynamic scripts" errors. Increasing the setting has no effect in my experience. The issue is so bad that I have reverted back to an old Logstash filter that is parsing the files fine, but that has broken ILM, so I am having to delete large indices manually. My guess is that this module has not been tested on an actual ASA that is pushing a large amount of log data. I would love to be proven wrong so I can recommend trying it again.

@andrewkroh
Member

If you look at GET _nodes/stats you can see the different script contexts and their metrics:

"script" : {
        "compilations" : 284,
        "cache_evictions" : 0,
        "compilation_limit_triggered" : 0,
        "contexts" : [
          {
            "context" : "aggregation_selector",
            "compilations" : 0,
            "cache_evictions" : 0,
            "compilation_limit_triggered" : 0
          },

I recommend checking those metrics to see if you have cache evictions. Ideally these would only be compiled at startup or when the pipeline is first loaded and data volume wouldn't have an impact. If you have evictions then that limit (script.context.$CONTEXT.cache_max_size) might need to be increased to ensure that all the pipelines can be cached.
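A minimal sketch of automating that check in Python; the response structure follows the excerpt above, and the node name and sample numbers below are invented for illustration:

```python
# Sketch: flag script contexts that evicted cached scripts or hit the
# compilation rate limit, given an already-fetched _nodes/stats response.

def busy_contexts(nodes_stats):
    """Yield (node_id, context, cache_evictions, compilation_limit_triggered)
    for every context showing evictions or triggered limits."""
    for node_id, node in nodes_stats.get("nodes", {}).items():
        for ctx in node.get("script", {}).get("contexts", []):
            evictions = ctx.get("cache_evictions", 0)
            triggered = ctx.get("compilation_limit_triggered", 0)
            if evictions or triggered:
                yield (node_id, ctx["context"], evictions, triggered)

# Invented sample response in the shape shown above.
sample = {
    "nodes": {
        "node-1": {
            "script": {
                "contexts": [
                    {"context": "aggregation_selector", "compilations": 0,
                     "cache_evictions": 0, "compilation_limit_triggered": 0},
                    {"context": "template", "compilations": 284,
                     "cache_evictions": 112, "compilation_limit_triggered": 9},
                ]
            }
        }
    }
}

print(list(busy_contexts(sample)))
# -> [('node-1', 'template', 112, 9)]
```

A non-empty result suggests the relevant script cache is churning and its cache_max_size may need to be raised, per the advice above.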

There were changes to these settings (see breaking changes to ES). I believe that if you want to move over to the new context based settings that you need to set script.max_compilations_rate: use-context first. script.context.ingest.max_compilations_rate and script.context.processor_conditional.max_compilations_rate are now defaulting to unlimited (ref).

There are some good details in elastic/elasticsearch#53756.

@alexander-marquardt

I have written a blog on this issue at: https://alexmarquardt.com/2020/10/21/elasticsearch-too-many-script-compilations/

@MakoWish
Contributor

I believe that if you want to move over to the new context based settings that you need to set script.max_compilations_rate: use-context first.

Can you confirm that, @andrewkroh? We have zero scripts that we created ourselves. Any scripts running at all would have been included with Elasticsearch, Kibana, Beats, etc.

script.context.ingest.max_compilations_rate and script.context.processor_conditional.max_compilations_rate are now defaulting to unlimited (ref).

That is not the case for us. I had to manually increase the size as I mentioned above.

Nice blog, @alexander-marquardt! Unfortunately, since we have no scripts of our own, I feel this is an issue that needs to be addressed by the teams at Elastic.

@hueyg

hueyg commented Oct 26, 2020 via email

@alexander-marquardt

@hueyg - If you are running 7.8 or earlier, see: https://www.elastic.co/guide/en/beats/filebeat/7.8/filebeat-module-cisco.html#dynamic-script-compilations

I believe the issue you are referring to was fixed with enhancements made in 7.9.

@MakoWish
Contributor

@alexander-marquardt We just upgraded to 7.9.3 a couple days ago, but when I came to this thread, we were still experiencing the issue with 7.9.2.

@alexander-marquardt

alexander-marquardt commented Oct 27, 2020

@MakoWish - Do you know if your cluster is using the newer use-context value for script.max_compilations_rate? Or perhaps you are still using the deprecated setting (where this value directly contains the expected compilation rate in a format such as "15/1m", as opposed to the value "use-context")?

You may be able to see this if you run GET _cluster/settings. Alternatively, if you execute GET /_nodes/stats?filter_path=nodes.*.script_cache.contexts and get an empty list, then it may indicate that you are still using the older settings even though you are running with a newer cluster. I understand that the newer per-context settings have increased some of the limits as mentioned in the documentation. For example:

For most contexts, you can compile up to 75 scripts per 5 minutes by default. For ingest contexts, the default script compilation rate is unlimited.
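As a side note, the rate values in these settings are plain "count/window" strings; a small illustrative parser (my own logic, not Elasticsearch code) shows how they decode:

```python
import re

def parse_rate(setting):
    """Decode a max_compilations_rate value such as '75/5m' into
    (max_compilations, window_seconds); special values pass through."""
    if setting in ("use-context", "unlimited"):
        return setting
    m = re.fullmatch(r"(\d+)/(\d+)([smh])", setting)
    if not m:
        raise ValueError(f"unrecognized rate: {setting!r}")
    count, amount, unit = int(m.group(1)), int(m.group(2)), m.group(3)
    seconds = amount * {"s": 1, "m": 60, "h": 3600}[unit]
    return (count, seconds)

print(parse_rate("75/5m"))        # -> (75, 300)
print(parse_rate("300/5m"))       # -> (300, 300)
print(parse_rate("use-context"))  # -> 'use-context'
```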

@andrewkroh
Member

I have opened elastic/elasticsearch#64595 to request an unlimited compilation rate limit for the template context.

@MakoWish
Contributor

MakoWish commented Nov 4, 2020

@MakoWish - Do you know if your cluster using the newer use-context for script.max_compilations_rate? Or perhaps you are still using the depreciated setting (where this value directly contains expected compilations per minute in a format such as "15/1m" as opposed to a value of "use-context")?

Here is the result of GET _cluster/settings:

{
  "persistent" : {
    <omitted for brevity>
  },
  "transient" : {
    "script" : {
      "context" : {
        "template" : {
          "max_compilations_rate" : "300/5m"
        }
      }
    }
  }
}

@alexander-marquardt, are you suggesting I manually change the value of 300/5m to the literal string use-context?

@alexander-marquardt

That looks like you are already using the new settings. The blog I posted earlier describes how to see if the script cache is churning and how to increase its size.

@MakoWish
Contributor

MakoWish commented Nov 4, 2020

That looks like you are already using the new settings. The blog I posted earlier describes how to see if the script cache is churning and how to increase its size.

I fear my response may have been a bit confusing, as I had first posted the actual setting from GET _cluster/settings, and then asked about a sample POST request to change the setting. I removed the POST I was asking about for clarity. Are you saying the code snippet in my edited post above is the correct new setting? If so, again, this falls on the Beats team to look into why Filebeat includes scripts that cause errors. These are not custom scripts for me to modify (they are included with Filebeat), and default settings should not cause errors.

@MakoWish
Contributor

MakoWish commented Nov 17, 2020

Any update on this? I am still seeing the issue on 7.10.0; in fact, it appears to have gotten worse with 7.10.0. I had previously set this to 300/5m, and that stopped the errors, but now even that setting is throwing errors, and I just had to increase it to 500/5m. Again, I understand this is just a band-aid, so I am still waiting on feedback/suggestions from someone at Elastic.

@brandenwagner

brandenwagner commented Nov 27, 2020

I realize this is being worked on, but I am looking for a temporary solution in the meantime. I thought I understood the settings, but with this applied:

"transient" : {
    "script" : {
      "context" : {
        "ingest" : {
          "max_compilations_rate" : "500/1m"
        }
      }
    }
  }
}

I am still getting this error:
Too many dynamic script compilations within, max: [30/1m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.max_compilations_rate] setting];;

Why does it still reflect 30/1m in the error? I am using the Zeek module on 7.10; is that a different context somehow?

@alexander-marquardt

alexander-marquardt commented Nov 29, 2020

In order to use different contexts (such as the "ingest" context you are attempting to use above), contexts need to be enabled.

If you run the following command and get an empty response, then contexts are not enabled:

GET /_nodes/stats?filter_path=nodes.*.script_cache.contexts

To enable contexts you can do the following:

PUT _cluster/settings
{
    "persistent": {
        "script.max_compilations_rate": "use-context"
    }
}

Also, be careful with the transient setting, as it will only last until a restart. I suspect persistent is likely preferred for the majority of use cases.

@brandenwagner

Yes, I have that setting applied, but it's still giving the same error as if it wasn't applied. Contexts are enabled; I can see various contexts in the stats query (and they all read zero):

{
    "context" : "ingest",
    "compilations" : 0,
    "cache_evictions" : 0,
    "compilation_limit_triggered" : 0
},

What am I missing? Where is the 30/1m coming from if I have use-context set and the ingest context rate set at 500/1m?
Why is compilation_limit_triggered still 0? Shouldn't that go up with every rejection?

@kofi-grid

I'm also getting this in 7.10.1 filebeat while trying to pull in zeek via the zeek module. Is there something at the root of this that I can address?

@kkojouri

We're also seeing this for the Zeek module, but we're running Filebeat 7.9.0. Logs keep queuing up in Logstash, and the only way to resolve it is to delete and recreate the filebeat-7.9.0-zeek-ssl-pipeline pipeline. We're reluctant to increase the max_compilations_rate. We're using the Elastic Cloud service running Elasticsearch 7.10.0.

[2021-01-28T18:32:46,528][INFO ][logstash.outputs.elasticsearch][*******][********] retrying failed action with response code: 500 ({"type"=>"illegal_state_exception", "reason"=>"pipeline with id [filebeat-7.9.0-zeek-ssl-pipeline] could not be loaded, caused by [ElasticsearchParseException[Error updating pipeline with id [filebeat-7.9.0-zeek-ssl-pipeline]]; nested: GeneralScriptException[Failed to compile inline script [{{zeek.ssl.cipher}}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; GeneralScriptException[Failed to compile inline script [{{zeek.ssl.cipher}}] using lang [mustache]]; nested: CircuitBreakingException[[script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting];; org.elasticsearch.common.breaker.CircuitBreakingException: [script] Too many dynamic script compilations within, max: [75/5m]; please use indexed, or scripts with parameters instead; this limit can be changed by the [script.context.template.max_compilations_rate] setting]"})

@kesslerm
Contributor

Confirmed with custom ingest pipelines, too, that an ingest pipeline needs to be deleted and re-loaded to get Logstash events flowing again.

@kvch kvch removed their assignment Oct 7, 2021