
Elastic 7.1 elapsed_end_without_start but works in 6.8.1 (same config) #39

Open

herrkutt opened this issue Jun 26, 2019 · 4 comments

@herrkutt
For all general issues, please provide the following details for fast resolution:

  • Version: Logstash 7.1
  • Operating System: RHEL 6.5
  • Config File (if you have sensitive info, please remove it):
```
...MORE ABOVE
if "_grokparsefailure" in [tags] {
	grok {
		remove_tag   => ["_grokparsefailure"]
		patterns_dir => ["/etc/logstash/conf.d/patterns"]
		match        => { "message" => "%{YEAR:log_year}\.%{MONTHNUM:log_month}\.%{MONTHDAY:log_day}\|%{TIME:log_time}\|\|%{SESSIONUID:SESUID}\|\|%{SESSIONUID:THREADUID}\|\(null\)\|%{WORD:ThreadName}\|%{WORD:Action}\|Begin\|check_patient_for_completeness%{GREEDYDATA}" }
		add_tag      => [ "CheckPatientForCompletenessStart" ]
		add_field    => { "event" => "CheckPatientForCompletenessStart" }
	}
}

if "_grokparsefailure" in [tags] {
	grok {
		remove_tag   => ["_grokparsefailure"]
		patterns_dir => ["/etc/logstash/conf.d/patterns"]
		match        => { "message" => "%{YEAR:log_year}\.%{MONTHNUM:log_month}\.%{MONTHDAY:log_day}\|%{TIME:log_time}\|\|%{SESSIONUID:SESUID}\|\|%{SESSIONUID:THREADUID}\|\(null\)\|%{WORD:ThreadName}\|%{WORD:Action}\|End\|check_patient_for_completeness%{GREEDYDATA}" }
		add_tag      => [ "CheckPatientForCompletenessEnd" ]
		add_field    => { "event" => "CheckPatientForCompletenessEnd" }
	}
}

elapsed {
	start_tag       => "CheckPatientForCompletenessStart"
	end_tag         => "CheckPatientForCompletenessEnd"
	unique_id_field => "THREADUID"
}
```
  • Sample Data:
```
2019.04.25|09:36:49.850||B8CA3A947330-5CC1B251-1||B8CA3A947330-5CC1B251-146|(null)|MainThread|Generic|Begin|check_patient_for_completeness()|(null)|
2019.04.25|09:36:50.128||B8CA3A947330-5CC1B251-1||B8CA3A947330-5CC1B251-146|(null)|MainThread|Generic|End|check_patient_for_completeness()|(null)|
```
(The custom SESSIONUID pattern comes from the patterns_dir file, which isn't posted here; a guessed reconstruction appears after the repro steps below.)
  • Steps to Reproduce:
    I run the config with the following command-line parameters for 6.8.1:

/logstash-6.8.1/bin# ./logstash -f /etc/logstash/conf.d/performance_log/v2/performance.log.conf -w 1 -r

and for my 7.1 installation:

/usr/share/logstash/bin# ./logstash -f /etc/logstash/conf.d/performance_log/v2/performance.log.conf -w 1 -r
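
The SESSIONUID pattern itself wasn't posted; judging from the sample data above, the patterns_dir entry might look something like this (a reconstruction, not the actual file):

```
# Hypothetical entry in /etc/logstash/conf.d/patterns, reconstructed
# from the sample data above -- not the author's actual pattern file
SESSIONUID [0-9A-F]+-[0-9A-F]+-[0-9A-F]+
```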

However, the two versions give different results!
The 6.8.1 run correctly pairs the CheckPatientForCompletenessStart tag as the start event with the CheckPatientForCompletenessEnd tag as the end event.
The 7.1 installation does not: it tags every end event with "elapsed_end_without_start".
The only thing that changed between the two runs is the Logstash version.

This smells like a bug, but I'm not sure where. Maybe 7.1 is not interpreting the -w 1 flag correctly?
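
One way to sanity-check that hypothesis: the pipeline startup log prints the effective settings, so the worker count each version actually uses can be confirmed. A sketch (the exact log wording differs between 6.x and 7.x):

```
# Confirm the effective worker count from the startup log
./logstash -f /etc/logstash/conf.d/performance_log/v2/performance.log.conf -w 1 -r 2>&1 \
  | grep -i 'pipeline.workers'
# expect a "Starting pipeline" line containing "pipeline.workers"=>1
```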

7.1 Example (notice how it seems to process the end but completely ignores the start):

    {
         "@timestamp" => 2019-04-25T13:36:50.128Z,
    "workstationName" => "ii-rs-hc-mam-04",
              "event" => "CheckPatientForCompletenessEnd",
               "tags" => [
        [0] "CheckPatientForCompletenessEnd",
        [1] "elapsed_end_without_start"
    ],
          "timestamp" => "2019 04 25 09:36:50.128",
           "SiteName" => "USA-SOMEWHERE-XX",
           "@version" => "1",
               "type" => "plain",
          "THREADUID" => "B8CA3A947330-5CC1B251-146",
            "message" => "2019.04.25|09:36:50.128||B8CA3A947330-5CC1B251-1||B8CA3A947330-5CC1B251-146|(null)|MainThread|Generic|End|check_patient_for_completeness()|(null)|\r",
             "Action" => "Generic",
               "host" => "wtlsuv403.sitename.com",
               "path" => "/var/log/perf_logv2/v3/client_performance_USA-SOMEWHERE-XX_ii-rs-hc-mam-04.log",
             "SESUID" => "B8CA3A947330-5CC1B251-1",
         "ThreadName" => "MainThread"
    }
    {
         "@timestamp" => 2019-04-25T13:36:49.850Z,
    "workstationName" => "ii-rs-hc-mam-04",
              "event" => "CheckPatientForCompletenessStart",
               "tags" => [
        [0] "CheckPatientForCompletenessStart"
    ],
          "timestamp" => "2019 04 25 09:36:49.850",
           "SiteName" => "USA-SOMEWHERE-XX",
           "@version" => "1",
               "type" => "plain",
          "THREADUID" => "B8CA3A947330-5CC1B251-146",
            "message" => "2019.04.25|09:36:49.850||B8CA3A947330-5CC1B251-1||B8CA3A947330-5CC1B251-146|(null)|MainThread|Generic|Begin|check_patient_for_completeness()|(null)|\r",
             "Action" => "Generic",
               "host" => "wtlsuv403.sitename.com",
               "path" => "/var/log/perf_logv2/v3/client_performance_USA-SOMEWHERE-XX_ii-rs-hc-mam-04.log",
             "SESUID" => "B8CA3A947330-5CC1B251-1",
         "ThreadName" => "MainThread"
    }

6.8.1 Example (correct):

    {
               "type" => "plain",
         "ThreadName" => "MainThread",
             "SESUID" => "B8CA3A947330-5CC1B251-1",
           "@version" => "1",
               "path" => "/var/log/perf_logv2/v3/client_performance_USA-SOMEWHERE-XX_ii-rs-hc-mam-04.log",
          "THREADUID" => "B8CA3A947330-5CC1B251-146",
              "event" => "CheckPatientForCompletenessStart",
               "host" => "wtlsuv403.sitename.com",
         "@timestamp" => 2019-04-25T13:36:49.850Z,
    "workstationName" => "ii-rs-hc-mam-04",
             "Action" => "Generic",
          "timestamp" => "2019 04 25 09:36:49.850",
               "tags" => [
        [0] "CheckPatientForCompletenessStart"
    ],
           "SiteName" => "USA-SOMEWHERE-XX"
    }
    {
                       "type" => "plain",
                 "ThreadName" => "MainThread",
                     "SESUID" => "B8CA3A947330-5CC1B251-1",
    "elapsed_timestamp_start" => 2019-04-25T13:36:49.850Z,
                   "@version" => "1",
                       "path" => "/var/log/perf_logv2/v3/client_performance_USA-SOMEWHERE-XX_ii-rs-hc-mam-04.log",
                  "THREADUID" => "B8CA3A947330-5CC1B251-146",
                      "event" => "CheckPatientForCompletenessEnd",
                       "host" => "wtlsuv403.sitename.com",
                 "@timestamp" => 2019-04-25T13:36:50.128Z,
            "workstationName" => "ii-rs-hc-mam-04",
                     "Action" => "Generic",
                  "timestamp" => "2019 04 25 09:36:50.128",
                       "tags" => [
        [0] "CheckPatientForCompletenessEnd",
        [1] "elapsed",
        [2] "elapsed_match"
    ],
               "elapsed_time" => 0.278,
                   "SiteName" => "USA-SOMEWHERE-XX"
    }

@alisaMfaingold

Hi, I'm facing the same problem. Did you manage to get past it?

@herrkutt
Author

herrkutt commented Aug 29, 2019

Yes, kind of.
The problem is with the new Java execution engine. I have to run Logstash with the parameter --java-execution false in order to get it to work.
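
Spelled out, the workaround looks like the sketch below, assuming the flag spelling from the comment above; pipeline.java_execution should be the equivalent logstash.yml setting in 6.x/7.x:

```
# CLI form of the workaround (the Java engine became the default in 7.0)
./logstash -f /etc/logstash/conf.d/performance_log/v2/performance.log.conf -w 1 -r --java-execution false

# or, equivalently, in logstash.yml:
# pipeline.java_execution: false
```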

You can see more discussion here: https://discuss.elastic.co/t/elpased-filter-works-differently-6-8-1-vs-7-1-possible-bug/187400

@yonmu

yonmu commented Sep 3, 2019

@herrkutt
I have the same issue, and it is not intermittent: I get "elapsed_end_without_start" on every end event. The events do get the right tags (both start and end), and the elapsed filter clearly sees the end event through its tag (since it adds "elapsed_end_without_start"), but it never registers the start event.

Any ideas why? Maybe you have a full Logstash config file with the elapsed filter that I could review to see where my problem is?

Thanks in advance
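
For reference, a minimal self-contained pipeline exercising the elapsed filter might look like the sketch below; the stdin input, simplified grok, and taskid field are illustrative assumptions, not anyone's production config:

```
input {
  stdin { }
}

filter {
  # Extract a state (Begin/End) and an id; the message layout is illustrative
  grok {
    match => { "message" => "%{WORD:state}\|%{WORD:taskid}" }
  }
  if [state] == "Begin" {
    mutate { add_tag => [ "taskStarted" ] }
  }
  if [state] == "End" {
    mutate { add_tag => [ "taskTerminated" ] }
  }

  # Pair start/end events by taskid; the filter keeps state per instance,
  # so it needs a single pipeline worker (-w 1)
  elapsed {
    start_tag       => "taskStarted"
    end_tag         => "taskTerminated"
    unique_id_field => "taskid"
  }
}

output {
  stdout { codec => rubydebug }
}
```

Feeding it `Begin|42` followed by `End|42` on stdin should emit the end event tagged elapsed / elapsed_match with an elapsed_time field; with more than one worker, the pairing is not guaranteed.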

@widhalmt

widhalmt commented Jan 22, 2021

Did anyone manage to solve this? I see the same problem.

Did you consider that by default Logstash uses one pipeline worker per CPU core? The workers behave like separate instances, so the start and end events might not reach the same filter instance.
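
To rule that out, the worker count can also be pinned in logstash.yml rather than on the command line (a sketch; the file's location varies by install):

```
# logstash.yml -- pin the pipeline to a single worker so paired
# Begin/End events always pass through the same elapsed filter instance
pipeline.workers: 1
```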
