ParseMysql exception (index out of range) #140

Closed
NITEMAN opened this issue May 27, 2015 · 10 comments

NITEMAN commented May 27, 2015

We are just trying 1.0-beta and getting a lot of errors like this in our syslog:

May 27 22:11:00 xxx /usr/bin/packetbeat[62464]: log_unix.go:123: ParseMysql exception. Recovering, but please report this: runtime error: index out of range.
May 27 22:11:00 xxx /usr/bin/packetbeat[62464]: log_unix.go:124: Stacktrace: /home/vagrant/src/github.com/elastic/packetbeat/Godeps/_workspace/src/github.com/elastic/libbeat/logp/log_unix.go:124 (0x4667f2)#012/usr/local/go/src/runtime/asm_amd64.s:401 (0x430135)#012/usr/local/go/src/runtime/panic.go:387 (0x41cea8)#012/usr/local/go/src/runtime/panic.go:12 (0x41c03e)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:634 (0x4e7962)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:589 (0x4e5bd9)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:498 (0x4e4895)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:459 (0x4e422f)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:80 (0x4f85a1)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:168 (0x4f9558)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:328 (0x4fb61d)#012/home/vagrant/src/github.com/elastic/packetbeat/sniffer/sniffer.go:316 (0x509b15)#012/home/vagrant/src/github.com/elastic/packetbeat/main.go:235 (0x408d66)#012/usr/local/go/src/runtime/asm_amd64.s:2232 (0x432271)

The vagrant paths in these errors are quite weird too.

tsg added the bug label May 27, 2015

tsg (Contributor) commented May 27, 2015

Thanks for the report. Which MySQL version is it? We'll try to figure it out, but if you can capture a short trace as explained here, it might speed us up.
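For reference, a trace like this can usually be captured with tcpdump (the interface name and MySQL port below are assumptions; adjust them for your setup):

# tcpdump -i eth0 -s 0 -w trace.pcap port 3306

The -s 0 option captures full packets so the MySQL payloads are not truncated in the pcap file.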

NITEMAN (Author) commented May 28, 2015

It's MySQL 5.6 from dotdeb's repository on Debian Wheezy x64.

I'll try to record a trace when I have time, but as this is a quite busy production system, it might be a bit big and noisy.

tsg (Contributor) commented May 28, 2015

Thanks, we'll check the code in the meantime. If we're fairly sure we know what the issue is, we'll let you know so you don't have to do the trace.

tsg self-assigned this May 28, 2015
abulimov commented:

Hi, we are trying 1.0-beta1 and getting a similar issue with Postgres 9.3.5:

May 28 16:15:55 db-01 /usr/bin/packetbeat[4669]: log_unix.go:123: ParsePgsql exception. Recovering, but please report this: runtime error: slice bounds out of range.
May 28 16:15:55 db-01 /usr/bin/packetbeat[4669]: log_unix.go:124: Stacktrace: /home/vagrant/src/github.com/elastic/packetbeat/Godeps/_workspace/src/github.com/elastic/libbeat/logp/log_unix.go:124 (0x4662f2)#012/usr/local/go/src/runtime/asm_amd64.s:401 (0x42fc35)#012/usr/local/go/src/runtime/panic.go:387 (0x41c9a8)#012/usr/local/go/src/runtime/panic.go:18 (0x41bb8e)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/pgsql/pgsql.go:506 (0x4ebfd3)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/pgsql/pgsql.go:671 (0x4ed1df)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:80 (0x4f80a1)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:168 (0x4f9058)#012/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:328 (0x4fb11d)#012/home/vagrant/src/github.com/elastic/packetbeat/sniffer/sniffer.go:316 (0x509615)#012/home/vagrant/rpmbuild/BUILD/packetbeat/main.go:235 (0x408866)#012/usr/local/go/src/runtime/asm_amd64.s:2232 (0x431d71)

While the error says it is recovering, packetbeat stops producing any data to Elasticsearch.

NITEMAN (Author) commented May 29, 2015

Hi, I've just captured a small trace with the error (around 31M). Please let me know how I can send you the file (I would prefer not to upload it publicly).

Output is:

# packetbeat -e -dump trace.pcap
log_unix.go:123: ERR  ParseMysql exception. Recovering, but please report this: runtime error: index out of range.
log_unix.go:124: ERR  Stacktrace: /home/vagrant/src/github.com/elastic/packetbeat/Godeps/_workspace/src/github.com/elastic/libbeat/logp/log_unix.go:124 (0x4667f2)
/usr/local/go/src/runtime/asm_amd64.s:401 (0x430135)
/usr/local/go/src/runtime/panic.go:387 (0x41cea8)
/usr/local/go/src/runtime/panic.go:12 (0x41c03e)
/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:634 (0x4e7962)
/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:589 (0x4e5bd9)
/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:498 (0x4e4895)
/home/vagrant/src/github.com/elastic/packetbeat/protos/mysql/mysql.go:459 (0x4e422f)
/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:80 (0x4f85a1)
/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:168 (0x4f9558)
/home/vagrant/src/github.com/elastic/packetbeat/protos/tcp/tcp.go:328 (0x4fb61d)
/home/vagrant/src/github.com/elastic/packetbeat/sniffer/sniffer.go:316 (0x509b15)
/home/vagrant/src/github.com/elastic/packetbeat/main.go:235 (0x408d66)
/usr/local/go/src/runtime/asm_amd64.s:2232 (0x432271)

tsg (Contributor) commented May 29, 2015

Please send it to [email protected]. Thanks!

tsg (Contributor) commented Jun 5, 2015

The transaction causing issues contains a huge MySQL response. In the default configuration, Packetbeat generally only cares about the beginning of the response, but it still parses everything else so that it can correctly detect the next request/response on the same TCP connection.

In this case, the transaction at hand is correctly parsed and published, but when parsing the rest of the response the decoder encounters an error. It's not clear yet why the error happens; Wireshark is also confused by that response.

So while the end result is correct (at least for the trace I got), printing that exception is not ideal.

How often does it print the stack trace?
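To illustrate the behaviour described above, here is a minimal sketch in Go (this is not Packetbeat's actual code; the packet layout handling, the names, and the maxStoredBytes cap are assumptions for illustration): the parser keeps only the head of a large response, but it keeps walking the length-prefixed packets so it can still find where the next request/response on the same connection starts, with explicit bounds checks to avoid exactly this kind of index-out-of-range panic.

package main

import "fmt"

// maxStoredBytes is a hypothetical cap on how much of a response body we keep.
const maxStoredBytes = 1024

type response struct {
    head     []byte // first bytes of the payload, enough for the fields we publish
    totalLen int    // full body length seen on the wire
}

// consume walks MySQL-style packets (3-byte little-endian length, 1-byte sequence
// number, then the body). It returns the parsed response and how many bytes of
// buf were consumed, so the caller can resume at the next message boundary.
func consume(buf []byte) (response, int, error) {
    resp := response{}
    off := 0
    for off+4 <= len(buf) { // bounds check before reading the 4-byte header
        bodyLen := int(buf[off]) | int(buf[off+1])<<8 | int(buf[off+2])<<16
        end := off + 4 + bodyLen
        if end > len(buf) {
            break // incomplete packet: stop and wait for more data
        }
        // Keep only the head of the response instead of buffering everything.
        if len(resp.head) < maxStoredBytes {
            keep := maxStoredBytes - len(resp.head)
            if keep > bodyLen {
                keep = bodyLen
            }
            resp.head = append(resp.head, buf[off+4:off+4+keep]...)
        }
        resp.totalLen += bodyLen
        off = end // skip past the rest of the packet without storing it
    }
    if off == 0 {
        return resp, 0, fmt.Errorf("need more data")
    }
    return resp, off, nil
}

func main() {
    // Two packets: a 5-byte body ("hello") followed by a 2-byte body ("ok").
    buf := []byte{5, 0, 0, 0, 'h', 'e', 'l', 'l', 'o', 2, 0, 0, 1, 'o', 'k'}
    resp, n, err := consume(buf)
    fmt.Println(string(resp.head), resp.totalLen, n, err)
}

The point of the sketch is that the parser never indexes past len(buf): if the header or the body would run past the end of the buffer, it stops and waits for more data instead of panicking.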

tsg (Contributor) commented Jun 5, 2015

@abulimov: I suspect it's a different issue in your case. If you could send me a trace file, that would be great. Please open a new issue if you have more data to add.

tsg (Contributor) commented Jun 5, 2015

OK, on further inspection I found a bug affecting MySQL responses larger than 100 KB, which seems to have been the issue in the trace. It should be fixed by this commit: 2a02653

tsg closed this as completed Jun 5, 2015
NITEMAN (Author) commented Jun 7, 2015

We will give the next beta a try when it's released and confirm whether it's fixed in our case.

Thank you!

monicasarbu added a commit that referenced this issue Dec 2, 2015
tsg added a commit that referenced this issue Dec 2, 2015
Fix #140: Convert offest and line to long fields to prevent overflow
tsg pushed a commit to tsg/beats that referenced this issue Jan 20, 2016