Hi,

I'm trying to use Filebeat to parse a MySQL slow log, but I'm having issues with the pipeline not extracting fields, and I think it's because the grok pattern doesn't match. An example slow log entry is:
# Query_time: 0.049540 Lock_time: 0.000188 Rows_sent: 1 Rows_examined: 10001
SET timestamp=1641329111;
select * from files limit 10000,1;
Using the pattern from this repo in this online grok testing tool, it seems like it doesn't match: it fails on Id: 8, which is confusing, as that should be matched by (Id:%{SPACE}%{NUMBER:mysql.thread_id:long}%{METRICSPACE})?, the next item in the pattern.
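For anyone who wants to reproduce the mismatch outside of Elasticsearch, here is a minimal sketch that approximates that grok fragment as a plain regex and runs it against a hypothetical User@Host header line. The sample line and the METRICSPACE approximation are assumptions for illustration, not the module's actual definitions:

```python
import re

# Rough standalone approximation of the grok fragment the issue points at:
#   (Id:%{SPACE}%{NUMBER:mysql.thread_id:long}%{METRICSPACE})?
# SPACE is mapped to \s* and NUMBER to an integer; METRICSPACE is a custom
# pattern defined by the Filebeat module and is approximated here as
# "whitespace or end of line" -- an assumption, not its real definition.
ID_FRAGMENT = re.compile(r"Id:\s*(?P<thread_id>\d+)(?:\s|$)")

# Hypothetical header line of the shape the slow-log pattern expects;
# the actual User@Host line from the failing log isn't shown in the snippet.
line = "# User@Host: appuser[appuser] @ localhost []  Id:     8"

match = ID_FRAGMENT.search(line)
print("thread_id:", match.group("thread_id") if match else "no match")
```

If this simplified fragment matches but the full module pattern doesn't, the mismatch is more likely in how the surrounding tokens (for example METRICSPACE) are defined than in the Id fragment itself.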
This seems related to beats#7892, where a similar problem occurred after a MySQL change in 5.7.22. We are running 5.7.26, and I'm assuming MySQL changed something again that caused the issue.
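A more faithful check than an online grok tester is to replay the line through Elasticsearch's _ingest/pipeline/_simulate API, since that exercises the same grok implementation the ingest pipeline uses. Below is a minimal sketch, assuming Elasticsearch is reachable on localhost:9200 and using a simplified pattern with illustrative field names rather than the module's real pipeline:

```python
import json

import requests  # third-party HTTP client, assumed to be installed

# Simulate a grok processor against the sample slow-log metrics line.
# The URL, the pattern, and the field names below are illustrative
# assumptions; the real Filebeat pipeline is more involved.
ES_URL = "http://localhost:9200/_ingest/pipeline/_simulate"

body = {
    "pipeline": {
        "processors": [
            {
                "grok": {
                    "field": "message",
                    "patterns": [
                        "# Query_time: %{NUMBER:query_time:float} "
                        "Lock_time: %{NUMBER:lock_time:float} "
                        "Rows_sent: %{NUMBER:rows_sent} "
                        "Rows_examined: %{NUMBER:rows_examined}"
                    ],
                }
            }
        ]
    },
    "docs": [
        {
            "_source": {
                "message": (
                    "# Query_time: 0.049540 Lock_time: 0.000188 "
                    "Rows_sent: 1 Rows_examined: 10001"
                )
            }
        }
    ],
}

resp = requests.post(ES_URL, json=body)
resp.raise_for_status()
# The response shows the extracted fields, or the grok failure message
# if the pattern does not match.
print(json.dumps(resp.json(), indent=2))
```

Swapping in the module's full pattern (and its custom pattern_definitions) should reproduce exactly what the pipeline sees for the failing entry.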
Hi!
We just realized that we haven't looked into this issue in a while. We're sorry!
We're labeling this issue as Stale to make it hit our filters and make sure we get back to it as soon as possible. In the meantime, it'd be extremely helpful if you could take a look at it as well and confirm its relevance. A simple comment with a nice emoji will be enough :+1:.
Thank you for your contribution!