Logstash information:
$ docker run --rm -p 6996:6996 docker.elastic.co/logstash/logstash:8.5.3 -e '' --version
Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2022-12-19T20:20:49,552][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
logstash 8.5.3
jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-linux]
java 17.0.5 (Eclipse Adoptium)
jvm OpenJDK 64-Bit Server VM / 17.0.5+8
OS version (uname -a if on a Unix-like system):
$ uname -a ; docker --version
Linux workstation 5.19.0-26-generic #27-Ubuntu SMP PREEMPT_DYNAMIC Wed Nov 23 20:44:15 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Docker version 20.10.16, build 20.10.16-0ubuntu1
Description of the problem including expected versus actual behaviour:
I raised issue 13606 around the failure of Logstash to handle incoming records with embedded '[' characters in field names. It was closed under PR 14044, but I tested it today with 8.5.3 and the original repro still reproduces the failure:
When sending JSON data through the json filter or input codec, if one of the field names contains a special character (e.g. '['), the parser fails.
Note that '[' is a legal character in a field name according to the JSON spec, and such documents are parsed successfully by other parsers (e.g. jq and ruby).
Ref: Logstash bug 14821
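For illustration, a quick check outside Logstash (the key foo[bar] is a made-up sample, not taken from the original report) shows that jq and Ruby both accept such a document:
$ echo '{"foo[bar]": 1}' | jq .
{
  "foo[bar]": 1
}
$ echo '{"foo[bar]": 1}' | ruby -rjson -e 'p JSON.parse($stdin.read)'
{"foo[bar]"=>1}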
Steps to reproduce:
Start a very simple Logstash instance:
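A minimal sketch of such an instance, assuming a tcp input on port 6996 with the json codec and a rubydebug stdout output (the exact pipeline from the original report is not reproduced here):
$ docker run --rm -p 6996:6996 docker.elastic.co/logstash/logstash:8.5.3 \
    -e 'input { tcp { port => 6996 codec => json } } output { stdout { codec => rubydebug } }'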
Send in some sample data; when a special character is present in a field name, a _jsonparsefailure occurs:
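Hypothetical payloads (the field name foo[bar] is an illustration, not the original sample data), sent to the tcp input sketched above:
# A plain field name is parsed normally:
$ echo '{"foo": 1}' | nc localhost 6996
# A '[' in the field name causes the event to be tagged _jsonparsefailure:
$ echo '{"foo[bar]": 1}' | nc localhost 6996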