Logstash version (bin/logstash --version): 8.12.1
Logstash installation source: DEB package
How Logstash is being run: systemd
Description of the problem including expected versus actual behaviour:
Since I send my Logstash logs to an Elasticsearch cluster, I enabled JSON logging mode, which made them significantly easier to read and search in Kibana. Unfortunately, not all logs are making it in now. The primary issue is entries like this:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-logs", :routing=>nil}, {"@version"=>"1", "loggerName"=>"logstash.outputs.elasticsearch", "logEvent"=>{"action"=>["create", {"pipeline"=>"test-pipeline", "_index"=>"test-index"},{"event"=>... ..., :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"can't merge a non object mapping [logEvent.action] with an object mapping"}}}}
As you can see, logEvent.action is an array containing both a string and one or more objects: "logEvent"=>{"action"=>["create", {"pipeline"=>"test-pipeline",...}]. Elasticsearch will always reject this record, because a single field cannot be mapped as two different types.
I suggest keeping action as the string "create" and adding a separate actions field containing the array of objects.
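For illustration, under that suggestion the log entry above would come out shaped roughly like this (field names taken from the excerpt; the exact object contents are abbreviated):

```json
{
  "logEvent": {
    "action": "create",
    "actions": [
      { "pipeline": "test-pipeline", "_index": "test-index" }
    ]
  }
}
```

With that shape, logEvent.action is always a keyword and logEvent.actions is always an object array, so the mapping conflict cannot occur.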
The workaround is to run the Logstash logs through another Logstash pipeline and flatten the field or separate the items manually, but I think it makes more sense for Logstash's own log output to be Elasticsearch-compatible out of the box.
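As a sketch of that workaround, the intermediate pipeline could use a ruby filter to split the mixed array before indexing. This is only an illustration: it assumes the field layout shown in the excerpt above (a logEvent.action array mixing a string with objects), and the actions field name is my own invention.

```conf
# Hypothetical filter stage for a second pipeline that ingests Logstash's
# own JSON logs. It splits the mixed-type [logEvent][action] array into a
# string field plus a separate object-array field, so Elasticsearch sees
# consistent types. Field names are assumptions based on the log excerpt.
filter {
  if [logEvent][action] {
    ruby {
      code => '
        action = event.get("[logEvent][action]")
        if action.is_a?(Array)
          # partition the array into plain strings and everything else
          strings, objects = action.partition { |item| item.is_a?(String) }
          event.set("[logEvent][action]", strings.first)
          event.set("[logEvent][actions]", objects) unless objects.empty?
        end
      '
    }
  }
}
```

This keeps well-formed entries untouched (non-array action values fall through the is_a?(Array) check) and only rewrites the problematic mixed arrays.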
Finally, this may not be the only place this kind of thing happens - it just happens to be the place where it's showing up for me.