@shazChaudhry Hope you are doing well.
I am working on configuring the ELK stack with Filebeat on Docker. All the logs shipped by the Filebeat container are showing up on the Kibana dashboard.
My question now is how to filter the message content of a log file that is coming from another server.
I have added the following filter to the logstash.conf file on the ELK stack server:
```
filter {
  # Only parse events that come from /var/log/xxx/error.log
  if ([log][file][path] =~ "/logs/error.log") {
    grok {
      match => { "message" => "%{DATE:date} %{TIME:time} | %{LOGLEVEL:loglevel} | %{IP:client_ip} [%{NUMBER:bytes}] %{WORD:method} /%{NOTSPACE:request_page} HTTP/%{NUMBER:http_version} | %{GREEDYDATA:logmessage}" }
    }
  }
}
```
However, it is not working.
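One thing I am not sure about is whether the literal `|`, `[`, and `]` characters need to be escaped, since grok compiles the pattern as a regular expression and treats them as metacharacters. This is the escaped variant I had in mind (only a sketch, assuming the log lines really look like `date time | level | ip [bytes] method /page HTTP/version | message`):

```
grok {
  # Same pattern as above, but with the regex metacharacters |, [ and ] escaped
  match => { "message" => "%{DATE:date} %{TIME:time} \| %{LOGLEVEL:loglevel} \| %{IP:client_ip} \[%{NUMBER:bytes}\] %{WORD:method} /%{NOTSPACE:request_page} HTTP/%{NUMBER:http_version} \| %{GREEDYDATA:logmessage}" }
}
```

Would that be the right way to do it?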
Here is my filebeat-docker.yml:
```yaml
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

processors:

filebeat.inputs:
  enabled: true
  paths:
  exclude_files: ['.gz$']
  json.message_key: log
  include_lines: ['^ERR', '^WARN']

output.elasticsearch:
  hosts: '${ELASTICSEARCH_HOSTS:elasticsearch:9200}'
```
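To check which fields and values actually reach Logstash (in particular what `[log][file][path]` contains, so the conditional in the filter can match it), I was planning to temporarily add a stdout output on the Logstash side. This is just a debugging sketch:

```
output {
  # Print every event with all of its fields so the value of
  # [log][file][path] can be compared against the conditional in the filter.
  stdout { codec => rubydebug }
}
```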
Can you please advise on this?
Sabil.