I'm currently in the process of deploying a Fluent Bit DaemonSet on Kubernetes. Our goal is to ship the Kubernetes pod logs to Elasticsearch.
Problem: We have a requirement to send logs in JSON format, and of course not all applications log in that format.
One example is our OpenLDAP server (where you can't change the log format in the application), which logs in a fairly arbitrary format:
conn=1234 fd=56 ACCEPT from IP=192.168.1.1:54321 (IP=0.0.0.0:389)
conn=1234 op=0 BIND dn="cn=admin,dc=example,dc=com" method=128
conn=1234 op=0 RESULT tag=97 err=0 text=
That's just one example; unfortunately there are a few more varieties.
What would be the best way to get these logs into Elasticsearch as JSON? Maybe with the "record" function in the filter section, extracting every field I want? But then I would lose the non-matching parts, right?
Appreciate your help!
The overall approach is to parse the data. In the given example everything is separated by a single space, so you can define a PARSER with a regex that exposes the fields in whatever structure you like. This only fails if there is no way to write a correct regular expression for the lines.
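As a minimal sketch for the OpenLDAP lines shown above (assuming the pods are tailed with the usual Kubernetes tail input so the raw line sits under the "log" key, and that the parser name "openldap" and the match pattern "kube.*" are placeholders for your setup), a regex parser plus a parser filter could look like this. Reserve_Data keeps the fields the regex does not produce, which addresses the concern about losing non-matching parts.

# parsers.conf: regex parser for the OpenLDAP-style lines shown above
[PARSER]
    Name        openldap
    Format      regex
    # conn= is always present; fd= and op= are optional, the rest goes into "message"
    Regex       ^conn=(?<conn>\d+)\s+(?:fd=(?<fd>\d+)\s+)?(?:op=(?<op>\d+)\s+)?(?<message>.*)$
    Types       conn:integer fd:integer op:integer

# fluent-bit.conf: apply the parser to the container log field
[FILTER]
    Name          parser
    Match         kube.*
    Key_Name      log
    Parser        openldap
    # keep the other record fields (Kubernetes metadata etc.) instead of dropping them
    Reserve_Data  On
    # keep the original raw line as well, in case the regex misses a variety
    Preserve_Key  On

Anything the regex cannot break down further simply ends up in the "message" field, so the record still reaches Elasticsearch as valid JSON; you would add one parser per log variety you care about.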