Create a Logstash config to feed data to Elasticsearch from Kafka

You can feed Kafka messages to Elasticsearch via Logstash. The indexed values can then be used to search for messages between given dates (like a diff service) and also for logging purposes (via Kibana). Additional requirements may appear, such as masking or hiding a field in the logs for security reasons.

Here, we will mask the first 15 characters of the “subject” field and also remove the “fragment” field. If we did this directly on the “zimbra-events” Logstash config log type, the fields required in search responses would also be modified.

Searching the internet, we come across the “clone” filter as the recommended approach, for example here:

https://alexmarquardt.com/2018/08/31/using-logstash-to-drive-filtered-data-from-a-single-source-into-multiple-output-destinations/

https://stackoverflow.com/questions/60735821/how-to-remove-field-in-logstash-output

Kafka input:
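A minimal sketch of the Kafka input is shown below. The broker address, topic name, and consumer group are placeholders; the json codec assumes the Kafka messages are JSON documents, so fields such as “subject”, “fragment”, and “itemType” become event fields.

input {
  kafka {
    # Placeholder broker address - replace with your Kafka cluster
    bootstrap_servers => "localhost:9092"
    topics => ["zimbra-events"]
    # Hypothetical consumer group name
    group_id => "logstash-zimbra"
    # Assumes the messages are JSON documents
    codec => "json"
  }
}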

File output:
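The cloned (masked) events can be written to a local file for logging. A sketch, assuming the clone filter tags clones with the type “cloned_for_log” and that the path below is writable by Logstash:

output {
  if [type] == "cloned_for_log" {
    file {
      # Hypothetical path - any location Logstash can write to
      path => "/var/log/logstash/zimbra-events-masked.log"
      codec => "json_lines"
    }
  }
}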

Logstash “pipeline.conf” file:
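Putting it together, a sketch of the full pipeline might look like the following. The hosts, index names, and the “cloned_for_log” type value are assumptions; also note that on recent Logstash versions with ECS compatibility enabled, the clone filter may write the clone name to [tags] instead of [type].

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["zimbra-events"]
    codec => "json"
  }
}

filter {
  # Duplicate every event; the clone filter sets the clone's "type"
  # to the name given in the clones array
  clone {
    clones => ["cloned_for_log"]
  }

  # Modify only the clone, so the original event stays intact for search
  if [type] == "cloned_for_log" {
    mutate {
      # Replace the first 15 characters of "subject" with asterisks
      # (no masking happens if the subject is shorter than 15 characters)
      gsub => ["subject", "^.{15}", "***************"]
      remove_field => ["fragment"]
    }
  }
}

output {
  # Masked clones go to a local log file
  if [type] == "cloned_for_log" {
    file {
      path => "/var/log/logstash/zimbra-events-masked.log"
      codec => "json_lines"
    }
  # Originals are routed to an index depending on "itemType"
  } else if [itemType] == "CALENDAR" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "zimbra-calendar-events"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "zimbra-events"
    }
  }
}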

Here, “itemType” is a field in the message; depending on its value (whether or not it is “CALENDAR”), the event is directed to a different index in Elasticsearch.

The “clone” filter duplicates each event and assigns a different type value to the clone; “mutate” then masks and removes fields on that clone only. In the output section, this type value is used to route the clone to a local file, while the original event continues on to Elasticsearch.

Happy Coding!
