Logstash 6.2.1 Released, an Open-Source Server-Side Data Processing Pipeline

Published February 11, 2018

Logstash 6.2.1 has been released. Logstash is an open-source server-side data processing pipeline that can ingest data from multiple sources simultaneously, transform it, and then send it to your favorite "stash". With more than 200 plugins available, it lets you centralize, transform, and stash your data.

As before, no dedicated release notes for Logstash 6.2.1 have been published yet; stay tuned for updates.

In the meantime, here are the release notes for 6.2.0:

  • Added support to protect sensitive settings and configuration in a keystore.
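The keystore is managed with the `bin/logstash-keystore` tool, and stored values can then be referenced from settings or pipeline configuration with `${KEY}` syntax. A minimal sketch (the key name `ES_PWD` is an illustrative example):

```sh
# Create the keystore (stored alongside logstash.yml)
bin/logstash-keystore create

# Add a secret; the tool prompts for the value interactively
bin/logstash-keystore add ES_PWD
```

The stored secret can then replace a plaintext password in a pipeline configuration:

```
output {
  elasticsearch {
    # Resolved from the keystore at startup
    password => "${ES_PWD}"
  }
}
```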

  • Added the jdbc_static filter as a default plugin.

  • Set better defaults to allow for higher throughput under load. (#8707 and #8702)

  • Set the default configuration for RPM/DEB/Docker installations to use Multiple pipelines.
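Multiple pipelines are declared in `pipelines.yml` in the settings directory. A sketch with two hypothetical pipelines (the ids and paths here are illustrative):

```yaml
# config/pipelines.yml -- each entry runs as an independent pipeline
- pipeline.id: apache-logs
  path.config: "/etc/logstash/conf.d/apache.conf"
  pipeline.workers: 2
- pipeline.id: metrics
  path.config: "/etc/logstash/conf.d/metrics.conf"
```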

  • Added a default max size value (100MB) for log files.

  • Added compression when log files are rolled (for ZIP-based installs).

  • Added the ability to specify --pipeline.id from the command line. (#8868)
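For example (the id and config path below are illustrative):

```sh
bin/logstash --pipeline.id my-pipeline -f /etc/logstash/conf.d/my-pipeline.conf
```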

  • Implemented continued improvements to the next generation of execution. Give it a try with the command line switch --experimental-java-execution.


Jdbc_static Filter

  • Released the initial version of the jdbc_static filter, which enriches events with data pre-loaded from a remote database.
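A sketch of a jdbc_static filter configuration: a remote table is loaded into a local in-memory copy, then each event is enriched from that copy. The option names follow the plugin's settings, but the `users` table, its columns, and the JDBC connection details are illustrative assumptions:

```
filter {
  jdbc_static {
    # Pre-load a remote table into the local database on a schedule
    loaders => [
      {
        id => "remote-users"
        query => "SELECT user_id, name FROM users"
        local_table => "users"
      }
    ]
    # Schema of the local copy
    local_db_objects => [
      {
        name => "users"
        index_columns => ["user_id"]
        columns => [ ["user_id", "varchar(32)"], ["name", "varchar(64)"] ]
      }
    ]
    # Enrich each event from the local copy, not the remote database
    local_lookups => [
      {
        query => "SELECT name FROM users WHERE user_id = :id"
        parameters => { id => "[user_id]" }
        target => "user"
      }
    ]
    jdbc_connection_string => "jdbc:postgresql://db.example.com/appdb"
    jdbc_user => "logstash"
    jdbc_driver_class => "org.postgresql.Driver"
  }
}
```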

Dissect Filter

  • Fixed multiple bugs. See the plugin release notes for 1.1.3.
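For reference, the dissect filter splits a delimited line into fields without regular expressions; the log format in this sketch is illustrative:

```
filter {
  dissect {
    # e.g. "2018-02-11 12:00:00 INFO message text"
    mapping => {
      "message" => "%{ts} %{+ts} %{level} %{msg}"
    }
  }
}
```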

Grok Filter

  • Fixed a thread leak that occurred when Logstash was reloaded.

Kafka Output

  • Improved error logging for when a producer cannot be created.


Reprint notice: this article was originally published by the OSChina community (开源中国社区) [http://www.oschina.net].