I have applications running on-premises and would like to ingest logs into OpenSearch on the AWS cloud.

Replies: 1 comment
Since Centralized Logging with OpenSearch v2.0, there is a feature called the S3 Connector that is designed for exactly this purpose. The basic idea is to send the logs from on-premises to an Amazon S3 bucket, then use CLO to create a pipeline that reads, parses, and moves the data from S3 to the OpenSearch cluster. Here are the overall steps:
Step 1. Create a Log Config to parse the logs
Please follow this documentation to create a Log Config that can parse your logs.

Step 2. Create an application log pipeline
Please follow this documentation. You can ignore the Buffer settings, because ingesting logs from an S3 bucket does not need a buffering layer. Do remember to choose a bucket (with a prefix) that has no S3 Event Notification configured, otherwise you will encounter an error. Choose the Log Config you created in Step 1. Once the pipeline is active, it is ready to ingest logs from S3 to OpenSearch.

Step 3. Configure Fluent Bit to upload logs to the S3 bucket
By default, Fluent Bit supports uploading logs from anywhere to an Amazon S3 bucket. If you upload from on-premises, you also need to configure the AWS credentials for Fluent Bit. Here is an example of uploading Nginx logs to Amazon S3; please remember to adjust the parameters using this guide.

fluent-bit.conf
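A minimal sketch of what such a fluent-bit.conf could look like, assuming Nginx access logs; the bucket name, region, log path, tag, and key-format prefix are placeholders to replace with your own values:

```
[SERVICE]
    Flush        5
    # Parser definitions live in the companion file shown below
    Parsers_File applog_parsers.conf

[INPUT]
    # Tail the local Nginx access log on the on-premises host
    Name   tail
    Path   /var/log/nginx/access.log
    # Parse each line with the nginx parser from applog_parsers.conf
    Parser nginx
    Tag    nginx.access

[OUTPUT]
    # Ship logs to the S3 bucket (with prefix) chosen in Step 2.
    # bucket, region, and the s3_key_format prefix are placeholders.
    # AWS credentials are resolved via the standard credential chain
    # (environment variables or the shared credentials file).
    Name               s3
    Match              nginx.*
    bucket             my-clo-log-bucket
    region             us-east-1
    total_file_size    10M
    upload_timeout     1m
    s3_key_format      /app-logs/nginx/%Y/%m/%d/%H-%M-%S
```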
applog_parsers.conf
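And a sketch of applog_parsers.conf, using the standard Nginx regex parser that ships with Fluent Bit:

```
[PARSER]
    # Standard Nginx access-log parser bundled with Fluent Bit
    Name        nginx
    Format      regex
    Regex       ^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
    Time_Key    time
    Time_Format %d/%b/%Y:%H:%M:%S %z
```

With this in place, new lines appended to the access log are batched and uploaded to S3, where the pipeline created in Step 2 picks them up.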
Step 4. Create Index Pattern in OpenSearch Dashboards.
Step 5. Check the logs in OpenSearch Dashboards - Discover