Elastic Filebeat: Utilizing Google Pub/Sub to Read Logs from Google Cloud Storage – Equivalent to SQS in AWS #37452
Labels
enhancement
Filebeat
question
Team:Security-Service Integrations
I have applications that can only write to either AWS S3 or Google Cloud Storage. Every hour, these applications generate a large number of small log files. Previously, these logs were written to AWS S3 with notifications sent to AWS SQS. Using Filebeat with input type: aws-s3 and specifying the queue_url, I could successfully read the SQS queue and retrieve the logs from AWS S3 (https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-aws-s3.html#_queue_url), as sketched below.
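For reference, a minimal sketch of the aws-s3 input in SQS notification mode, assuming placeholder queue URL and credentials:

```yaml
filebeat.inputs:
  # aws-s3 input in SQS notification mode: Filebeat polls the queue,
  # reads each S3 event notification, and downloads the referenced object.
  - type: aws-s3
    queue_url: https://sqs.us-east-1.amazonaws.com/123456789012/my-log-queue  # placeholder
    visibility_timeout: 300s
    credential_profile_name: default  # placeholder
```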
Now the applications write logs to Google Cloud Storage. If I use Filebeat with type: gcs (https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-gcs.html), Filebeat saves the offset for each file (of which there are a large number). If I instead set up notifications for newly created files in Google Cloud Storage to Google Pub/Sub, Filebeat with input type: gcp-pubsub (https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-gcp-pubsub.html) retrieves only the notifications, not the actual logs. Both configurations are sketched below.
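To make the two GCS-side attempts concrete, minimal sketches of both configurations, assuming placeholder project, bucket, topic, and subscription names:

```yaml
filebeat.inputs:
  # Attempt 1: gcs input — polls the bucket directly and tracks
  # per-object state, which is costly with many small files.
  - type: gcs
    project_id: my-project            # placeholder
    auth.credentials_file.path: /path/to/service-account.json
    buckets:
      - name: my-log-bucket           # placeholder
        max_workers: 3
        poll: true
        poll_interval: 15s

  # Attempt 2: gcp-pubsub input — consumes the bucket's OBJECT_FINALIZE
  # notifications, but indexes only the notification JSON payload,
  # not the contents of the object it points to.
  - type: gcp-pubsub
    project_id: my-project                         # placeholder
    topic: gcs-notifications                       # placeholder
    subscription.name: filebeat-gcs-notifications  # placeholder
    credentials_file: /path/to/service-account.json
```

The notification pipeline itself can be created with, for example, gsutil notification create -t gcs-notifications -f json -e OBJECT_FINALIZE gs://my-log-bucket (names are placeholders).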
Is there a mechanism in Filebeat to use Google Cloud Storage + Pub/Sub similar to S3 + SQS? Of course, it is possible to additionally configure Google Dataflow to write the contents of files from Google Cloud Storage to Google Pub/Sub, but I would like to avoid that if possible.
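For illustration only, the desired behavior would look roughly like the sketch below. The subscription_name option is hypothetical and does not exist in Filebeat today; it only illustrates the requested GCS + Pub/Sub equivalent of S3 + SQS:

```yaml
filebeat.inputs:
  # HYPOTHETICAL: a gcs input driven by Pub/Sub notifications instead of
  # bucket polling, analogous to aws-s3 with queue_url. The
  # subscription_name option below is not a real Filebeat setting.
  - type: gcs
    project_id: my-project                         # placeholder
    auth.credentials_file.path: /path/to/service-account.json
    subscription_name: filebeat-gcs-notifications  # hypothetical option
```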