diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-data-stream.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-data-stream.png
new file mode 100644
index 0000000000..e70670e4f3
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-data-stream.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-destination-errors.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-destination-errors.png
new file mode 100644
index 0000000000..471fa9fbbc
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-destination-errors.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-firehose-stream.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-firehose-stream.png
new file mode 100644
index 0000000000..b81f0b7f66
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-firehose-stream.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-log-group.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-log-group.png
new file mode 100644
index 0000000000..44cb5e2dd9
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-log-group.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-sample-logs.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-sample-logs.png
new file mode 100644
index 0000000000..4d6a8df618
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-sample-logs.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-subscription-filter.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-subscription-filter.png
new file mode 100644
index 0000000000..2f7293d375
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-subscription-filter.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-verify-discover.png b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-verify-discover.png
new file mode 100644
index 0000000000..bb95b164ad
Binary files /dev/null and b/docs/en/observability/cloud-monitoring/aws/images/firehose-cloudwatch-verify-discover.png differ
diff --git a/docs/en/observability/cloud-monitoring/aws/monitor-amazon-intro.asciidoc b/docs/en/observability/cloud-monitoring/aws/monitor-amazon-intro.asciidoc
index e644c5823d..928b86c0bb 100644
--- a/docs/en/observability/cloud-monitoring/aws/monitor-amazon-intro.asciidoc
+++ b/docs/en/observability/cloud-monitoring/aws/monitor-amazon-intro.asciidoc
@@ -40,6 +40,8 @@ include::monitor-aws-cloudtrail-firehose.asciidoc[leveloffset=+2]
include::monitor-aws-waf-firehose.asciidoc[leveloffset=+2]
+include::monitor-aws-cloudwatch-firehose.asciidoc[leveloffset=+2]
+
include::monitor-aws-firehose-troubleshooting.asciidoc[leveloffset=+2]
include::monitor-aws-esf.asciidoc[]
diff --git a/docs/en/observability/cloud-monitoring/aws/monitor-aws-cloudwatch-firehose.asciidoc b/docs/en/observability/cloud-monitoring/aws/monitor-aws-cloudwatch-firehose.asciidoc
new file mode 100644
index 0000000000..fbaa9d2118
--- /dev/null
+++ b/docs/en/observability/cloud-monitoring/aws/monitor-aws-cloudwatch-firehose.asciidoc
@@ -0,0 +1,204 @@
+[[monitor-aws-cloudwatch-firehose]]
+= Monitor CloudWatch logs
+
+++++
+Monitor CloudWatch logs
+++++
+
+In this section, you'll learn how to export log events from CloudWatch logs to an Elastic cluster by using Amazon Data Firehose.
+
+You'll go through the following steps:
+
+- Install AWS integration in {kib}
+- Select a CloudWatch log group to monitor
+- Create a delivery stream in Amazon Data Firehose
+- Set up a subscription filter to forward the logs using the Firehose stream
+- Visualize your logs in {kib}
+
+[discrete]
+[[firehose-cloudwatch-prerequisites]]
+== Before you begin
+
+We assume that you already have:
+
+- An AWS account with permissions to pull the necessary data from AWS.
+- A deployment using our hosted {ess} on {ess-trial}[{ecloud}]. The deployment includes an {es} cluster for storing and searching your data, and {kib} for visualizing and managing your data. Amazon Data Firehose works with {stack} version 7.17 or greater, running on {ecloud} only.
+
+IMPORTANT: AWS PrivateLink is not supported. Make sure the deployment is on AWS, because the Amazon Data Firehose delivery stream connects specifically to an endpoint that needs to be on AWS.
+
+[discrete]
+[[firehose-cloudwatch-step-one]]
+== Step 1: Install AWS integration in {kib}
+
+. In {kib}, navigate to *Management* > *Integrations* and browse the catalog to find the AWS integration.
+
+. Navigate to the *Settings* tab and click *Install AWS assets*.
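+
+If you prefer to automate this step, Fleet provides an HTTP API for installing integration packages. The following minimal sketch uses Python's `requests`; the {kib} URL, credentials, and package version are placeholders, and the exact API path is an assumption to verify against the Fleet API documentation for your {kib} version:
+
+[source,python]
+----
+import requests
+
+KIBANA_URL = "https://my-deployment.kb.us-east-1.aws.found.io"  # placeholder
+AUTH = ("elastic", "<password>")  # placeholder credentials
+
+# Install the AWS integration assets through the Fleet API.
+# "2.14.0" is an example version; adjust to the version you want.
+response = requests.post(
+    f"{KIBANA_URL}/api/fleet/epm/packages/aws/2.14.0",
+    headers={"kbn-xsrf": "true"},
+    auth=AUTH,
+    timeout=60,
+)
+response.raise_for_status()
+----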
+
+[discrete]
+[[firehose-cloudwatch-step-two]]
+== Step 2: Select a CloudWatch log group to monitor
+
+image::firehose-cloudwatch-log-group.png[CloudWatch log group]
+
+In this tutorial, you are going to collect application logs from an AWS Lambda-based app and forward them to Elastic.
+
+**Create a Lambda function**
+
+NOTE: You can skip this section if you already have a Lambda function, or any other service or application that sends logs to a CloudWatch log group. Take note of the log group from which you want to collect log events and move to the next section.
+
+Like many other AWS services, Lambda functions log directly to CloudWatch out of the box.
+
+. Go to the https://console.aws.amazon.com/[AWS console] and open the AWS Lambda page.
+. Click **Create function** and select the option to create a function from scratch.
+. Enter a **Function name**.
+. As a **Runtime**, select a recent version of Python. For example, Python 3.11.
+. Select your preferred **Architecture**, either `arm64` or `x86_64`.
+. Confirm and create the Lambda function.
++
+When AWS finishes creating the function, go to the **Code source** section and paste the following Python code as function source code:
++
+[source,python]
+----
+import json
+
+def lambda_handler(event, context):
+    # Print the incoming event so it lands in the function's CloudWatch log group.
+    print("Received event: " + json.dumps(event))
+----
+
+. Click **Deploy** to deploy the changes to the source code.
+
+**Generate some sample logs**
+
+With the function ready to go, you can invoke it a few times to generate sample logs.
+On the function page, follow these steps:
+
+. Select **Test**.
+. Select the option to create a new test event.
+. Name the test event and **Save** the changes.
+. Click the **Test** button to execute the function.
+
+Visit the function's log group. Usually, the AWS console offers a handy link to jump straight to the log group it created for this function's logs.
+You should get something similar to the following:
+
+image::firehose-cloudwatch-sample-logs.png[CloudWatch log group with sample logs]
+
+Take note of the log group name for this Lambda function, as you will need it in the next steps.
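+
+If you'd rather generate and inspect the sample logs from a script, here is a minimal sketch using Python and boto3. The function and log group names are hypothetical; substitute your own:
+
+[source,python]
+----
+import json
+
+import boto3
+
+lambda_client = boto3.client("lambda")
+logs_client = boto3.client("logs")
+
+# Invoke the function a few times to generate log events.
+for i in range(3):
+    lambda_client.invoke(
+        FunctionName="my-firehose-demo-function",  # placeholder name
+        Payload=json.dumps({"run": i}).encode(),
+    )
+
+# Print the most recent events from the function's log group.
+# Log delivery is asynchronous, so new events may take a few seconds to appear.
+response = logs_client.filter_log_events(
+    logGroupName="/aws/lambda/my-firehose-demo-function",  # placeholder
+    limit=10,
+)
+for log_event in response["events"]:
+    print(log_event["message"])
+----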
+
+[discrete]
+[[firehose-cloudwatch-step-three]]
+== Step 3: Create a stream in Amazon Data Firehose
+
+image::firehose-cloudwatch-firehose-stream.png[Amazon Firehose Stream]
+
+. Go to the https://console.aws.amazon.com/[AWS console] and navigate to Amazon Data Firehose.
+
+. Click *Create Firehose stream* and choose the source and destination of your Firehose stream. Unless you are streaming data from Kinesis Data Streams, set source to `Direct PUT` and destination to `Elastic`.
+
+. Provide a meaningful *Firehose stream name* that will allow you to identify this delivery stream later.
++
+NOTE: For advanced use cases, source records can be transformed by invoking a custom Lambda function. When using Elastic integrations, this should not be required.
+
+. In the **Destination settings** section, set the following parameter:
++
+`es_datastream_name` = `logs-aws.generic-default`
+
+The Firehose stream is now ready to send logs to your Elastic Cloud deployment.
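+
+For reference, the console configuration above roughly corresponds to the following boto3 sketch. The `Elastic` destination in the console maps to an HTTP endpoint destination in the `CreateDeliveryStream` API; the endpoint URL, access key, and ARNs are placeholders:
+
+[source,python]
+----
+import boto3
+
+firehose = boto3.client("firehose")
+
+firehose.create_delivery_stream(
+    DeliveryStreamName="cloudwatch-logs-to-elastic",  # placeholder name
+    DeliveryStreamType="DirectPut",
+    HttpEndpointDestinationConfiguration={
+        "EndpointConfiguration": {
+            "Url": "https://<your-elasticsearch-endpoint>",  # placeholder
+            "Name": "Elastic",
+            "AccessKey": "<api-key>",  # placeholder
+        },
+        # The es_datastream_name parameter from the console is passed as a
+        # common attribute on every request.
+        "RequestConfiguration": {
+            "CommonAttributes": [
+                {
+                    "AttributeName": "es_datastream_name",
+                    "AttributeValue": "logs-aws.generic-default",
+                }
+            ]
+        },
+        # Firehose requires an S3 bucket for backing up failed deliveries;
+        # both ARNs are placeholders.
+        "S3Configuration": {
+            "RoleARN": "arn:aws:iam::<account-id>:role/<firehose-s3-role>",
+            "BucketARN": "arn:aws:s3:::<backup-bucket>",
+        },
+    },
+)
+----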
+
+[discrete]
+[[firehose-cloudwatch-step-four]]
+== Step 4: Send Lambda function log events to a Firehose stream
+
+image::firehose-cloudwatch-subscription-filter.png[CloudWatch subscription filter]
+
+To send log events from CloudWatch to Firehose, open the log group where the Lambda service is logging and create a subscription filter.
+
+**Create a subscription filter for Amazon Data Firehose**
+
+The https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Subscriptions.html[subscription filter] allows you to pick log events from the log group and forward them to other services, such as an Amazon Kinesis stream, an Amazon Data Firehose stream, or AWS Lambda.
+
+On the log group page, select *Subscription filters* and click the *Create Amazon Data Firehose subscription filter* button.
+
+From here, follow these steps:
+
+. Choose a destination. Select the Firehose stream you created in the previous step.
+
+. Grant the CloudWatch service permission to send log events to the stream in Firehose:
+
+.. Create a new role with a trust policy that allows the CloudWatch service to assume the role.
+
+.. Assign a policy to the role that permits putting records into a Firehose stream.
+
+. Create a new IAM role and use the following JSON as the trust policy:
++
+[source,json]
+----
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Principal": {
+        "Service": "logs.<region>.amazonaws.com"
+      },
+      "Action": "sts:AssumeRole",
+      "Condition": {
+        "StringLike": {
+          "aws:SourceArn": "arn:aws:logs:<region>:<account-id>:*"
+        }
+      }
+    }
+  ]
+}
+----
++
+Replace `<region>` and `<account-id>` with your AWS Region and account ID.
+
+. Assign a policy to the IAM role by using the following JSON:
++
+[source,json]
+----
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Action": "firehose:PutRecord",
+      "Resource": "arn:aws:firehose:<region>:<account-id>:deliverystream/<stream-name>"
+    }
+  ]
+}
+----
++
+Replace `<stream-name>` with the name of the Firehose stream you created in the previous step.
+
+When the new role is ready, you can select it in the subscription filter.
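+
+If you script your AWS setup, the same role can be created with boto3. A minimal sketch, assuming the two policy documents above are saved locally and the role name is a placeholder:
+
+[source,python]
+----
+import json
+
+import boto3
+
+iam = boto3.client("iam")
+
+# Load the trust and permissions policies shown above, with the
+# <region>, <account-id>, and <stream-name> placeholders filled in.
+with open("cloudwatch-trust-policy.json") as f:
+    trust_policy = json.load(f)
+with open("firehose-put-record-policy.json") as f:
+    permissions_policy = json.load(f)
+
+iam.create_role(
+    RoleName="cloudwatch-to-firehose",  # placeholder role name
+    AssumeRolePolicyDocument=json.dumps(trust_policy),
+)
+
+# Attach the PutRecord permission as an inline policy.
+iam.put_role_policy(
+    RoleName="cloudwatch-to-firehose",
+    PolicyName="firehose-put-record",
+    PolicyDocument=json.dumps(permissions_policy),
+)
+----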
+
+. Configure the log format and filters. Select *Other* in the **Log format** option.
+
+. Set the subscription filter pattern.
++
+If you want to forward all log events, leave the filter pattern empty. Otherwise, use the *Subscription filter pattern* to forward only the log events that match the pattern. The *Test pattern* tool on the same page lets you test filter patterns before creating the subscription filter.
+
+. Generate additional logs.
++
+Open the AWS Lambda page again, select the function you created, and execute it a few times to generate new log events.
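+
+The subscription filter itself can also be created programmatically. A minimal boto3 sketch, reusing the placeholder names from earlier steps; an empty `filterPattern` forwards every log event:
+
+[source,python]
+----
+import boto3
+
+logs = boto3.client("logs")
+
+logs.put_subscription_filter(
+    logGroupName="/aws/lambda/my-firehose-demo-function",  # placeholder
+    filterName="firehose-delivery",  # placeholder
+    filterPattern="",  # empty pattern forwards all log events
+    destinationArn="arn:aws:firehose:<region>:<account-id>:deliverystream/<stream-name>",
+    roleArn="arn:aws:iam::<account-id>:role/cloudwatch-to-firehose",
+)
+----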
+
+**Check if there are destination error logs**
+
+On the https://console.aws.amazon.com/[AWS console], navigate to your Firehose stream and check for entries in the *Destination error logs* section.
+
+If everything is running smoothly, this list is empty. If there's an error, you can check the details. The following example shows a delivery stream that fails to send records to the {stack} because of incorrect authentication settings:
+
+image::firehose-cloudwatch-destination-errors.png[Firehose destination errors]
+
+The Firehose delivery stream reports:
+
+* The number of failed deliveries.
+* The failure detail.
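+
+If you enabled CloudWatch error logging on the stream, you can also read these errors from a script. A minimal sketch, assuming Firehose's default error log group naming convention (`/aws/kinesisfirehose/<stream-name>`):
+
+[source,python]
+----
+import boto3
+
+logs = boto3.client("logs")
+
+# Print recent destination errors reported by the stream, if any.
+response = logs.filter_log_events(
+    logGroupName="/aws/kinesisfirehose/cloudwatch-logs-to-elastic",  # placeholder
+    limit=10,
+)
+for log_event in response["events"]:
+    print(log_event["message"])
+----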
+
+[discrete]
+[[firehose-cloudwatch-step-five]]
+== Step 5: Visualize your logs in {kib}
+
+image::firehose-cloudwatch-data-stream.png[Visualize logs in Kibana]
+
+With the logs streaming to the {stack}, you can now visualize them in {kib}.
+
+In {kib}, navigate to the *Discover* page and select a data view that matches the `logs-aws.generic-default` data stream. Here is a sample of the logs from the Lambda function that you forwarded to that data stream:
+
+image::firehose-cloudwatch-verify-discover.png[Sample logs in Discover]
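+
+You can also verify the data stream outside {kib}. A minimal sketch with the official Python {es} client; the endpoint and API key are placeholders for your deployment's values:
+
+[source,python]
+----
+from elasticsearch import Elasticsearch
+
+es = Elasticsearch(
+    "https://<your-elasticsearch-endpoint>",  # placeholder
+    api_key="<api-key>",  # placeholder
+)
+
+# Fetch the five most recent log events from the data stream.
+response = es.search(
+    index="logs-aws.generic-default",
+    size=5,
+    sort=[{"@timestamp": "desc"}],
+)
+for hit in response["hits"]["hits"]:
+    print(hit["_source"].get("message"))
+----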