diff --git a/docusaurus/video/docusaurus/docs/api/recording/storage.mdx b/docusaurus/video/docusaurus/docs/api/recording/storage.mdx
index 8544b1fd..02042cdf 100644
--- a/docusaurus/video/docusaurus/docs/api/recording/storage.mdx
+++ b/docusaurus/video/docusaurus/docs/api/recording/storage.mdx
@@ -159,10 +159,10 @@ curl -X POST "https://video.stream-io-api.com/video/call/default/${CALL_ID}/star
 
 | Name          | Description | Required |
 |---------------|-------------|----------|
-| name          |             |          |
-| storage_type  |             |          |
-| bucket        |             |          |
-| custom_folder |             |          |
+| name          | unique name            | yes |
+| storage_type  | `s3`, `gcs` or `abs`   | yes |
+| bucket        | bucket name            | yes |
+| custom_folder | path inside the bucket |     |
 
 ## Amazon S3
 
@@ -170,15 +170,15 @@ To use Amazon S3 as your storage provider, you have two authentication options:
 
 If you do not specify the `s3_api_key` parameter, Stream will use IAM role authentication. In that case make sure to have the correct IAM role configured for your application.
 
-| Name       | Description | Required |
-|------------|-------------|----------|
-| s3_region  |             | yes      |
-| s3_api_key |             |          |
-| s3_secret  |             |          |
+| Name       | Required |
+|------------|----------|
+| s3_region  | yes      |
+| s3_api_key |          |
+| s3_secret  |          |
 
 There are 2 ways to configure authentication on your S3 bucket:
 
 - By providing a key and secret
-- Or by having Stream's AWS account assume a role on your SQS queue.
+- Or by having Stream's AWS account assume a role on your S3 bucket.
 With this option you omit the key and secret, but instead you set up a resource-based policy to grant Stream SendMessage permission on your S3 bucket.
 The following policy needs to be attached to your queue (replace the value of Resource with the fully qualified ARN of you S3 bucket):
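
The parameters documented in the first table of this diff could be exercised with a request along these lines. This is only an illustrative sketch: the `/external_storage` endpoint path, the `aws_s3` nesting of the S3 credentials, and the placeholder values (`API_KEY`, `JWT`, `my-recordings`) are assumptions for the example and are not confirmed by the diff itself — check the surrounding docs page for the authoritative request shape.

```shell
# Hypothetical sketch: create an external storage configuration using the
# fields from the table above (name, storage_type, bucket, custom_folder).
# Endpoint path and payload nesting are assumptions, not taken from the diff.
curl -X POST "https://video.stream-io-api.com/video/external_storage?api_key=${API_KEY}" \
  -H "Authorization: ${JWT}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-recordings",
    "storage_type": "s3",
    "bucket": "my-bucket",
    "custom_folder": "recordings/",
    "aws_s3": {
      "s3_region": "us-east-1",
      "s3_api_key": "AKIA...",
      "s3_secret": "..."
    }
  }'
```

Per the second table, `s3_api_key` and `s3_secret` could be omitted entirely when relying on IAM role authentication instead of a key/secret pair.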