From 61fee731638a49f98da640eb2bede0dc2e5318aa Mon Sep 17 00:00:00 2001
From: Tommaso Barbugli
Date: Mon, 29 Jan 2024 14:19:14 +0100
Subject: [PATCH] call recording storage docs

---
 .../docusaurus/docs/api/recording/storage.mdx | 178 ++++++++++++++++++
 1 file changed, 178 insertions(+)
 create mode 100644 docusaurus/video/docusaurus/docs/api/recording/storage.mdx

diff --git a/docusaurus/video/docusaurus/docs/api/recording/storage.mdx b/docusaurus/video/docusaurus/docs/api/recording/storage.mdx
new file mode 100644
index 00000000..56b218a3
--- /dev/null
+++ b/docusaurus/video/docusaurus/docs/api/recording/storage.mdx
@@ -0,0 +1,178 @@
---
id: storage
sidebar_position: 2
slug: /recording/storage
title: Storage
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

By default, call recordings are stored in an S3 bucket managed by Stream. Recordings kept on Stream's S3 storage are retained for two weeks and then automatically deleted. If you want to keep your recordings for longer, you can use your own storage.

Stream supports the following external storage providers:

- [Amazon S3](#amazon-s3)
- [Google Cloud Storage](#google-cloud-storage)
- [Azure Blob Storage](#azure-blob-storage)

If you need support for a different storage provider, you can participate in the conversation [here](https://github.com/GetStream/protocol/discussions/371).

## Configuring recording storage

To use your own storage you need to:

1. Configure a new storage for your Stream application
2. Configure your call type(s) to use the new storage

Alternatively, you can configure the storage and use it only for specific calls, while keeping other calls on Stream storage.

<Tabs>
<TabItem value="js" label="JavaScript">

```js
// Sketch based on the server-side Node SDK; method and parameter names are
// assumptions, check the API reference for the exact signatures.

// 1. create a new storage with all the required parameters
await client.createExternalStorage({
  name: 'my-s3',
  storage_type: 's3',
  bucket: 'my-bucket',
  custom_folder: 'recordings',
  aws_s3: {
    s3_region: 'us-east-1',
    s3_api_key: 'my-access-key',
    s3_secret: 'my-secret',
  },
});

// 2. update the call type to use the new storage
await client.video.updateCallType('my-call-type', {
  external_storage: 'my-s3',
});

// 3. alternatively, specify the storage when starting call recording
await call.startRecording({
  recording_external_storage: 'my-s3',
});
```

</TabItem>
<TabItem value="py" label="Python">

```py
# Sketch based on the server-side Python SDK; method and parameter names are
# assumptions, check the API reference for the exact signatures.

# 1. create a new storage with all the required parameters
client.create_external_storage(
    name="my-s3",
    storage_type="s3",
    bucket="my-bucket",
    custom_folder="recordings",
    aws_s3={
        "s3_region": "us-east-1",
        "s3_api_key": "my-access-key",
        "s3_secret": "my-secret",
    },
)

# 2. update the call type to use the new storage
client.update_call_type("my-call-type", external_storage="my-s3")

# 3. alternatively, specify the storage when starting call recording
call.start_recording(recording_external_storage="my-s3")
```

</TabItem>
<TabItem value="curl" label="cURL">

```bash
# Sketch; the endpoint path is an assumption, check the API reference.
# create a new storage with all the required parameters
curl -X POST "https://video.stream-io-api.com/video/external_storage?api_key=${API_KEY}" \
  -H "Authorization: ${TOKEN}" \
  -H "stream-auth-type: jwt" \
  -d '{
    "name": "my-s3",
    "storage_type": "s3",
    "bucket": "my-bucket",
    "custom_folder": "recordings",
    "aws_s3": {
      "s3_region": "us-east-1",
      "s3_api_key": "my-access-key",
      "s3_secret": "my-secret"
    }
  }'
```

</TabItem>
</Tabs>

*Note*: recordings are only uploaded to the storage when the call is completed or when `stop_recording` is called, whichever comes first.

### Multiple storage providers and default storage

You can configure multiple storage providers for your application. When starting a call recording, you can specify which storage provider to use. If you don't specify one, the default storage provider is used.

When recording a call, the storage provider is selected in this order:

1. If specified at the call level, use the storage provider specified for this call (`call.start_recording(storage_provider=...)`)
2. If specified at the call-type level, use the storage provider specified for this call type
3. Otherwise, use Stream S3 storage

Note: all Stream applications have Stream S3 storage enabled by default. You can refer to this configuration with the `stream-s3` name.

<Tabs>
<TabItem value="js" label="JavaScript">

```js
// Sketch; method names as in the example above are assumptions.

// 1. update the call type to use Stream S3 storage
await client.video.updateCallType('my-call-type', {
  external_storage: 'stream-s3',
});

// 2. specify Stream S3 storage when starting call recording
await call.startRecording({
  recording_external_storage: 'stream-s3',
});
```

</TabItem>
<TabItem value="py" label="Python">

```py
# Sketch; method names as in the example above are assumptions.

# 1. update the call type to use Stream S3 storage
client.update_call_type("my-call-type", external_storage="stream-s3")

# 2. specify Stream S3 storage when starting call recording
call.start_recording(recording_external_storage="stream-s3")
```

</TabItem>
<TabItem value="curl" label="cURL">

```bash
# Sketch; the endpoint path is an assumption, check the API reference.
# specify Stream S3 storage when starting call recording
curl -X POST "https://video.stream-io-api.com/video/call/default/${CALL_ID}/start_recording?api_key=${API_KEY}" \
  -H "Authorization: ${TOKEN}" \
  -H "stream-auth-type: jwt" \
  -d '{ "recording_external_storage": "stream-s3" }'
```

</TabItem>
</Tabs>

## Storage configuration

| Name          | Description                                                   | Required |
|---------------|---------------------------------------------------------------|----------|
| name          | Unique name identifying this storage configuration            | yes      |
| storage_type  | The type of storage provider (`s3`, `gcs`, or `abs`)          | yes      |
| bucket        | The name of the bucket (or container) to upload recordings to | yes      |
| custom_folder | Optional path prefix inside the bucket for uploaded files     |          |

## Amazon S3

To use Amazon S3 as your storage provider, you have two authentication options: an IAM role or an API key.

If you do not specify the `s3_api_key` parameter, Stream will use IAM role authentication. In that case, make sure the correct IAM role is configured for your application.

TODO: copy/paste what is needed to do this (see other docs).
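To make the IAM-role option concrete, here is a minimal sketch of a storage definition that omits the key-based credentials. This is a plain payload object, not an SDK call; the field names are the ones used on this page and the values are placeholders.

```javascript
// External-storage definition using IAM role authentication: s3_api_key and
// s3_secret are omitted, so Stream authenticates through the IAM role
// configured for the application. All values below are placeholders.
const s3Storage = {
  name: 'my-s3-iam',
  storage_type: 's3',
  bucket: 'my-bucket',
  aws_s3: {
    s3_region: 'us-east-1', // the region is still required
  },
};

console.log(JSON.stringify(s3Storage.aws_s3)); // no credentials present
```

Passing this payload (via an SDK or the REST API) would create a storage entry that relies entirely on the IAM role for access to the bucket.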

| Name       | Description                                                   | Required |
|------------|---------------------------------------------------------------|----------|
| s3_region  | The AWS region where the bucket is located                    | yes      |
| s3_api_key | The AWS access key ID; omit it to use IAM role authentication |          |
| s3_secret  | The AWS secret access key; required when `s3_api_key` is set  |          |

### Example S3 policy

```json
{
  "Version": "2012-10-17",
  "Id": "StreamExternalStoragePolicy",
  "Statement": [
    {
      "Sid": "ExampleStatement01",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::185583345998:root"
      },
      "Action": [
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::bucket_name/*",
        "arn:aws:s3:::bucket_name"
      ]
    }
  ]
}
```

## Google Cloud Storage

To use Google Cloud Storage as your storage provider, you need to send your service account credentials as they are stored in your JSON key file.

Note: you can create and download the JSON service account credentials from the Google Cloud Console.

<Tabs>
<TabItem value="js" label="JavaScript">

```js
// Sketch; the method and parameter names are assumptions, check the API
// reference for the exact signatures.
await client.createExternalStorage({
  name: 'my-gcs',
  storage_type: 'gcs',
  bucket: 'my-bucket',
  // the contents of the service account JSON key file, as a string
  gcs_credentials: JSON.stringify(serviceAccountJson),
});
```

</TabItem>
<TabItem value="py" label="Python">

```py
# Sketch; the method and parameter names are assumptions, check the API
# reference for the exact signatures.
with open("service_account.json") as f:
    credentials = f.read()

client.create_external_storage(
    name="my-gcs",
    storage_type="gcs",
    bucket="my-bucket",
    # the contents of the service account JSON key file, as a string
    gcs_credentials=credentials,
)
```

</TabItem>
<TabItem value="curl" label="cURL">

```bash
# Sketch; the endpoint path and parameter name are assumptions, check the
# API reference. jq -Rs encodes the key file as a JSON string.
curl -X POST "https://video.stream-io-api.com/video/external_storage?api_key=${API_KEY}" \
  -H "Authorization: ${TOKEN}" \
  -H "stream-auth-type: jwt" \
  -d "{
    \"name\": \"my-gcs\",
    \"storage_type\": \"gcs\",
    \"bucket\": \"my-bucket\",
    \"gcs_credentials\": $(jq -Rs . < service_account.json)
  }"
```

</TabItem>
</Tabs>

### Example policy

```json
{
  "bindings": [
    {
      "role": "roles/storage.objectCreator",
      "members": ["service_account_principal_identifier"]
    }
  ]
}
```

## Azure Blob Storage

To use Azure Blob Storage as your storage provider, you need to create a container and provide storage account credentials with the following parameters:

| Name             | Description                            | Required |
|------------------|----------------------------------------|----------|
| abs_account_name | The name of the Blob Storage account   | yes      |
| abs_account_key  | The access key for the storage account |          |
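
As a sketch, an Azure storage definition built from these parameters might look like the following. This is a plain payload object, not an SDK call; the `abs` type identifier and all values are placeholders and assumptions.

```javascript
// Hypothetical external-storage payload for Azure Blob Storage; field names
// follow the table above, values are placeholders.
const azureStorage = {
  name: 'my-abs',
  storage_type: 'abs', // assumed identifier for Azure Blob Storage
  bucket: 'my-container', // the bucket field holds the container name
  abs_account_name: 'mystorageaccount',
  abs_account_key: '<base64-account-key>',
};

console.log(JSON.stringify(azureStorage, null, 2));
```

As with the other providers, this payload would be sent when creating the storage, and the configuration can then be referenced by its `name` from a call type or when starting a recording.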