# Workspace Location
By default, CloudBeaver stores all its files (configurations, scripts, etc.) in the `/opt/cloudbeaver/` directory on the host machine.
| Folder | Description |
|---|---|
| `workspace` | Workspace files for CloudBeaver. |
| `drivers` | Auto-downloaded database drivers. |
| `conf` | Configuration files for CloudBeaver. Learn more |
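If you want these folders to survive container restarts, a common approach is to mount the workspace directory as a Docker volume. A minimal `docker-compose.yml` sketch (the host path `./cloudbeaver/workspace` is illustrative, not a CloudBeaver default):

```yaml
services:
  cloudbeaver:
    image: dbeaver/cloudbeaver:latest
    ports:
      - "8978:8978"
    volumes:
      # Persist the workspace on the host so it outlives the container
      - ./cloudbeaver/workspace:/opt/cloudbeaver/workspace
```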
To inspect the workspace from inside the container:

1. Locate the name of the running container:
   - Open a terminal on the host machine.
   - Run the following command to list all running containers in the Compose project:

     ```
     docker-compose ps
     ```

2. Identify the service name and open a shell inside the container:

   ```
   docker-compose exec <service_name> /bin/bash
   ```

   Replace `<service_name>` with the actual name of the service from your `docker-compose.yml` file.

3. After entering the container, navigate to the workspace directory:

   ```
   cd workspace/
   ```
CloudBeaver supports storing its workspace in an AWS S3 bucket. To enable this, update your `docker-compose.yml` file and configure the correct environment variables.

For more details on AWS S3 configuration, including setting up buckets, permissions, and best practices, see the official Amazon S3 documentation.
Make sure your CloudBeaver service includes the following environment variables:

```yaml
services:
  cloudbeaver:
    environment:
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - AWS_REGION=${AWS_REGION}
      - CLOUDBEAVER_WORKSPACE_LOCATION=${CLOUDBEAVER_WORKSPACE_LOCATION}
```
Define these variables in your `.env` file:

```
AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_REGION=your-region
CLOUDBEAVER_WORKSPACE_LOCATION=s3:///dbeaver-downloads/test_workspace
```
**Important:**

- The `CLOUDBEAVER_WORKSPACE_LOCATION` path must use triple slashes (`s3:///`) before the bucket name. This is required for proper S3 path handling.
- Replace `dbeaver-downloads` with your actual S3 bucket name. `test_workspace` is the subfolder where CloudBeaver will store workspace data.
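As an illustration of why the triple slash matters: in a generic URI, the characters between `//` and the next `/` are the host, so an empty host (`s3:///`) makes the bucket the first *path* segment. A minimal shell sketch of that split (illustrative only, not CloudBeaver's actual parsing code):

```shell
# Split an s3:/// location into bucket and prefix.
location='s3:///dbeaver-downloads/test_workspace'

path="${location#s3:///}"   # strip the scheme and empty host -> dbeaver-downloads/test_workspace
bucket="${path%%/*}"        # first path segment -> dbeaver-downloads
prefix="${path#*/}"         # remainder -> test_workspace

echo "bucket=$bucket prefix=$prefix"
```

With only two slashes (`s3://dbeaver-downloads/...`), the bucket name would be read as a host component instead, which is why the triple-slash form is required here.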
- **No embedded databases**
  - CloudBeaver cannot use embedded databases (such as H2) with an external S3-based workspace.
  - Storing an embedded database in S3 would cause severe performance issues.
- **Separate database node required**
  - To use an S3 workspace, you must configure an external database such as PostgreSQL, MySQL, or another supported DB.
  - Make sure the database is properly defined in `docker-compose.yml` and the `CLOUDBEAVER_DB_*` environment variables are set.
For more information on CloudBeaver's database, see Server Database.
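A minimal `docker-compose.yml` sketch of such a setup with PostgreSQL (illustrative; the exact `CLOUDBEAVER_DB_*` variable names and values should be verified against the Server Database documentation for your CloudBeaver version):

```yaml
services:
  cloudbeaver:
    image: dbeaver/cloudbeaver:latest
    environment:
      # Variable names below are illustrative examples of CLOUDBEAVER_DB_* settings
      - CLOUDBEAVER_DB_DRIVER=postgres-jdbc
      - CLOUDBEAVER_DB_URL=jdbc:postgresql://postgres:5432/cloudbeaver
      - CLOUDBEAVER_DB_USER=${DB_USER}
      - CLOUDBEAVER_DB_PASSWORD=${DB_PASSWORD}
    depends_on:
      - postgres

  postgres:
    image: postgres:16
    environment:
      - POSTGRES_DB=cloudbeaver
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASSWORD}
```

Keeping the server database in a separate service like this avoids the embedded-database limitation described above.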