Merge pull request #163 from firebase/next
Jan 16, 2020 release
laurenzlong authored Jan 16, 2020
2 parents 5d6a007 + c7a1bc3 commit f276587
Showing 26 changed files with 636 additions and 105 deletions.
2 changes: 1 addition & 1 deletion auth-mailchimp-sync/README.md
@@ -35,7 +35,7 @@ Usage of this extension also requires you to have a Mailchimp account. You are r

**Configuration Parameters:**

-* Deployment location: Where should the extension be deployed? For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations).
+* Cloud Functions location: Where do you want to deploy the functions created for this extension?

* Mailchimp API key: What is your Mailchimp API key? To obtain a Mailchimp API key, go to your [Mailchimp account](https://admin.mailchimp.com/account/api/).

6 changes: 2 additions & 4 deletions auth-mailchimp-sync/extension.yaml
@@ -67,11 +67,9 @@ resources:
params:
- param: LOCATION
type: select
-label: Deployment location
+label: Cloud Functions location
description: >-
-Where should the extension be deployed? For help selecting a location,
-refer to the [location selection
-guide](https://firebase.google.com/docs/functions/locations).
+Where do you want to deploy the functions created for this extension?
options:
- label: Iowa (us-central1)
value: us-central1
4 changes: 4 additions & 0 deletions delete-user-data/CHANGELOG.md
@@ -1,3 +1,7 @@
+## Version 0.1.3
+
+feature - Support deletion of directories (issue #148).
+
## Version 0.1.2

feature - Add a new param for recursively deleting subcollections in Cloud Firestore (issue #14).
67 changes: 37 additions & 30 deletions delete-user-data/extension.yaml
@@ -15,11 +15,11 @@
name: delete-user-data
displayName: Delete User Data
specVersion: v1beta
-version: 0.1.2
+version: 0.1.3

description:
-Deletes data keyed on a userId from Cloud Firestore, Realtime
-Database, and/or Cloud Storage when a user deletes their account.
+Deletes data keyed on a userId from Cloud Firestore, Realtime Database, and/or
+Cloud Storage when a user deletes their account.

license: Apache-2.0
billingRequired: false
@@ -52,9 +52,10 @@ resources:
- name: clearData
type: firebaseextensions.v1beta.function
description:
-Listens for user accounts to be deleted from your project's authenticated users,
-then removes any associated user data (based on Firebase Authentication's User ID) from
-Realtime Database, Cloud Firestore, and/or Cloud Storage.
+Listens for user accounts to be deleted from your project's authenticated
+users, then removes any associated user data (based on Firebase
+Authentication's User ID) from Realtime Database, Cloud Firestore, and/or
+Cloud Storage.
properties:
sourceDirectory: .
location: ${LOCATION}
@@ -65,11 +66,12 @@
params:
- param: LOCATION
type: select
-label: Deployment location
+label: Cloud Functions location
description: >-
-Where should the extension be deployed? You usually want a location close to your database.
-For help selecting a location, refer to the
-[location selection guide](https://firebase.google.com/docs/functions/locations).
+Where do you want to deploy the functions created for this extension?
+You usually want a location close to your database or Storage bucket.
+For help selecting a location, refer to the [location selection
+guide](https://firebase.google.com/docs/functions/locations).
options:
- label: Iowa (us-central1)
value: us-central1
@@ -95,21 +97,23 @@ params:
example: users/{UID},admins/{UID}
required: false
description: >-
-Which paths in your Cloud Firestore instance contain user data? Leave empty if
-you don't use Cloud Firestore.
+Which paths in your Cloud Firestore instance contain user data? Leave
+empty if you don't use Cloud Firestore.
-Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
+Enter the full paths, separated by commas. You can represent the User ID
+of the deleted user with `{UID}`.
-For example, if you have the collections `users` and `admins`, and each collection
-has documents with User ID as document IDs, then you can enter `users/{UID},admins/{UID}`.
+For example, if you have the collections `users` and `admins`, and each
+collection has documents with User ID as document IDs, then you can enter
+`users/{UID},admins/{UID}`.
- param: FIRESTORE_DELETE_MODE
type: select
label: Cloud Firestore delete mode
description: >-
-(Only applicable if you use the `Cloud Firestore paths` parameter.) How do you want
-to delete Cloud Firestore documents? To also delete documents in subcollections,
-set this parameter to `recursive`.
+(Only applicable if you use the `Cloud Firestore paths` parameter.) How do
+you want to delete Cloud Firestore documents? To also delete documents in
+subcollections, set this parameter to `recursive`.
options:
- label: Recursive
value: recursive
@@ -124,10 +128,11 @@ params:
example: users/{UID},admins/{UID}
required: false
description: >-
-Which paths in your Realtime Database instance contain user data? Leave empty if you
-don't use Realtime Database.
+Which paths in your Realtime Database instance contain user data? Leave
+empty if you don't use Realtime Database.
-Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
+Enter the full paths, separated by commas. You can represent the User ID
+of the deleted user with `{UID}`.
For example: `users/{UID},admins/{UID}`.
@@ -140,12 +145,14 @@
Where in Google Cloud Storage do you store user data? Leave empty if you
don't use Cloud Storage.
-Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
-You can use `{DEFAULT}` to represent your default bucket.
-For example, if you are using your default bucket,
-and the bucket has files with the naming scheme `{UID}-pic.png`,
-then you can enter `{DEFAULT}/{UID}-pic.png`.
-If you also have files in another bucket called `my-awesome-app-logs`,
-and that bucket has files with the naming scheme `{UID}-logs.txt`,
-then you can enter `{DEFAULT}/{UID}-pic.png,my-awesome-app-logs/{UID}-logs.txt`.
+Enter the full paths to files or directories in your Storage buckets,
+separated by commas. Use `{UID}` to represent the User ID of the deleted
+user, and use `{DEFAULT}` to represent your default Storage bucket.
+Here's a series of examples. To delete all the files in your default
+bucket with the file naming scheme `{UID}-pic.png`, enter
+`{DEFAULT}/{UID}-pic.png`. To also delete all the files in another bucket
+called my-app-logs with the file naming scheme `{UID}-logs.txt`, enter
+`{DEFAULT}/{UID}-pic.png,my-app-logs/{UID}-logs.txt`. To *also* delete a User
+ID-labeled directory and all its files (like `media/{UID}`), enter
+`{DEFAULT}/{UID}-pic.png,my-app-logs/{UID}-logs.txt,{DEFAULT}/media/{UID}`.
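The directory-style deletion promised by the new parameter text is implemented in the function diffs below: each configured path is resolved into a bucket plus an object-name prefix. A minimal sketch of that resolution, assuming an initialized `firebase-admin` app (the `deleteStoragePath` helper is hypothetical, not part of the extension):

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

// Hypothetical helper mirroring the updated extension logic: substitute the
// deleted user's ID into a configured path such as "{DEFAULT}/media/{UID}",
// peel off the bucket segment, and delete everything under the remaining prefix.
const deleteStoragePath = async (path: string, uid: string): Promise<void> => {
  const parts = path.replace(/{UID}/g, uid).split("/");
  const bucket =
    parts[0] === "{DEFAULT}"
      ? admin.storage().bucket() // the project's default bucket
      : admin.storage().bucket(parts[0]);
  const prefix = parts.slice(1).join("/");
  // deleteFiles removes every object whose name starts with the prefix, so a
  // single file ("{UID}-pic.png") and a directory ("media/{UID}") both work.
  await bucket.deleteFiles({ prefix });
};
```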
15 changes: 8 additions & 7 deletions delete-user-data/functions/lib/index.js
@@ -91,19 +91,20 @@ const clearStorageData = (storagePaths, uid) => __awaiter(void 0, void 0, void 0
const bucket = bucketName === "{DEFAULT}"
? admin.storage().bucket()
: admin.storage().bucket(bucketName);
-const file = bucket.file(parts.slice(1).join("/"));
-const bucketFilePath = `${bucket.name}/${file.name}`;
+const prefix = parts.slice(1).join("/");
try {
-logs.storagePathDeleting(bucketFilePath);
-yield file.delete();
-logs.storagePathDeleted(bucketFilePath);
+logs.storagePathDeleting(prefix);
+yield bucket.deleteFiles({
+    prefix,
+});
+logs.storagePathDeleted(prefix);
}
catch (err) {
if (err.code === 404) {
-logs.storagePath404(bucketFilePath);
+logs.storagePath404(prefix);
}
else {
-logs.storagePathError(bucketFilePath, err);
+logs.storagePathError(prefix, err);
}
}
}));
15 changes: 8 additions & 7 deletions delete-user-data/functions/src/index.ts
@@ -92,17 +92,18 @@ const clearStorageData = async (storagePaths: string, uid: string) => {
bucketName === "{DEFAULT}"
? admin.storage().bucket()
: admin.storage().bucket(bucketName);
-const file = bucket.file(parts.slice(1).join("/"));
-const bucketFilePath = `${bucket.name}/${file.name}`;
+const prefix = parts.slice(1).join("/");
try {
-logs.storagePathDeleting(bucketFilePath);
-await file.delete();
-logs.storagePathDeleted(bucketFilePath);
+logs.storagePathDeleting(prefix);
+await bucket.deleteFiles({
+    prefix,
+});
+logs.storagePathDeleted(prefix);
} catch (err) {
if (err.code === 404) {
-logs.storagePath404(bucketFilePath);
+logs.storagePath404(prefix);
} else {
-logs.storagePathError(bucketFilePath, err);
+logs.storagePathError(prefix, err);
}
}
});
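The JavaScript and TypeScript diffs above make the same change: `file.delete()` removes exactly one object (and returns a 404 for a directory-style path), while `bucket.deleteFiles({ prefix })` lists and deletes every object matching the prefix. A before/after sketch, with made-up object names:

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

const demo = async () => {
  const bucket = admin.storage().bucket(); // default bucket; names below are illustrative

  // Before: deletes exactly one object; a directory-style path like
  // "media/uid123" would 404 because no object has that exact name.
  await bucket.file("media/uid123/avatar.png").delete();

  // After: deletes every object whose name starts with the prefix --
  // media/uid123/avatar.png, media/uid123/cover.png, and so on.
  await bucket.deleteFiles({ prefix: "media/uid123" });
};
```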
25 changes: 16 additions & 9 deletions firestore-bigquery-export/POSTINSTALL.md
@@ -13,13 +13,15 @@ You can test out this extension right away:
1. Query your **raw changelog table**, which should contain a single log of creating the `bigquery-mirror-test` document.

```
-SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
+SELECT *
+FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
```

1. Query your **latest view**, which should return the latest change event for the only document present -- `bigquery-mirror-test`.

```
-SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_latest`
+SELECT *
+FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_latest`
```

1. Delete the `bigquery-mirror-test` document from [Cloud Firestore](https://console.firebase.google.com/project/${param:PROJECT_ID}/database/firestore/data).
@@ -28,9 +30,10 @@ The `bigquery-mirror-test` document will disappear from the **latest view** and
1. You can check the changelogs of a single document with this query:

```
-SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
-WHERE document_name = "bigquery-mirror-test"
-ORDER BY TIMESTAMP ASC
+SELECT *
+FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
+WHERE document_name = "bigquery-mirror-test"
+ORDER BY TIMESTAMP ASC
```

### Using the extension
@@ -48,13 +51,17 @@ Note that this extension only listens for _document_ changes in the collection,

This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.

-The import script can read all existing documents in a Cloud Firestore collection and insert them into the raw changelog table created by this extension. The script adds a special changelog for each document with the operation of `IMPORT` and the timestamp of epoch. This is to ensure that any operation on an imported document supersedes the `IMPORT`
+The import script can read all existing documents in a Cloud Firestore collection and insert them into the raw changelog table created by this extension. The script adds a special changelog for each document with the operation of `IMPORT` and the timestamp of epoch. This is to ensure that any operation on an imported document supersedes the `IMPORT`.

-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

You may pause and resume the script from the last batch at any point.
+Learn more about using the import script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).

-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+### _(Optional)_ Generate schema views

+After your data is in BigQuery, you can use the schema-views script (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.

+Learn more about using the schema-views script to [generate schema views](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md).

### Monitoring

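The schema-views script added to the POSTINSTALL text reads a JSON schema file describing which document fields to expose; the linked guide is the authoritative reference for the format. As an illustrative sketch (field names and types here are assumptions):

```json
{
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "age", "type": "number" },
    { "name": "joined", "type": "timestamp" }
  ]
}
```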
10 changes: 7 additions & 3 deletions firestore-bigquery-export/PREINSTALL.md
@@ -16,11 +16,15 @@ Before installing this extension, you'll need to:
+ [Set up Cloud Firestore in your Firebase project.](https://firebase.google.com/docs/firestore/quickstart)
+ [Link your Firebase project to BigQuery.](https://support.google.com/firebase/answer/6318765)

-This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.
+#### Backfill your BigQuery dataset

-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) provided by this extension.

-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

+#### Generate schema views

+After your data is in BigQuery, you can run the [schema-views script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md) (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.

#### Billing

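For the backfill step above, the linked guide walks through running the import script from npm. A rough sketch of the invocation (confirm the package name and prompts against the guide):

```shell
# Runs interactively, prompting for your project ID, the collection path to
# import, and the BigQuery dataset and table prefix used by the extension.
npx @firebaseextensions/fs-bq-import-collection
```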
12 changes: 8 additions & 4 deletions firestore-bigquery-export/README.md
@@ -22,11 +22,15 @@ Before installing this extension, you'll need to:
+ [Set up Cloud Firestore in your Firebase project.](https://firebase.google.com/docs/firestore/quickstart)
+ [Link your Firebase project to BigQuery.](https://support.google.com/firebase/answer/6318765)

-This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.
+#### Backfill your BigQuery dataset

-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) provided by this extension.

-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.

+#### Generate schema views

+After your data is in BigQuery, you can run the [schema-views script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md) (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.

#### Billing

@@ -43,7 +47,7 @@ When you use Firebase Extensions, you're only charged for the underlying resourc

**Configuration Parameters:**

-* Deployment location: Where should the extension be deployed? You usually want a location close to your database. For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations).
+* Cloud Functions location: Where do you want to deploy the functions created for this extension? You usually want a location close to your database. For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations). Note that this extension locates your BigQuery dataset in `us-central1`.

* Collection path: What is the path of the collection that you would like to export? You may use `{wildcard}` notation to match a subcollection of all documents in a collection (for example: `chatrooms/{chatid}/posts`).

10 changes: 6 additions & 4 deletions firestore-bigquery-export/extension.yaml
@@ -58,11 +58,13 @@ resources:
params:
- param: LOCATION
type: select
-label: Deployment location
+label: Cloud Functions location
description: >-
-Where should the extension be deployed? You usually want a location close to your database.
-For help selecting a location, refer to the
-[location selection guide](https://firebase.google.com/docs/functions/locations).
+Where do you want to deploy the functions created for this extension?
+You usually want a location close to your database. For help selecting a
+location, refer to the [location selection
+guide](https://firebase.google.com/docs/functions/locations).
+Note that this extension locates your BigQuery dataset in `us-central1`.
options:
- label: Iowa (us-central1)
value: us-central1
