load_result -> load_stac
m-mohr committed Mar 8, 2023
1 parent baf08c6 commit 0d1165c
Showing 2 changed files with 101 additions and 7 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -53,7 +53,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
- `between`: Support for temporal comparison.
- Deprecated `GeometryCollections` are not supported any longer. [#389](https://github.com/Open-EO/openeo-processes/issues/389)
- Deprecated PROJ definitions for the CRS are not supported any longer.
- `load_result`: Subtype `job-id` removed in favor of providing a URL. [#384](https://github.com/Open-EO/openeo-processes/issues/384)
- `load_result`: Renamed to `load_stac` and the subtype `job-id` was removed in favor of providing a URL. [#322](https://github.com/Open-EO/openeo-processes/issues/322), [#377](https://github.com/Open-EO/openeo-processes/issues/377), [#384](https://github.com/Open-EO/openeo-processes/issues/384)

### Fixed

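A minimal sketch of what this rename means for a process graph (the node name `load1` is hypothetical; the URL is the job-result example used in the updated process file below): a step that previously called `load_result` with a back-end job identifier now calls `load_stac` with the URL of the job result's STAC metadata.

```json
{
  "load1": {
    "process_id": "load_stac",
    "arguments": {
      "id": "https://example.com/api/v1.0/jobs/123/results"
    },
    "result": true
  }
}
```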
106 changes: 100 additions & 6 deletions proposals/load_result.json → proposals/load_stac.json
@@ -1,7 +1,7 @@
{
"id": "load_result",
"summary": "Load batch job results",
"description": "Loads batch job results and returns them as a processable data cube. A batch job result can be loaded by ID or URL:\n\n* **ID**: The identifier for a finished batch job. The job must have been submitted by the authenticated user on the back-end currently connected to.\n* **URL**: The URL to the STAC metadata for a batch job result. This is usually a signed URL that is provided by some back-ends since openEO API version 1.1.0 through the `canonical` link relation in the batch job result metadata.\n\nIf supported by the underlying metadata and file format, the data that is added to the data cube can be restricted with the parameters `spatial_extent`, `temporal_extent` and `bands`. If no data is available for the given extents, a `NoDataAvailable` exception is thrown.\n\n**Remarks:**\n\n* The bands (and all dimensions that specify nominal dimension labels) are expected to be ordered as specified in the metadata if the `bands` parameter is set to `null`.\n* If no additional parameter is specified this would imply that the whole data set is expected to be loaded. Due to the large size of many data sets, this is not recommended and may be optimized by back-ends to only load the data that is actually required after evaluating subsequent processes such as filters. This means that the values should be processed only after the data has been limited to the required extent and as a consequence also to a manageable size.",
"id": "load_stac",
"summary": "Loads data from STAC",
"description": "Loads data from a static STAC or a STAC API Collection, e.g. a batch job result, and returns the data as a processable data cube.\n\nA batch job result can be loaded by providing the URL to the STAC metadata for a batch job result. This is usually a signed URL that is provided by some back-ends since openEO API version 1.1.0 through the `canonical` link relation in the batch job result metadata.\n\nIf supported by the underlying metadata and file format, the data that is added to the data cube can be restricted with the parameters `spatial_extent`, `temporal_extent` and `bands`. If no data is available for the given extents, a `NoDataAvailable` exception is thrown.\n\n**Remarks:**\n\n* The bands (and all dimensions that specify nominal dimension labels) are expected to be ordered as specified in the metadata if the `bands` parameter is set to `null`.\n* If no additional parameter is specified this would imply that the whole data set is expected to be loaded. Due to the large size of many data sets, this is not recommended and may be optimized by back-ends to only load the data that is actually required after evaluating subsequent processes such as filters. This means that the values should be processed only after the data has been limited to the required extent and as a consequence also to a manageable size.",
"categories": [
"cubes",
"import"
@@ -10,7 +10,7 @@
"parameters": [
{
"name": "id",
"description": "The URL of a batch job with results.",
"description": "The URL of a static STAC or a STAC API Collection.",
"schema": {
"title": "URL",
"type": "string",
@@ -21,7 +21,7 @@
},
{
"name": "spatial_extent",
"description": "Limits the data to load from the batch job result to the specified bounding box or polygons.\n\n* For raster data, the process loads the pixel into the data cube if the point at the pixel center intersects with the bounding box or any of the polygons (as defined in the Simple Features standard by the OGC).\n* For vector data, the process loads the geometry into the data cube of the geometry is fully within the bounding box or any of the polygons (as defined in the Simple Features standard by the OGC).\n\nThe GeoJSON can be one of the following feature types:\n\n* A `Polygon` or `MultiPolygon` geometry,\n* a `Feature` with a `Polygon` or `MultiPolygon` geometry, or\n* a `FeatureCollection` containing at least one `Feature` with `Polygon` or `MultiPolygon` geometries.\n\nSet this parameter to `null` to set no limit for the spatial extent. Be careful with this when loading large datasets! It is recommended to use this parameter instead of using ``filter_bbox()`` or ``filter_spatial()`` directly after loading unbounded data.",
"description": "Limits the data to load to the specified bounding box or polygons.\n\n* For raster data, the process loads the pixel into the data cube if the point at the pixel center intersects with the bounding box or any of the polygons (as defined in the Simple Features standard by the OGC).\n* For vector data, the process loads the geometry into the data cube of the geometry is fully within the bounding box or any of the polygons (as defined in the Simple Features standard by the OGC).\n\nThe GeoJSON can be one of the following feature types:\n\n* A `Polygon` or `MultiPolygon` geometry,\n* a `Feature` with a `Polygon` or `MultiPolygon` geometry, or\n* a `FeatureCollection` containing at least one `Feature` with `Polygon` or `MultiPolygon` geometries.\n\nSet this parameter to `null` to set no limit for the spatial extent. Be careful with this when loading large datasets! It is recommended to use this parameter instead of using ``filter_bbox()`` or ``filter_spatial()`` directly after loading unbounded data.",
"schema": [
{
"title": "Bounding Box",
@@ -116,7 +116,7 @@
},
{
"name": "temporal_extent",
"description": "Limits the data to load from the batch job result to the specified left-closed temporal interval. Applies to all temporal dimensions. The interval has to be specified as an array with exactly two elements:\n\n1. The first element is the start of the temporal interval. The specified instance in time is **included** in the interval.\n2. The second element is the end of the temporal interval. The specified instance in time is **excluded** from the interval.\n\nThe specified temporal strings follow [RFC 3339](https://www.rfc-editor.org/rfc/rfc3339.html). Also supports open intervals by setting one of the boundaries to `null`, but never both.\n\nSet this parameter to `null` to set no limit for the temporal extent. Be careful with this when loading large datasets! It is recommended to use this parameter instead of using ``filter_temporal()`` directly after loading unbounded data.",
"description": "Limits the data to load to the specified left-closed temporal interval. Applies to all temporal dimensions. The interval has to be specified as an array with exactly two elements:\n\n1. The first element is the start of the temporal interval. The specified instance in time is **included** in the interval.\n2. The second element is the end of the temporal interval. The specified instance in time is **excluded** from the interval.\n\nThe specified temporal strings follow [RFC 3339](https://www.rfc-editor.org/rfc/rfc3339.html). Also supports open intervals by setting one of the boundaries to `null`, but never both.\n\nSet this parameter to `null` to set no limit for the temporal extent. Be careful with this when loading large datasets! It is recommended to use this parameter instead of using ``filter_temporal()`` directly after loading unbounded data.",
"schema": [
{
"type": "array",
@@ -187,6 +187,44 @@
],
"default": null,
"optional": true
},
{
"name": "properties",
"description": "Limits the data by metadata properties to include only data in the data cube which all given conditions return `true` for (AND operation).\n\nSpecify key-value-pairs with the key being the name of the metadata property, which can be retrieved with the openEO Data Discovery for Collections. The value must be a condition (user-defined process) to be evaluated against a STAC API. This parameter is not supported for static STAC.",
"schema": [
{
"type": "object",
"subtype": "metadata-filter",
"title": "Filters",
"description": "A list of filters to check against. Specify key-value-pairs with the key being the name of the metadata property name and the value being a process evaluated against the metadata values.",
"additionalProperties": {
"type": "object",
"subtype": "process-graph",
"parameters": [
{
"name": "value",
"description": "The property value to be checked against.",
"schema": {
"description": "Any data type."
}
}
],
"returns": {
"description": "`true` if the data should be loaded into the data cube, otherwise `false`.",
"schema": {
"type": "boolean"
}
}
}
},
{
"title": "No filter",
"description": "Don't filter by metadata properties.",
"type": "null"
}
],
"default": null,
"optional": true
}
],
"returns": {
@@ -196,6 +234,62 @@
"subtype": "datacube"
}
},
"examples": [
{
"title": "Load from a static STAC / batch job result",
"arguments": {
"id": "https://example.com/api/v1.0/jobs/123/results"
}
},
{
"title": "Load from a STAC API",
"arguments": {
"id": "https://example.com/collections/SENTINEL2",
"spatial_extent": {
"west": 16.1,
"east": 16.6,
"north": 48.6,
"south": 47.2
},
"temporal_extent": [
"2018-01-01",
"2019-01-01"
],
"properties": {
"eo:cloud_cover": {
"process_graph": {
"cc": {
"process_id": "between",
"arguments": {
"x": {
"from_parameter": "value"
},
"min": 0,
"max": 50
},
"result": true
}
}
},
"platform": {
"process_graph": {
"pf": {
"process_id": "eq",
"arguments": {
"x": {
"from_parameter": "value"
},
"y": "Sentinel-2B",
"case_sensitive": false
},
"result": true
}
}
}
}
}
}
],
"exceptions": {
"NoDataAvailable": {
"message": "There is no data available for the given extents."
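A hedged sketch of how the STAC API example above might sit in a complete process graph (the node names, the `save_result` step, and the `GTiff` format are illustrative assumptions; supported output formats depend on the back-end):

```json
{
  "load1": {
    "process_id": "load_stac",
    "arguments": {
      "id": "https://example.com/collections/SENTINEL2",
      "spatial_extent": {"west": 16.1, "east": 16.6, "north": 48.6, "south": 47.2},
      "temporal_extent": ["2018-01-01", "2019-01-01"],
      "properties": {
        "eo:cloud_cover": {
          "process_graph": {
            "cc": {
              "process_id": "between",
              "arguments": {"x": {"from_parameter": "value"}, "min": 0, "max": 50},
              "result": true
            }
          }
        }
      }
    }
  },
  "save1": {
    "process_id": "save_result",
    "arguments": {
      "data": {"from_node": "load1"},
      "format": "GTiff"
    },
    "result": true
  }
}
```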
