Home
The Carbon Portal is a set of services to collect, store, and distribute ICOS data (and more). This infrastructure is also used for the SITES data portal. It is a collection of microservices written in Scala and JavaScript.
The data service stores the uploaded data as files on disk. Depending on the data type, these files can be processed during upload to generate a binary representation that allows a fast preview of the data (see the sketch after the list below).
- Portal: The data portal is the main way to access our data. It is a React/Redux JavaScript app that queries our metadata store using SPARQL.
- Dygraph: A JavaScript wrapper to preview our data using the dygraph charting library.
- Stats: A React/Redux app to view download and preview statistics about our data.
- NetCDF: An app to preview NetCDF files.
- Map-Graph: An app to preview data with ship tracks using both a map and dygraph.
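The exact binary format produced by the data service is not described here, but the idea is to flatten the parsed columns into fixed-width numbers so that a preview can read any slice without re-parsing the original file. Below is a minimal Scala sketch of that idea, assuming a simple two-column (time, value) CSV and illustrative file names:

```scala
import java.nio.{ByteBuffer, ByteOrder}
import java.nio.file.{Files, Path, Paths}
import scala.jdk.CollectionConverters._

object PreviewBinarySketch {

  // Parse a two-column CSV (time,value), skipping the header row.
  // The real service supports many more data types and formats.
  def parseCsv(csv: Path): Array[Double] =
    Files.readAllLines(csv).asScala.drop(1).toArray.flatMap { line =>
      val cols = line.split(',')
      Array(cols(0).toDouble, cols(1).toDouble)
    }

  // Write the values as little-endian doubles: a fixed-width layout that
  // lets a preview seek to any row without re-parsing the CSV.
  def writeBinary(values: Array[Double], out: Path): Unit = {
    val buf = ByteBuffer.allocate(values.length * 8).order(ByteOrder.LITTLE_ENDIAN)
    values.foreach(v => buf.putDouble(v))
    Files.write(out, buf.array())
  }

  // Read back rows [fromRow, untilRow) as (time, value) pairs for a fast preview.
  def readSlice(bin: Path, fromRow: Int, untilRow: Int): Seq[(Double, Double)] = {
    val buf = ByteBuffer.wrap(Files.readAllBytes(bin)).order(ByteOrder.LITTLE_ENDIAN)
    (fromRow until untilRow).map(i => (buf.getDouble(i * 16), buf.getDouble(i * 16 + 8)))
  }

  def main(args: Array[String]): Unit = {
    val bin = Paths.get("timeseries.cpb")
    writeBinary(parseCsv(Paths.get("timeseries.csv")), bin)
    readSlice(bin, 0, 5).foreach(println)
  }
}
```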
The metadata service stores metadata associated with uploaded data and exposes it through a SPARQL API.
- Upload: A frontend to upload data. Once you log in, the metadata is submitted using your API token, the data is validated, and then uploaded to our storage.
- Metadata editing: An internal Scala.js app to edit reusable metadata.
- SPARQL Client: An interface to write SPARQL queries against our metadata store.
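Any HTTP client can also query the SPARQL API directly using the standard SPARQL 1.1 protocol (a form-encoded POST). A minimal Scala sketch, assuming the public ICOS endpoint URL (SITES and test deployments have their own):

```scala
import java.net.{URI, URLEncoder}
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.nio.charset.StandardCharsets

object SparqlQuerySketch {
  def main(args: Array[String]): Unit = {
    // Assumed public SPARQL endpoint of the metadata service.
    val endpoint = "https://meta.icos-cp.eu/sparql"

    // A deliberately generic query; real portal queries target the Carbon Portal ontology.
    val query = "SELECT * WHERE { ?s ?p ?o } LIMIT 5"

    val body = "query=" + URLEncoder.encode(query, StandardCharsets.UTF_8)
    val request = HttpRequest.newBuilder(URI.create(endpoint))
      .header("Content-Type", "application/x-www-form-urlencoded")
      .header("Accept", "application/sparql-results+json")
      .POST(HttpRequest.BodyPublishers.ofString(body))
      .build()
    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // SPARQL result bindings as JSON
  }
}
```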
The authentication service lets you create an account and permanently accept the data license, so you don't have to accept it every time you want to download data. After being granted submitter rights, you can use the upload form or the API token found in your profile to upload data to the portal. More information about the upload process can be found on GitHub.
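As a rough illustration of token-based uploading, here is a Scala sketch of the two-step flow described above: metadata first, then the data object. The endpoint paths, the cpauthToken cookie name, and the metadata JSON fields are assumptions made for illustration; the GitHub documentation linked above is the authoritative description.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.nio.file.{Files, Paths}
import java.security.MessageDigest

object UploadSketch {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()
    val token  = sys.env("CP_API_TOKEN") // copied from your profile page
    val file   = Paths.get("measurements.csv")

    // SHA-256 of the file, used both in the metadata and in the upload URL
    // (assumption: the protocol identifies data objects by their hash sum).
    val hashSum = MessageDigest.getInstance("SHA-256")
      .digest(Files.readAllBytes(file))
      .map("%02x".format(_)).mkString

    // Step 1: submit the metadata, authenticated with the API token cookie.
    // The JSON fields and the /upload path are placeholders.
    val metaJson = s"""{"fileName": "${file.getFileName}", "hashSum": "$hashSum"}"""
    val metaReq = HttpRequest.newBuilder(URI.create("https://meta.icos-cp.eu/upload"))
      .header("Content-Type", "application/json")
      .header("Cookie", s"cpauthToken=$token")
      .POST(HttpRequest.BodyPublishers.ofString(metaJson))
      .build()
    println(client.send(metaReq, HttpResponse.BodyHandlers.ofString()).body())

    // Step 2: upload the file itself to the data service.
    val dataReq = HttpRequest.newBuilder(URI.create(s"https://data.icos-cp.eu/objects/$hashSum"))
      .header("Cookie", s"cpauthToken=$token")
      .PUT(HttpRequest.BodyPublishers.ofFile(file))
      .build()
    println(client.send(dataReq, HttpResponse.BodyHandlers.ofString()).statusCode())
  }
}
```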
A MongoDB database with a REST API is used to store usage statistics.
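The routes of that REST API are not described here, so the following Scala sketch is purely hypothetical; the host, path, and query parameters are made up to show what reading usage statistics over HTTP could look like:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object StatsSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical endpoint and parameters, for illustration only.
    val uri = URI.create("https://data.icos-cp.eu/stats/downloads?page=1&pagesize=10")
    val req = HttpRequest.newBuilder(uri).GET().build()
    val res = HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString())
    println(res.body()) // JSON documents served from MongoDB
  }
}
```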
RDF Log is a Postgres database that contains a history of every RDF statement applied to the metadata. It is the base on top of which the SPARQL engine is instantiated.
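Conceptually, the log is an append-only sequence of assertions and retractions that can be replayed to rebuild the triple store the SPARQL engine serves. A minimal Scala sketch of that idea; the field names and the in-memory store are illustrative, not the actual schema:

```scala
// One RDF statement plus whether it was asserted or retracted at that point in the log.
final case class Triple(subject: String, predicate: String, obj: String)
final case class LogEntry(isAssertion: Boolean, triple: Triple)

object RdfLogReplay {

  // Replay the log in order: assertions add a triple, retractions remove it.
  // The result is the current state of the metadata store.
  def replay(log: Seq[LogEntry]): Set[Triple] =
    log.foldLeft(Set.empty[Triple]) { (store, entry) =>
      if (entry.isAssertion) store + entry.triple else store - entry.triple
    }

  def main(args: Array[String]): Unit = {
    val dobj = "https://meta.icos-cp.eu/objects/example"
    val log = Seq(
      LogEntry(true,  Triple(dobj, "rdfs:label", "\"old name\"")),
      LogEntry(false, Triple(dobj, "rdfs:label", "\"old name\"")),
      LogEntry(true,  Triple(dobj, "rdfs:label", "\"new name\""))
    )
    println(replay(log)) // only the latest label survives the replay
  }
}
```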