Dev -> Main sync #3032

Merged: 65 commits into main from dev on Nov 9, 2023

Commits (65)
6cd7712
First pass of integrating the monthly EIA923 data into the rest of th…
aesharpe Oct 12, 2023
681b72d
remove breakpoint
aesharpe Oct 12, 2023
5899fc0
Merge branch 'dev' into add-data-maturity-for-923m
aesharpe Oct 12, 2023
2bff97d
Add function to drop ytd records for annual tables
aesharpe Oct 12, 2023
796b5da
Adjust monthly row expectations for gf and frc tables after dropping …
aesharpe Oct 12, 2023
77e052e
Tweak the way we add data maturity to the eia923 monthly files and re…
aesharpe Oct 16, 2023
44b4d44
Merge branch 'dev' into add-data-maturity-for-923m
aesharpe Oct 17, 2023
63dee8e
Merge with dev
aesharpe Oct 24, 2023
b50ea1d
Litle updates:
aesharpe Oct 25, 2023
572147c
For now, comment out the checks that make sure we have the same years…
aesharpe Oct 25, 2023
aa42b28
Update min max rows
aesharpe Oct 25, 2023
a588420
Add data_maturity field to harvested EIA tables so that we can drop y…
aesharpe Oct 25, 2023
1d267ab
Address PR comments:
aesharpe Oct 27, 2023
8ccc44c
Merge branch 'dev' into add-data-maturity-for-923m
aesharpe Oct 27, 2023
b3c11c2
Fix release note trailing whitespace error
aesharpe Oct 30, 2023
3b35892
Update test_eia923_dependency function to make sure some 860 and 923 …
aesharpe Oct 30, 2023
47304b3
Only generate alphanumeric entity IDs in test - non-printable charact…
jdangerx Oct 31, 2023
dfea5ef
Set up Cloud SQL postgres database for dagster storage
bendnorman Oct 31, 2023
05901ab
Copy dagster.yaml after DAGSTER_HOME is created
bendnorman Oct 31, 2023
02c52f6
Add proper quoting rules to DAGSTER_PG_PASSWORD secret
bendnorman Oct 31, 2023
dcfb8de
Use max cpus for nightly builds and spin dagster-storage SQL instance…
bendnorman Nov 1, 2023
ad8442d
Create and delete Cloud SQL db during nightly builds
bendnorman Nov 1, 2023
ab9425a
Set PUDL_SETTINGS_YML to etl_full.yml and add git sha to Cloud SQL da…
bendnorman Nov 1, 2023
6592fad
Add short github ref to database name
bendnorman Nov 1, 2023
2f85315
Update DAGSTER_PG_DB with short git sha
bendnorman Nov 1, 2023
8c60657
Update date range for nightly build links to include 2022
zaneselvans Nov 1, 2023
24f6b8d
Update 923 settings files to accomodate 2023 data and update settings…
aesharpe Nov 1, 2023
45f2d02
Fix calculating the report_date in demand_hourly_pa_ferc714
rousik Nov 1, 2023
bf77d1a
Require non-null report_date in FERC 714 hourly demand table.
zaneselvans Nov 1, 2023
b376141
Update date validation function to only look at instances where data_…
aesharpe Nov 1, 2023
dbf686f
Merge pull request #2936 from catalyst-cooperative/add-data-maturity-…
aesharpe Nov 1, 2023
e848d64
Fix calculating the report_date in demand_hourly_pa_ferc714
rousik Nov 1, 2023
1c6dd24
Merge branch 'dev' into setup-dagster-postgres
bendnorman Nov 1, 2023
5b33500
Remove Cloud SQL lifecycle management from gcp_pudl_etl.sh script
bendnorman Nov 1, 2023
c15e063
Merge branch 'dev' into setup-dagster-postgres
bendnorman Nov 2, 2023
633b63d
Update data contributors, add zenodo role and doi field, update US co…
e-belfer Nov 2, 2023
4b028d2
Update to ZenodoDoi class, update to https
e-belfer Nov 2, 2023
6e6bf5e
Remove leftover string
e-belfer Nov 2, 2023
e7aedda
Merge pull request #3004 from catalyst-cooperative/datapackage-update
e-belfer Nov 2, 2023
c11b1a4
Switch regex strategy to sampling strategy to improve performance (#2…
jdangerx Nov 2, 2023
a79edeb
add alembic schema changes for the recent constraint.
rousik Nov 2, 2023
0fa0c85
Merge pull request #3012 from catalyst-cooperative/fix-alembic-schema
rousik Nov 2, 2023
18bb60f
Merge pull request #2996 from catalyst-cooperative/setup-dagster-post…
bendnorman Nov 3, 2023
ef20339
only fix a reporting_frequency_code when the column exists
cmgosnell Nov 3, 2023
2848e01
Update responses requirement from <0.24,>=0.14 to >=0.14,<0.25
dependabot[bot] Nov 6, 2023
51ed7e5
Update pyarrow requirement from <14,>=13 to >=13,<15
dependabot[bot] Nov 6, 2023
c2e7889
Update dagster-postgres requirement
dependabot[bot] Nov 6, 2023
1ed333c
Merge pull request #3015 from catalyst-cooperative/dependabot/pip/dev…
zaneselvans Nov 6, 2023
039116a
Merge pull request #3016 from catalyst-cooperative/dependabot/pip/dev…
zaneselvans Nov 6, 2023
71ee2bd
Merge pull request #3017 from catalyst-cooperative/dependabot/pip/dev…
zaneselvans Nov 6, 2023
aea60b9
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Nov 6, 2023
acbf788
update tox and eia923 rows
cmgosnell Nov 6, 2023
51be942
Merge pull request #3019 from catalyst-cooperative/pre-commit-ci-upda…
zaneselvans Nov 6, 2023
5a96335
Merge branch 'dev' into fix-reporting_frequency_code
zaneselvans Nov 6, 2023
f251def
Merge pull request #3013 from catalyst-cooperative/fix-reporting_freq…
zaneselvans Nov 7, 2023
e4c1c46
update excepted rows for no-fips id-ed respondents but keep annualize…
cmgosnell Nov 7, 2023
1e2857f
add report year validation test
cmgosnell Nov 7, 2023
f25e921
add minmax rows into validation test for chonky table
cmgosnell Nov 7, 2023
fcf4ccc
Merge pull request #3023 from catalyst-cooperative/ferc714_mystery_da…
cmgosnell Nov 7, 2023
095d31b
idk exactly why the "nan"s began existing but this fixes it
cmgosnell Nov 7, 2023
14e6bc8
revert the replace of "nan" by stopping introducing them! plus some l…
cmgosnell Nov 7, 2023
b7533cf
REALLY REALLY its a nullable string
cmgosnell Nov 8, 2023
a2bdffa
Merge pull request #3025 from catalyst-cooperative/fix_mystery_fuel_c…
cmgosnell Nov 8, 2023
d8512b5
Deploy Datasette to fly.io instead of Cloud Run (#3018)
jdangerx Nov 9, 2023
1bb33dd
Merge branch 'main' into dev
jdangerx Nov 9, 2023
Files changed
1 change: 1 addition & 0 deletions .github/workflows/build-deploy-pudl.yml
@@ -117,6 +117,7 @@ jobs:
--container-env DAGSTER_PG_HOST="104.154.182.24" \
--container-env DAGSTER_PG_DB="dagster-storage" \
--container-env PUDL_SETTINGS_YML="/home/catalyst/src/pudl/package_data/settings/etl_full.yml" \
--container-env FLY_ACCESS_TOKEN=${{ secrets.FLY_ACCESS_TOKEN }} \

# Start the VM
- name: Start the deploy-pudl-vm
6 changes: 6 additions & 0 deletions .gitignore
@@ -38,3 +38,9 @@ notebooks/*.tgz
terraform/.terraform/*
.env
.hypothesis/

# generated by datasette/publish.py fresh for every deploy - we shouldn't track changes.
devtools/datasette/fly/Dockerfile
devtools/datasette/fly/inspect-data.json
devtools/datasette/fly/metadata.yml
devtools/datasette/fly/all_dbs.tar.zst
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -26,7 +26,7 @@ repos:
# Formatters: hooks that re-write Python & documentation files
####################################################################################
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.3
rev: v0.1.4
hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
34 changes: 34 additions & 0 deletions devtools/datasette/fly/fly.toml
@@ -0,0 +1,34 @@
# fly.toml app configuration file generated for catalyst-coop-pudl on 2023-11-03T15:31:15-04:00
#
# See https://fly.io/docs/reference/configuration/ for information about how to use this file.
#
app = "catalyst-coop-pudl"
primary_region = "bos"

[[mounts]]
destination = "/data"
source = "datasette"

[[services]]
internal_port = 8080
protocol = "tcp"

[services.concurrency]
hard_limit = 25
soft_limit = 20

[[services.ports]]
handlers = ["http"]
port = 80

[[services.ports]]
handlers = ["tls", "http"]
port = 443

[[services.tcp_checks]]
grace_period = "1m"
interval = 10000
timeout = 2000

[deploy]
wait_timeout = "15m"
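
As a hedged aside (editorial, not part of this diff): once `flyctl deploy` has run, the service configured above can be sanity-checked with standard flyctl commands, using the app name from this fly.toml:

# Hypothetical post-deploy checks; the app name comes from fly.toml above
flyctl status --app catalyst-coop-pudl
flyctl logs --app catalyst-coop-pudl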
10 changes: 10 additions & 0 deletions devtools/datasette/fly/run.sh
@@ -0,0 +1,10 @@
#! /usr/bin/env bash
set -eux

shopt -s nullglob

# Remove databases left over from any previous deploy on the mounted volume
find /data/ -name '*.sqlite' -delete
# Unpack the compressed databases baked into the image onto the /data volume
mv all_dbs.tar.zst /data
zstd -f -d /data/all_dbs.tar.zst -o /data/all_dbs.tar
tar -xf /data/all_dbs.tar --directory /data
# Serve every extracted database, using the pre-generated inspect/metadata files
datasette serve --host 0.0.0.0 /data/*.sqlite --cors --inspect-file inspect-data.json --metadata metadata.yml --setting sql_time_limit_ms 5000 --port $PORT
122 changes: 122 additions & 0 deletions devtools/datasette/publish.py
@@ -0,0 +1,122 @@
"""Publish the datasette to fly.io.

We use custom logic here because the datasette-publish-fly plugin bakes the
uncompressed databases into the image, which makes the image too large.

We compress the databases before baking them into the image. Then we decompress
them at runtime to a Fly volume mounted at /data. This avoids a long download
at startup, and allows us to stay within the Fly.io 8GB image size limit.

The volume handling is done manually outside of this publish.py script - it
should be terraformed at some point.

Some static fly.io deployment-related files live in ./fly:
* fly.toml - service configuration
* run.sh - service entrypoint

Apart from that: the Dockerfile and dataset-specific
metadata.yml/inspect-data.json are generated by this script.
"""

import json
import logging
import secrets
from pathlib import Path
from subprocess import check_call, check_output

from pudl.metadata.classes import DatasetteMetadata
from pudl.workspace.setup import PudlPaths

logging.basicConfig(format="%(asctime)s %(message)s", level=logging.INFO)

DOCKERFILE_TEMPLATE = """
FROM python:3.11.0-slim-bullseye
COPY . /app
WORKDIR /app

RUN apt-get update
RUN apt-get install -y zstd

ENV DATASETTE_SECRET '{datasette_secret}'
RUN pip install -U datasette datasette-cluster-map datasette-vega datasette-block-robots
ENV PORT 8080
EXPOSE 8080

CMD ["./run.sh"]
"""


def make_dockerfile():
"""Write a dockerfile from template, to use in fly deploy.

We write this from template so we can generate a datasette secret. This way
we don't have to manage secrets at all.
"""
datasette_secret = secrets.token_hex(16)
return DOCKERFILE_TEMPLATE.format(datasette_secret=datasette_secret)


def inspect_data(datasets, pudl_out):
"""Pre-inspect databases to generate some metadata for Datasette.

This is done in the image build process in datasette-publish-fly, but since
we don't have access to the databases in the build process we have to
inspect before building the Docker image.
"""
inspect_output = json.loads(
check_output(
[ # noqa: S603
"datasette",
"inspect",
]
+ [str(pudl_out / ds) for ds in datasets]
)
)

for dataset in inspect_output:
name = Path(inspect_output[dataset]["file"]).name
new_filepath = Path("/data") / name
inspect_output[dataset]["file"] = str(new_filepath)
return inspect_output


def metadata(pudl_out) -> str:
"""Return human-readable metadata for Datasette."""
return DatasetteMetadata.from_data_source_ids(pudl_out).to_yaml()


def main():
"""Generate deployment files and run the deploy."""
fly_dir = Path(__file__).parent.absolute() / "fly"
docker_path = fly_dir / "Dockerfile"
inspect_path = fly_dir / "inspect-data.json"
metadata_path = fly_dir / "metadata.yml"

pudl_out = PudlPaths().pudl_output
datasets = [str(p.name) for p in pudl_out.glob("*.sqlite")]
logging.info(f"Inspecting DBs for datasette: {datasets}...")
inspect_output = inspect_data(datasets, pudl_out)
with inspect_path.open("w") as f:
f.write(json.dumps(inspect_output))

logging.info("Writing metadata...")
with metadata_path.open("w") as f:
f.write(metadata(pudl_out))

logging.info("Writing Dockerfile...")
with docker_path.open("w") as f:
f.write(make_dockerfile())

logging.info(f"Compressing {datasets} and putting into docker context...")
check_call(
["tar", "-a", "-czvf", fly_dir / "all_dbs.tar.zst"] + datasets, # noqa: S603
cwd=pudl_out,
)

logging.info("Running fly deploy...")
check_call(["/usr/bin/env", "flyctl", "deploy"], cwd=fly_dir) # noqa: S603
logging.info("Deploy finished!")


if __name__ == "__main__":
main()
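
As a usage sketch (editorial, not part of this diff): the script reads its inputs via PudlPaths, so a local run might look like the following, assuming a directory of built *.sqlite databases and an authenticated flyctl:

# Hypothetical local invocation of the publish script
export PUDL_OUTPUT=/path/to/pudl_output   # assumed to contain the *.sqlite databases
flyctl auth login                         # or provide FLY_ACCESS_TOKEN, as in CI
python devtools/datasette/publish.py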
26 changes: 0 additions & 26 deletions devtools/datasette/publish.sh

This file was deleted.

9 changes: 9 additions & 0 deletions docker/Dockerfile
@@ -1,5 +1,7 @@
FROM condaforge/mambaforge:23.3.1-1

SHELL [ "/bin/bash", "-exo", "pipefail", "-c" ]

# Install curl and js
# awscli requires unzip, less, groff and mandoc
# hadolint ignore=DL3008
@@ -24,6 +26,10 @@ ENV CONTAINER_HOME=/home/catalyst
USER catalyst
WORKDIR ${CONTAINER_HOME}

# Install flyctl
RUN curl -L https://fly.io/install.sh | sh
ENV PATH="${CONTAINER_HOME}/.fly/bin:$PATH"

ENV CONDA_PREFIX=${CONTAINER_HOME}/env
ENV PUDL_REPO=${CONTAINER_HOME}/pudl
ENV CONDA_RUN="conda run --no-capture-output --prefix ${CONDA_PREFIX}"
@@ -37,6 +43,9 @@ ENV DAGSTER_HOME=${CONTAINER_PUDL_WORKSPACE}/dagster_home
# Create data input/output directories
RUN mkdir -p ${PUDL_INPUT} ${PUDL_OUTPUT} ${DAGSTER_HOME}

# Copy dagster configuration file
COPY docker/dagster.yaml ${DAGSTER_HOME}/dagster.yaml

# Create a conda environment based on the specification in the repo
COPY test/test-environment.yml test/test-environment.yml
RUN mamba create --copy --prefix ${CONDA_PREFIX} --yes python=${PYTHON_VERSION} && \
12 changes: 12 additions & 0 deletions docker/dagster.yaml
@@ -0,0 +1,12 @@
storage:
postgres:
postgres_db:
username:
env: DAGSTER_PG_USERNAME
password:
env: DAGSTER_PG_PASSWORD
hostname:
env: DAGSTER_PG_HOST
db_name:
env: DAGSTER_PG_DB
port: 5432
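
Each `env:` entry above makes Dagster resolve the value from an environment variable at runtime rather than from the YAML itself. A sketch of the corresponding environment (placeholder username and password; host and db name match the values in build-deploy-pudl.yml above):

# Illustrative environment for dagster.yaml; real values are injected as CI secrets
export DAGSTER_PG_USERNAME="dagster"              # placeholder
export DAGSTER_PG_PASSWORD="not-a-real-password"  # stored as a secret, never committed
export DAGSTER_PG_HOST="104.154.182.24"
export DAGSTER_PG_DB="dagster-storage"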
1 change: 1 addition & 0 deletions docker/docker-compose.yml
@@ -12,6 +12,7 @@ services:
environment:
- API_KEY_EIA
- GCP_BILLING_PROJECT
- FLY_ACCESS_TOKEN
env_file:
- .env
build:
15 changes: 9 additions & 6 deletions docker/gcp_pudl_etl.sh
@@ -30,7 +30,6 @@ function run_pudl_etl() {
$PUDL_SETTINGS_YML \
&& pudl_etl \
--loglevel DEBUG \
--max-concurrent 6 \
--gcs-cache-path gs://internal-zenodo-cache.catalyst.coop \
$PUDL_SETTINGS_YML \
&& pytest \
@@ -86,20 +85,24 @@ function notify_slack() {
# 2>&1 redirects stderr to stdout.
run_pudl_etl 2>&1 | tee $LOGFILE

# Notify slack if the etl succeeded.
# if pipeline is successful, distribute + publish datasette
if [[ ${PIPESTATUS[0]} == 0 ]]; then
notify_slack "success"

# Dump outputs to s3 bucket if branch is dev or build was triggered by a tag
if [ $GITHUB_ACTION_TRIGGER = "push" ] || [ $GITHUB_REF = "dev" ]; then
copy_outputs_to_distribution_bucket
fi

# Deploy the updated data to datasette
if [ $GITHUB_REF = "dev" ]; then
gcloud config set run/region us-central1
source ~/devtools/datasette/publish.sh
python ~/devtools/datasette/publish.py 2>&1 | tee -a $LOGFILE
fi
fi

# Notify slack about entire pipeline's success or failure;
# PIPESTATUS[0] either refers to the failed ETL run or the last distribution
# task that was run above
if [[ ${PIPESTATUS[0]} == 0 ]]; then
notify_slack "success"
else
notify_slack "failure"
fi
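
The PIPESTATUS comments above rely on a bash subtlety worth a standalone illustration: after a pipeline, `$?` reflects only the last command (here, `tee`), while `${PIPESTATUS[0]}` preserves the exit status of the first command:

# Why the script checks PIPESTATUS[0] rather than $?
false | tee /dev/null
echo "$?"                # prints 0 -- tee itself succeeded
false | tee /dev/null
echo "${PIPESTATUS[0]}"  # prints 1 -- the failure of `false` is preserved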
20 changes: 10 additions & 10 deletions docs/data_access.rst
@@ -83,42 +83,42 @@
AWS CLI, or programmatically via the S3 API. They can also be downloaded directly over
HTTPS using the following links:

* `PUDL SQLite DB <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/pudl.sqlite>`__
* `EPA CEMS Hourly Emissions Parquet (1995-2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/hourly_emissions_epacems.parquet>`__
* `EPA CEMS Hourly Emissions Parquet (1995-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/hourly_emissions_epacems.parquet>`__
* `Census DP1 SQLite DB (2010) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/censusdp1tract.sqlite>`__

* Raw FERC Form 1:

* `FERC-1 SQLite derived from DBF (1994-2020) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1.sqlite>`__
* `FERC-1 SQLite derived from XBRL (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1_xbrl.sqlite>`__
* `FERC-1 SQLite derived from XBRL (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1_xbrl.sqlite>`__
* `FERC-1 Datapackage (JSON) describing SQLite derived from XBRL <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1_xbrl_datapackage.json>`__
* `FERC-1 XBRL Taxonomy Metadata as JSON (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1_xbrl_taxonomy_metadata.json>`__
* `FERC-1 XBRL Taxonomy Metadata as JSON (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc1_xbrl_taxonomy_metadata.json>`__

* Raw FERC Form 2:

* `FERC-2 SQLite derived from DBF (1996-2020) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2.sqlite>`__
* `FERC-2 SQLite derived from XBRL (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2_xbrl.sqlite>`__
* `FERC-2 SQLite derived from XBRL (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2_xbrl.sqlite>`__
* `FERC-2 Datapackage (JSON) describing SQLite derived from XBRL <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2_xbrl_datapackage.json>`__
* `FERC-2 XBRL Taxonomy Metadata as JSON (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2_xbrl_taxonomy_metadata.json>`__
* `FERC-2 XBRL Taxonomy Metadata as JSON (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc2_xbrl_taxonomy_metadata.json>`__

* Raw FERC Form 6:

* `FERC-6 SQLite derived from DBF (2000-2020) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6.sqlite>`__
* `FERC-6 SQLite derived from XBRL (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6_xbrl.sqlite>`__
* `FERC-6 SQLite derived from XBRL (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6_xbrl.sqlite>`__
* `FERC-6 Datapackage (JSON) describing SQLite derived from XBRL <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6_xbrl_datapackage.json>`__
* `FERC-6 XBRL Taxonomy Metadata as JSON (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6_xbrl_taxonomy_metadata.json>`__
* `FERC-6 XBRL Taxonomy Metadata as JSON (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc6_xbrl_taxonomy_metadata.json>`__

* Raw FERC Form 60:

* `FERC-60 SQLite derived from DBF (2006-2020) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc60.sqlite>`__
* `FERC-60 SQLite derived from XBRL (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc60_xbrl.sqlite>`__
* `FERC-60 SQLite derived from XBRL (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc60_xbrl.sqlite>`__
* `FERC-60 Datapackage (JSON) describing SQLite derived from XBRL <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc60_xbrl_datapackage.json>`__
* `FERC-60 XBRL Taxonomy Metadata as JSON (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc60_xbrl_taxonomy_metadata.json>`__

* Raw FERC Form 714:

* `FERC-714 SQLite derived from XBRL (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc714_xbrl.sqlite>`__
* `FERC-714 SQLite derived from XBRL (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc714_xbrl.sqlite>`__
* `FERC-714 Datapackage (JSON) describing SQLite derived from XBRL <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc714_xbrl_datapackage.json>`__
* `FERC-714 XBRL Taxonomy Metadata as JSON (2021) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc714_xbrl_taxonomy_metadata.json>`__
* `FERC-714 XBRL Taxonomy Metadata as JSON (2021-2022) <https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/ferc714_xbrl_taxonomy_metadata.json>`__
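
For example (editorial sketch, not part of the docs diff), any of the links above can be fetched non-interactively; the AWS CLI variant assumes the bucket allows anonymous access, as the surrounding docs imply:

# Download the main PUDL database over HTTPS
curl -O https://s3.us-west-2.amazonaws.com/pudl.catalyst.coop/dev/pudl.sqlite
# Or via the AWS CLI
aws s3 cp s3://pudl.catalyst.coop/dev/pudl.sqlite . --no-sign-request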


.. _access-zenodo:
3 changes: 2 additions & 1 deletion docs/release_notes.rst
@@ -71,7 +71,8 @@ Data Coverage
^^^^^^^^^^^^^

* Updated :doc:`data_sources/eia860` to include early release data from 2022.
* Updated :doc:`data_sources/eia923` to include early release data from 2022.
* Updated :doc:`data_sources/eia923` to include early release data from 2022 and
monthly YTD data as of April 2023.
* Updated :doc:`data_sources/epacems` to switch from the old FTP server to the new
CAMPD API, and to include 2022 data. Due to changes in the ETL, Alaska, Puerto Rico
and Hawaii are now included in CEMS processing. See issue :issue:`1264` & PRs