feat: Add Script to Migrate Compliance Report Summary
* Update readme
dhaselhan committed Jan 16, 2025
1 parent 8aacc2f commit 4b58ba4
Showing 7 changed files with 561 additions and 10 deletions.
@@ -0,0 +1,41 @@
"""Add TFRS Summary Columns
Revision ID: 98d79870df6b
Revises: 5163af6ba4a4
Create Date: 2025-01-16 20:01:34.038941
"""

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision = "98d79870df6b"
down_revision = "5163af6ba4a4"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column(
        "compliance_report_summary",
        sa.Column("credits_offset_a", sa.Integer(), nullable=True),
    )
    op.add_column(
        "compliance_report_summary",
        sa.Column("credits_offset_b", sa.Integer(), nullable=True),
    )
    op.add_column(
        "compliance_report_summary",
        sa.Column("credits_offset_c", sa.Integer(), nullable=True),
    )
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column("compliance_report_summary", "credits_offset_c")
    op.drop_column("compliance_report_summary", "credits_offset_b")
    op.drop_column("compliance_report_summary", "credits_offset_a")
    # ### end Alembic commands ###
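All three columns are added with `nullable=True` and no server default, so the upgrade is metadata-only and safe to run against a table that already holds rows. A minimal sketch of that behaviour, using the stdlib `sqlite3` as a stand-in for the project's PostgreSQL database (only the table and column names mirror the migration; everything else here is illustrative):

```python
import sqlite3

# Illustration only: an in-memory SQLite database stands in for PostgreSQL,
# and a single summary_id column stands in for the real table's schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compliance_report_summary (summary_id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO compliance_report_summary (summary_id) VALUES (1)")

# Each op.add_column(...) in the migration reduces to a plain ADD COLUMN.
for col in ("credits_offset_a", "credits_offset_b", "credits_offset_c"):
    conn.execute(f"ALTER TABLE compliance_report_summary ADD COLUMN {col} INTEGER")

# Rows that existed before the migration simply report NULL for the new columns.
row = conn.execute("SELECT * FROM compliance_report_summary").fetchone()
print(row)  # (1, None, None, None)
```

The downgrade path drops the columns in reverse order of creation; this sketch is not a substitute for running the real migration against PostgreSQL.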
5 changes: 5 additions & 0 deletions backend/lcfs/db/models/compliance/ComplianceReportSummary.py
@@ -109,6 +109,11 @@ class ComplianceReportSummary(BaseModel, Auditable):
    line_21_non_compliance_penalty_payable = Column(Float, nullable=False, default=0)
    total_non_compliance_penalty_payable = Column(Float, nullable=False, default=0)

    # Legacy TFRS Columns
    credits_offset_a = Column(Integer)
    credits_offset_b = Column(Integer)
    credits_offset_c = Column(Integer)

    compliance_report = relationship("ComplianceReport", back_populates="summary")

    def __repr__(self):
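Because the legacy columns are plain nullable `Integer`s, ORM code can create or load summaries without supplying them. A cut-down, hypothetical `LegacySummary` model (not the real `ComplianceReportSummary`, which inherits `BaseModel` and `Auditable` and carries many more columns) sketches the behaviour against an in-memory SQLite database:

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class LegacySummary(Base):
    """Hypothetical stand-in for ComplianceReportSummary."""

    __tablename__ = "compliance_report_summary"

    summary_id = Column(Integer, primary_key=True)
    # Legacy TFRS columns: nullable Integer, exactly as in the model diff.
    credits_offset_a = Column(Integer)
    credits_offset_b = Column(Integer)
    credits_offset_c = Column(Integer)


engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Only offset A is supplied; B and C default to NULL.
    session.add(LegacySummary(summary_id=1, credits_offset_a=10))
    session.commit()
    loaded = session.get(LegacySummary, 1)
    values = (loaded.credits_offset_a, loaded.credits_offset_b, loaded.credits_offset_c)

print(values)  # (10, None, None)
```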
Binary file modified etl/database/nifi-registry-primary.mv.db
Binary file not shown.
Binary file modified etl/nifi/conf/flow.json.gz
Binary file not shown.
Binary file modified etl/nifi/conf/flow.xml.gz
Binary file not shown.
499 changes: 499 additions & 0 deletions etl/nifi_scripts/compliance_summary.groovy

Large diffs are not rendered by default.

26 changes: 16 additions & 10 deletions etl/readme.md
@@ -1,5 +1,7 @@
# ETL Overview
-This project sets up Apache NiFi along with two PostgreSQL databases, TFRS and LCFS, using Docker. It enables data migration between these databases via NiFi.
+
+This project sets up Apache NiFi along with two PostgreSQL databases, TFRS and LCFS, using Docker. It enables data
+migration between these databases via NiFi.

## How to Use

@@ -9,12 +11,11 @@ This project sets up Apache NiFi along with two PostgreSQL databases, TFRS and L
$ docker-compose up -d
```

-Starts three containers: NiFi, TFRS, and LCFS databases.
+Starts three containers: NiFi, TFRS Database, and Zookeeper.

## 2. Access NiFi:

-Go to http://localhost:8080/nifi/
+Go to http://localhost:8091/nifi/

## 3. Load NiFi Template:

@@ -33,7 +34,7 @@ Drag the template onto the canvas.

## 5. Enable Services:

-Click the lignting bolt next to all services to Enable them
+Click the lightning bolt next to all services to Enable them

## 6. Data transfer between OpenShift and local containers:

@@ -67,19 +68,24 @@ Click the Start icon to begin the data flow.
To monitor your NiFi data flow:

## 1. View Flow Status:
-- Check each processor and connection for real-time data on processed, queued, or penalized FlowFiles.
-- Click on components for detailed stats and performance metrics.
+
+- Check each processor and connection for real-time data on processed, queued, or penalized FlowFiles.
+- Click on components for detailed stats and performance metrics.

## 2. Enable Bulletins:
-- Configure bulletins to receive alerts (INFO, WARN, ERROR) for any issues that require attention.
+
+- Configure bulletins to receive alerts (INFO, WARN, ERROR) for any issues that require attention.

## 3. Use Data Provenance:
-- Track the lineage of each FlowFile to see its origin, processing steps, and final destination.
+
+- Track the lineage of each FlowFile to see its origin, processing steps, and final destination.

## 4. Monitor System Health:
-- Use the ## Summary## tab to check overall system health, including memory usage, thread activity, and performance.
+
+- Use the ## Summary## tab to check overall system health, including memory usage, thread activity, and performance.

## Error Handling

If any records cannot be added to the databases, they will be logged and stored in the `nifi_output` directory.

You can access these failed records for further inspection or troubleshooting.
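The readme does not document how files inside `nifi_output` are named or structured, so the sketch below only demonstrates collecting whatever failed-record files are present for inspection; the flat layout and the `failed_records_001.json` name are assumptions, not part of the ETL project:

```python
import tempfile
from pathlib import Path

# Assumption: nifi_output is a flat directory of failed-record files.
# A temporary directory stands in for the real ETL output location.
nifi_output = Path(tempfile.mkdtemp()) / "nifi_output"
nifi_output.mkdir()
(nifi_output / "failed_records_001.json").write_text('{"record_id": 1}')

# Collect file names for inspection or troubleshooting.
failed = sorted(p.name for p in nifi_output.iterdir() if p.is_file())
print(failed)  # ['failed_records_001.json']
```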
