Simplified zone geometry (bcgov#2252)
- Add generalised "shape" object (to stick zones and things into)
- Importing simplified zones
- Added files/config for developing api inside dev container
- Updated some READMEs
- Ran `poetry update` and added some timeouts where lint was complaining.
Sybrand authored Sep 1, 2022
1 parent e6a30ef commit e5b815b
Showing 17 changed files with 455 additions and 101 deletions.
19 changes: 19 additions & 0 deletions .devcontainer/devcontainer.json
@@ -0,0 +1,19 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.245.0/containers/docker-existing-dockerfile
{
    "name": "Existing Dockerfile",
    // Sets the run context to one level up instead of the .devcontainer folder.
    "context": "..",
    // Update the 'dockerFile' property if you aren't using the standard 'Dockerfile' filename.
    "dockerFile": "../Dockerfile.vscode",
    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    // "forwardPorts": [],
    // Uncomment the next line to run commands after the container is created - for example installing curl.
    // "postCreateCommand": "apt-get update && apt-get install -y curl",
    // Uncomment when using a ptrace-based debugger like C++, Go, and Rust
    // "runArgs": [ "--cap-add=SYS_PTRACE", "--security-opt", "seccomp=unconfined" ],
    // Uncomment to use the Docker CLI from inside the container. See https://aka.ms/vscode-remote/samples/docker-from-docker.
    // "mounts": [ "source=/var/run/docker.sock,target=/var/run/docker.sock,type=bind" ],
    // Uncomment to connect as a non-root user if you've added one. See https://aka.ms/vscode-remote/containers/non-root.
    "remoteUser": "vscode"
}
5 changes: 4 additions & 1 deletion .vscode/settings.json
@@ -55,5 +55,8 @@
"python.testing.pytestArgs": [
"api"
],
"typescript.preferences.importModuleSpecifier": "non-relative"
"typescript.preferences.importModuleSpecifier": "non-relative",
"cSpell.words": [
"Albers"
]
}
86 changes: 86 additions & 0 deletions Dockerfile.vscode
@@ -0,0 +1,86 @@
# NOTE:
# This Dockerfile is for local development only!

# debian would match more closely what we have in production, and would probably be ideal,
# but it's also a pain to work with because debian is so old.
FROM ubuntu:22.04

ARG USERNAME=vscode
ARG USER_UID=1000
ARG USER_GID=$USER_UID

# Tell r-base not to wait for interactive input.
ENV DEBIAN_FRONTEND=noninteractive

# Install dependencies needed by python developer packages.
# Ideally all these installs and the update would run in one go, for a consistent
# install, but ease of development trumps consistency in this instance: it's easier
# to have more, faster-running steps that can fail than one big monster install
# that takes forever and fails.
# NOTE: Once we no longer need pyodbc, please remove the apt-get update and install commands below.
RUN apt-get -y update
RUN apt-get -y install unixodbc-dev
# Install old (2.4.*; current debian) version of gdal
RUN apt-get -y install libgdal-dev

# Install R
RUN apt-get update --fix-missing && apt-get -y install r-base

# Install cffdrs
RUN R -e "install.packages('cffdrs')"

# Install some other dependencies
RUN apt-get -y install git build-essential python3 python3-dev python3-pip curl vim

# Install JDK
RUN apt-get -y install openjdk-11-jdk

# We could install poetry manually, but it's easier to use apt.
RUN apt-get -y install python3-poetry
# Poetry expects "python", but by default on ubuntu you need to specify "python3",
# so we work around that by installing the python-is-python3 package.
RUN apt-get -y install python-is-python3

# I prefer zsh to bash
RUN apt-get -y install zsh

# from: https://code.visualstudio.com/remote/advancedcontainers/add-nonroot-user
RUN groupadd --gid $USER_GID $USERNAME \
&& useradd --uid $USER_UID --gid $USER_GID -m $USERNAME

# RUN mkdir /vscode
# RUN chown vscode /vscode
USER $USERNAME
ENV PATH="/home/${USERNAME}/.local/bin:${PATH}"

WORKDIR /home/$USERNAME

# Update pip
RUN python3 -m pip install --upgrade pip
RUN python3 -m pip install cachecontrol

# I like oh-my-zsh:
RUN sh -c "$(curl -fsSL https://raw.github.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"
# BUT - for some reason git+zsh == slowness, so tell git not to slow down zsh:
# git config --add oh-my-zsh.hide-dirty 1

# Copy poetry files.
# COPY pyproject.toml poetry.lock ./

# COPY --chown=worker:worker poetry.lock pyproject.toml ./

# RUN poetry install

# # We can't have this inside pyproject.toml because the gdal version differs from platform to platform.
# # To figure out what version of pygdal you need, run gdal-config
# RUN poetry run python -m pip install pygdal==3.4.1.10

# COPY ./app /app/app
# RUN mkdir /app/libs
# COPY ./libs /app/libs

EXPOSE 8080 3000

# ENV CLASSPATH=/app/libs/REDapp_Lib.jar:/app/libs/WTime.jar:/app/libs/hss-java.jar:${CLASSPATH}
# CMD PYTHONPATH=. poetry run alembic upgrade head && poetry run uvicorn app.main:app --host 0.0.0.0 --reload --port 8080

10 changes: 10 additions & 0 deletions README.md
@@ -23,6 +23,14 @@ Wildfire Predictive Services to support decision making in prevention, preparedn
4. Open [http://localhost:8080](http://localhost:8080) to view the front end served up from a static folder by the python api.
5. Open [http://localhost:3000](http://localhost:3000) to view the front end served up in developer mode by node.

#### Developing the application in a dev container, using vscode

- Open up the project: `Remote-Containers: Open Folder in Container`, and select docker-compose.vscode.yml
- Sometimes VSCode doesn't pick up that you've changed the docker container: `Remote-Containers: Rebuild Container`
- Install extensions into the container, as needed.
- You can point the API database to: `host.docker.internal`
- You can start up other services outside of vscode, e.g.: `docker compose up db` and `docker compose up redis`
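The compose file referenced above is not shown in this diff; purely as an illustration, a minimal docker-compose.vscode.yml could look something like this (the service name, mount, and command are assumptions, not the repository's actual file):

```yaml
# Hypothetical sketch only - not the actual docker-compose.vscode.yml.
version: "3.8"
services:
  dev:
    build:
      context: .
      dockerfile: Dockerfile.vscode
    volumes:
      - .:/workspace:cached
    # Keep the container running so vscode can attach to it.
    command: sleep infinity
```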

#### Running the api alone

Refer to [api/README.md](api/README.md).
@@ -41,6 +49,8 @@ A glossary of terms relating to Wildfire that are relevant to Predictive Service

## Architecture

*If you're not seeing an architecture diagram below, you need the mermaid plugin.*

```mermaid
graph LR
3 changes: 3 additions & 0 deletions api/Makefile
@@ -203,5 +203,8 @@ docker-run-hourly-actuals:
database-upgrade:
	PYTHONPATH=. $(POETRY_RUN) alembic upgrade head

database-downgrade:
	PYTHONPATH=. $(POETRY_RUN) alembic downgrade -1

docker-database-upgrade:
	docker compose exec -e PYTHONPATH=. api alembic upgrade head
6 changes: 6 additions & 0 deletions api/README.md
@@ -328,6 +328,12 @@ PYTHONPATH=. alembic revision --autogenerate -m "Comment relevant to change"

You may have to modify the generated code to import geoalchemy2.

You may want a data import or modification step where you aren't changing the database schema, but need to manage data. You can create an "empty" migration and insert data as needed:

```bash
PYTHONPATH=. alembic revision -m "Comment relevant to change"
```
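The command above generates a skeleton like the following (a hypothetical sketch: the revision ids and the seeded table and columns are made up, and alembic invokes the file, so it is not standalone-runnable):

```python
"""Comment relevant to change

Revision ID: abc123def456
Revises: 17b1c787f420
"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = 'abc123def456'
down_revision = '17b1c787f420'
branch_labels = None
depends_on = None


def upgrade():
    # No schema changes - just manage data. A lightweight table handle is enough
    # here; the real table is defined elsewhere.
    fuel_types = sa.table('fuel_types',
                          sa.column('abbrev', sa.String),
                          sa.column('description', sa.String))
    op.bulk_insert(fuel_types, [
        {'abbrev': 'C1', 'description': 'Spruce-Lichen Woodland'},
    ])


def downgrade():
    # Undo the insert so the migration remains reversible.
    op.execute("DELETE FROM fuel_types WHERE abbrev = 'C1'")
```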

Then apply:

```bash
7 changes: 6 additions & 1 deletion api/alembic.ini
@@ -48,7 +48,7 @@ script_location = alembic

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
keys = root,sqlalchemy,alembic,app

[handlers]
keys = console
@@ -71,6 +71,11 @@ level = INFO
handlers =
qualname = alembic

[logger_app]
level = INFO
handlers = console
qualname = app

[handler_console]
class = StreamHandler
args = (sys.stderr,)
56 changes: 56 additions & 0 deletions api/alembic/versions/17b1c787f420_advisory_areas.py
@@ -0,0 +1,56 @@
"""advisory areas
Revision ID: 17b1c787f420
Revises: 62d35d76e1bf
Create Date: 2022-08-31 22:46:45.138215
"""
from alembic import op
import sqlalchemy as sa
import geoalchemy2


# revision identifiers, used by Alembic.
revision = '17b1c787f420'
down_revision = '62d35d76e1bf'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic! ###
    op.create_table('advisory_shape_types',
                    sa.Column('id', sa.Integer(), nullable=False),
                    sa.Column('name', sa.Enum('fire_centre', 'fire_zone', name='shapetypeenum'), nullable=False),
                    sa.PrimaryKeyConstraint('id'),
                    comment='Identify kind of advisory area (e.g. Zone, Fire etc.)')
    op.create_index(op.f('ix_advisory_shape_types_name'), 'advisory_shape_types', ['name'], unique=True)
    op.create_table('advisory_shapes',
                    sa.Column('id', sa.Integer(), nullable=False),
                    sa.Column('source_identifier', sa.String(), nullable=False),
                    sa.Column('shape_type', sa.Integer(), nullable=False),
                    sa.Column('geom', geoalchemy2.types.Geometry(geometry_type='MULTIPOLYGON',
                                                                 spatial_index=False, from_text='ST_GeomFromEWKT', name='geometry'), nullable=False),
                    sa.ForeignKeyConstraint(['shape_type'], ['advisory_shape_types.id'], ),
                    sa.PrimaryKeyConstraint('id'),
                    sa.UniqueConstraint('source_identifier', 'shape_type'),
                    comment='Record identifying some area of interest with respect to advisories')
    op.create_index('idx_advisory_shapes_geom', 'advisory_shapes', ['geom'], unique=False, postgresql_using='gist')
    op.create_index(op.f('ix_advisory_shapes_shape_type'), 'advisory_shapes', ['shape_type'], unique=False)
    op.create_index(op.f('ix_advisory_shapes_source_identifier'),
                    'advisory_shapes', ['source_identifier'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic! ###
    op.drop_index(op.f('ix_advisory_shapes_source_identifier'), table_name='advisory_shapes')
    op.drop_index(op.f('ix_advisory_shapes_shape_type'), table_name='advisory_shapes')
    op.drop_index('idx_advisory_shapes_geom', table_name='advisory_shapes', postgresql_using='gist')
    op.drop_table('advisory_shapes')
    op.drop_index(op.f('ix_advisory_shape_types_name'), table_name='advisory_shape_types')
    op.drop_table('advisory_shape_types')
    sa.Enum(name='shapetypeenum').drop(op.get_bind())
    # ### end Alembic commands ###
88 changes: 88 additions & 0 deletions api/alembic/versions/c04f22e31997_import_zones.py
@@ -0,0 +1,88 @@
"""Import zones
Revision ID: c04f22e31997
Revises: 17b1c787f420
Create Date: 2022-08-31 22:56:52.264112
"""
from typing import Final
import tempfile
from alembic import op
import sqlalchemy as sa
from sqlalchemy.orm.session import Session
import geoalchemy2
from shapely.geometry import MultiPolygon, Polygon
from shapely import wkb
from app.utils import esri


# revision identifiers, used by Alembic.
revision = 'c04f22e31997'
down_revision = '17b1c787f420'
branch_labels = None
depends_on = None


shape_type_table = sa.Table('advisory_shape_types', sa.MetaData(),
                            sa.Column('id', sa.Integer),
                            sa.Column('name', sa.String))

shape_table = sa.Table('advisory_shapes', sa.MetaData(),
                       sa.Column('id', sa.Integer),
                       sa.Column('source_identifier', sa.String),
                       sa.Column('shape_type', sa.Integer),
                       sa.Column('geom', geoalchemy2.Geometry))


def upgrade():
    session = Session(bind=op.get_bind())
    statement = shape_type_table.insert().values(name='fire_zone').returning(shape_type_table.c.id)
    result = session.execute(statement).fetchone()
    shape_type_id = result.id

    # We fetch a list of object ids; fetching the entire layer in one go would most
    # likely crash the server we're talking to.
    zone_url: Final = "https://maps.gov.bc.ca/arcserver/rest/services/whse/bcgw_pub_whse_legal_admin_boundaries/MapServer/8"
    zone_ids = esri.fetch_object_list(zone_url)
    for object_id in zone_ids:
        # Fetch each object in turn.
        obj = esri.fetch_object(object_id, zone_url)
        for feature in obj.get('features', []):
            attributes = feature.get('attributes', {})
            # Each zone is uniquely identified by a fire zone id.
            mof_fire_zone_id = attributes.get('MOF_FIRE_ZONE_ID')
            fire_zone_id = str(int(mof_fire_zone_id))
            geometry = feature.get('geometry', {})
            # Rings???
            # That's right:
            # https://developers.arcgis.com/documentation/common-data-types/geometry-objects.htm
            # "A polygon (specified as esriGeometryPolygon) contains an array of rings or curveRings
            # and a spatialReference."
            rings = geometry.get('rings', [[]])
            polygons = []
            for ring in rings:
                # Simplify each ring to a tolerance of 1000 meters, preserving topology.
                polygons.append(Polygon(ring).simplify(1000, preserve_topology=True))
            geom = MultiPolygon(polygons)
            # Insert.
            statement = shape_table.insert().values(
                source_identifier=fire_zone_id,
                shape_type=shape_type_id,
                geom=wkb.dumps(geom, hex=True, srid=3005))
            session.execute(statement)


def downgrade():
    session = Session(bind=op.get_bind())
    # Find the 'fire_zone' shape type.
    statement = shape_type_table.select().where(shape_type_table.c.name == 'fire_zone')
    result = session.execute(statement).fetchone()
    shape_type_id = result.id

    # Delete areas of that type.
    statement = shape_table.delete().where(shape_table.c.shape_type == shape_type_id)
    session.execute(statement)

    # Delete the 'fire_zone' type itself.
    statement = shape_type_table.delete().where(shape_type_table.c.name == 'fire_zone')
    session.execute(statement)
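The upgrade step above depends on the esri JSON feature layout: an `attributes` dict plus a `geometry` carrying `rings`. Here is a minimal stdlib-only illustration of that parsing, using a made-up feature payload (real payloads come from the map server and carry many more attributes):

```python
# Mirror of the migration's feature parsing, with a hypothetical payload.
def parse_zone_feature(feature):
    """Extract the fire zone id and polygon rings from an esri JSON feature."""
    attributes = feature.get('attributes', {})
    # Zone ids arrive as floats from the server; normalise to a string.
    fire_zone_id = str(int(attributes.get('MOF_FIRE_ZONE_ID')))
    rings = feature.get('geometry', {}).get('rings', [[]])
    return fire_zone_id, rings


sample = {
    'attributes': {'MOF_FIRE_ZONE_ID': 25.0},
    'geometry': {'rings': [[[0, 0], [0, 10], [10, 10], [10, 0], [0, 0]]]},
}
zone_id, rings = parse_zone_feature(sample)
print(zone_id, len(rings))  # → 25 1
```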
1 change: 1 addition & 0 deletions api/app/.env.example
@@ -11,6 +11,7 @@ BC_FIRE_WEATHER_USER=user
BC_FIRE_WEATHER_SECRET=password
BC_FIRE_WEATHER_FILTER_ID=0
KEYCLOAK_PUBLIC_KEY=thisispublickey
# POSTGRES_WRITE_HOST=host.docker.internal
POSTGRES_WRITE_HOST=db
POSTGRES_READ_HOST=db
POSTGRES_READ_USER=wpsread
1 change: 1 addition & 0 deletions api/app/db/models/__init__.py
@@ -9,3 +9,4 @@
PredictionModelGridSubset, ModelRunGridSubsetPrediction,
WeatherStationModelPrediction)
from app.db.models.hfi_calc import (FireCentre, FuelType, PlanningArea, PlanningWeatherStation)
from app.db.models.advisory import (Shape, ShapeType)
44 changes: 44 additions & 0 deletions api/app/db/models/advisory.py
@@ -0,0 +1,44 @@
import enum
from sqlalchemy import (Integer, String, Column, Index, ForeignKey, Enum, UniqueConstraint)
from geoalchemy2 import Geometry
from app.db.database import Base


class ShapeTypeEnum(enum.Enum):
    """ Define different shape types. e.g. "Zone", "Fire Centre" - later we may add
    "Incident"/"Fire", "Custom" etc. etc. """
    fire_centre = 1
    fire_zone = 2


class ShapeType(Base):
    """ Identify some kind of area type, e.g. "Zone", or "Fire" """
    __tablename__ = 'advisory_shape_types'
    __table_args__ = (
        {'comment': 'Identify kind of advisory area (e.g. Zone, Fire etc.)'}
    )

    id = Column(Integer, primary_key=True)
    name = Column(Enum(ShapeTypeEnum), nullable=False, unique=True, index=True)


class Shape(Base):
    """ Identify some area of interest with respect to advisories. """
    __tablename__ = 'advisory_shapes'
    __table_args__ = (
        # We may have to revisit this constraint, but for the time being the idea is
        # that any given area must be unique within its shape type, e.g. a zone is
        # identified by its zone id.
        UniqueConstraint('source_identifier', 'shape_type'),
        {'comment': 'Record identifying some area of interest with respect to advisories'}
    )

    id = Column(Integer, primary_key=True)
    # An area is uniquely identified, e.g. a zone has a number, so does a fire.
    source_identifier = Column(String, nullable=False, index=True)
    shape_type = Column(Integer, ForeignKey('advisory_shape_types.id'), nullable=False, index=True)
    geom = Column(Geometry('MULTIPOLYGON', spatial_index=False), nullable=False)


# Explicit creation of index due to issue with alembic + geoalchemy.
Index('idx_advisory_areas_geom', Shape.geom, postgresql_using='gist')
2 changes: 1 addition & 1 deletion api/app/health.py
@@ -22,7 +22,7 @@ def patroni_cluster_health_check():
    header = {
        'Authorization': 'Bearer ' + config.get('STATUS_CHECKER_SECRET')
    }
    resp = requests.get(url, headers=header, timeout=10)
    resp_json = resp.json()
    # NOTE: In Openshift parlance "replica" refers to how many of one pod we have; in Patroni, a "Replica"
    # refers to a read only copy of the Leader.