Commit
Merge pull request #11 from aws-solutions/release-candidate/v1.4.0
Release candidate/v1.4.0
aassadza authored Sep 29, 2021
2 parents 9b84ed8 + 6f1ad09 commit c2315ee
Showing 47 changed files with 3,026 additions and 1,821 deletions.
17 changes: 16 additions & 1 deletion CHANGELOG.md
@@ -5,12 +5,27 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.4.0] - 2021-09-28

### Added

- A new pipeline to deploy [Amazon SageMaker Model Quality Monitor](https://docs.aws.amazon.com/sagemaker/latest/dg/model-monitor-model-quality.html). The new pipeline monitors the performance of a deployed model by comparing its
predictions with the actual ground truth labels that the model attempts to predict.

### Updated

- The Model Monitor pipeline's API call. The Model Monitor pipeline is now split into two pipelines: a Data Quality Monitor pipeline and a Model Quality Monitor pipeline.
- The format of CloudFormation template parameter names from `PARAMETERNAME` to `ParameterName`.
- The APIs of the Realtime Inference pipeline to support passing an optional custom endpoint name.
- The data quality baseline's Lambda function to create the baseline using the Amazon SageMaker SDK instead of Boto3.
- AWS Cloud Development Kit (AWS CDK) and AWS Solutions Constructs to version 1.117.0.
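
The split of the Model Monitor API described above can be illustrated with two request payloads. Note that the `pipeline_type` values and parameter names below are assumptions for illustration, not the solution's documented API.

```python
import json

# Hypothetical /provisionpipeline payloads illustrating the data quality /
# model quality split. All field names and values here are assumed for
# illustration and are not taken from the solution's documentation.
data_quality_request = {
    "pipeline_type": "byom_data_quality_monitor",  # assumed value
    "endpoint_name": "my-endpoint",                # assumed parameter
}
model_quality_request = {
    "pipeline_type": "byom_model_quality_monitor",  # assumed value
    "endpoint_name": "my-endpoint",
    # Model quality monitoring compares predictions against ground truth
    # labels, so such a request would also need to point at them.
    "ground_truth_s3_uri": "s3://my-bucket/ground-truth/",  # assumed
}

print(json.dumps(model_quality_request, indent=2))
```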

## [1.3.0] - 2021-06-24

### Added

- The option to use [Amazon SageMaker Model Registry](https://docs.aws.amazon.com/sagemaker/latest/dg/model-registry.html) to deploy versioned models. The model registry allows you to catalog models for production, manage model versions, associate metadata with models, manage the approval status of a model, deploy models to production, and automate model deployment with CI/CD.
- The option to use an [AWS Organizations delegated administrator account](https://docs.amazonaws.cn/en_us/AWSCloudFormation/latest/UserGuide/stacksets-orgs-delegated-admin.html) to orchestrate the deployment of Machine Learning (ML) workloads across the AWS Organizations accounts using AWS CloudFormation StackSets.
- The option to use an [AWS Organizations delegated administrator account](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/stacksets-orgs-delegated-admin.html) to orchestrate the deployment of Machine Learning (ML) workloads across the AWS Organizations accounts using AWS CloudFormation StackSets.
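
For the delegated administrator option, registration happens once in the AWS Organizations management account. Below is a minimal sketch of the boto3 Organizations call, exercised against a stand-in client so it runs without AWS credentials; the service principal string is the one CloudFormation StackSets uses for delegated administration.

```python
def register_stacksets_delegated_admin(org_client, account_id: str) -> None:
    # Registers `account_id` as a delegated administrator for CloudFormation
    # StackSets. With boto3 this client would be:
    #   org_client = boto3.client("organizations")
    org_client.register_delegated_administrator(
        AccountId=account_id,
        ServicePrincipal="member.org.stacksets.cloudformation.amazonaws.com",
    )

class _FakeOrgClient:
    """Stand-in that records calls instead of hitting AWS."""
    def __init__(self):
        self.calls = []
    def register_delegated_administrator(self, **kwargs):
        self.calls.append(kwargs)

fake = _FakeOrgClient()
register_stacksets_delegated_admin(fake, "111122223333")
print(fake.calls[0]["AccountId"])  # → 111122223333
```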

### Updated

4 changes: 2 additions & 2 deletions README.md
@@ -16,7 +16,7 @@ pipeline for building and registering Docker images for custom algorithms that c
deployment on an [Amazon SageMaker](https://aws.amazon.com/sagemaker/) endpoint.

You can use batch and real-time data inferences to configure the pipeline for your business context.
You can also provision multiple Model Monitor pipelines to periodically monitor the quality of deployed
You can also provision multiple data quality and model quality Monitor pipelines to periodically monitor the quality of deployed
Amazon SageMaker ML models. This solution increases your team’s agility and efficiency by allowing them
to repeat successful processes at scale.

@@ -119,7 +119,7 @@ chmod +x ./build-s3-dist.sh
./build-s3-dist.sh $DIST_OUTPUT_BUCKET $SOLUTION_NAME $VERSION
```

- Upload the distributable assets to your Amazon S3 bucket in your account. Note: Ensure that you own the Amazon S3 bucket before uploading the assets. To upload the assets to the S3 bucket, you can use the AWS Console or the AWS CLI as shown below.
- Upload the distributable assets to your Amazon S3 bucket in your account. Note: ensure that you own the Amazon S3 bucket before uploading the assets. To upload the assets to the S3 bucket, you can use the AWS Console or the AWS CLI as shown below.

```
aws s3 cp ./global-s3-assets/ s3://my-bucket-name-<aws_region>/aws-mlops-framework/<my-version>/ --recursive --acl bucket-owner-full-control --profile aws-cred-profile-name
```
42 changes: 23 additions & 19 deletions deployment/build-s3-dist.sh
@@ -28,7 +28,7 @@
set -e

# Important: CDK global version number
cdk_version=1.96.0
cdk_version=1.117.0

# Check to see if the required parameters have been provided:
if [ -z "$1" ] || [ -z "$2" ] || [ -z "$3" ]; then
@@ -111,23 +111,27 @@ echo "npm install -g aws-cdk@$cdk_version"
npm install -g aws-cdk@$cdk_version

#Run 'cdk synth for BYOM blueprints
echo "cdk synth ModelMonitorStack > lib/blueprints/byom/byom_model_monitor.yaml"
cdk synth ModelMonitorStack > lib/blueprints/byom/byom_model_monitor.yaml
echo "cdk synth SingleAccountCodePipelineStack > lib/blueprints/byom/single_account_codepipeline.yaml"
cdk synth SingleAccountCodePipelineStack > lib/blueprints/byom/single_account_codepipeline.yaml
echo "cdk synth MultiAccountCodePipelineStack > lib/blueprints/byom/multi_account_codepipeline.yaml"
cdk synth MultiAccountCodePipelineStack > lib/blueprints/byom/multi_account_codepipeline.yaml
echo "cdk synth BYOMRealtimePipelineStack > lib/blueprints/byom/byom_realtime_inference_pipeline.yaml"
cdk synth BYOMRealtimePipelineStack > lib/blueprints/byom/byom_realtime_inference_pipeline.yaml
echo "cdk synth BYOMCustomAlgorithmImageBuilderStack > lib/blueprints/byom/byom_custom_algorithm_image_builder.yaml"
cdk synth BYOMCustomAlgorithmImageBuilderStack > lib/blueprints/byom/byom_custom_algorithm_image_builder.yaml
echo "cdk synth BYOMBatchStack > lib/blueprints/byom/byom_batch_pipeline.yaml"
cdk synth BYOMBatchStack > lib/blueprints/byom/byom_batch_pipeline.yaml
echo "cdk synth DataQualityModelMonitorStack > lib/blueprints/byom/byom_data_quality_monitor.yaml --path-metadata false --version-reporting false"
cdk synth DataQualityModelMonitorStack > lib/blueprints/byom/byom_data_quality_monitor.yaml --path-metadata false --version-reporting false
echo "cdk synth ModelQualityModelMonitorStack > lib/blueprints/byom/byom_model_quality_monitor.yaml --path-metadata false --version-reporting false"
cdk synth ModelQualityModelMonitorStack > lib/blueprints/byom/byom_model_quality_monitor.yaml --path-metadata false --version-reporting false
echo "cdk synth SingleAccountCodePipelineStack > lib/blueprints/byom/single_account_codepipeline.yaml --path-metadata false --version-reporting false"
cdk synth SingleAccountCodePipelineStack > lib/blueprints/byom/single_account_codepipeline.yaml --path-metadata false --version-reporting false
echo "cdk synth MultiAccountCodePipelineStack > lib/blueprints/byom/multi_account_codepipeline.yaml --path-metadata false --version-reporting false"
cdk synth MultiAccountCodePipelineStack > lib/blueprints/byom/multi_account_codepipeline.yaml --path-metadata false --version-reporting false
echo "cdk synth BYOMRealtimePipelineStack > lib/blueprints/byom/byom_realtime_inference_pipeline.yaml --path-metadata false --version-reporting false"
cdk synth BYOMRealtimePipelineStack > lib/blueprints/byom/byom_realtime_inference_pipeline.yaml --path-metadata false --version-reporting false
echo "cdk synth BYOMCustomAlgorithmImageBuilderStack > lib/blueprints/byom/byom_custom_algorithm_image_builder.yaml --path-metadata false --version-reporting false"
cdk synth BYOMCustomAlgorithmImageBuilderStack > lib/blueprints/byom/byom_custom_algorithm_image_builder.yaml --path-metadata false --version-reporting false
echo "cdk synth BYOMBatchStack > lib/blueprints/byom/byom_batch_pipeline.yaml --path-metadata false --version-reporting false"
cdk synth BYOMBatchStack > lib/blueprints/byom/byom_batch_pipeline.yaml --path-metadata false --version-reporting false

# Replace %%VERSION%% in other templates
replace="s/%%VERSION%%/$3/g"
echo "sed -i -e $replace lib/blueprints/byom/byom_model_monitor.yaml"
sed -i -e $replace lib/blueprints/byom/byom_model_monitor.yaml
echo "sed -i -e $replace lib/blueprints/byom/byom_data_quality_monitor.yaml"
sed -i -e $replace lib/blueprints/byom/byom_data_quality_monitor.yaml
echo "sed -i -e $replace lib/blueprints/byom/byom_model_quality_monitor.yaml"
sed -i -e $replace lib/blueprints/byom/byom_model_quality_monitor.yaml
echo "sed -i -e $replace lib/blueprints/byom/byom_realtime_inference_pipeline.yaml"
sed -i -e $replace lib/blueprints/byom/byom_realtime_inference_pipeline.yaml
echo "sed -i -e $replace lib/blueprints/byom/single_account_codepipeline.yaml"
@@ -140,10 +144,10 @@ echo "sed -i -e $replace lib/blueprints/byom/byom_batch_pipeline.yaml"
sed -i -e $replace lib/blueprints/byom/byom_batch_pipeline.yaml
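
The `sed` loop above is a plain token substitution of `%%VERSION%%`; the equivalent operation, sketched in Python on an illustrative template line:

```python
import re

# Illustrative template line; the real templates get %%VERSION%% substituted
# the same way via sed in build-s3-dist.sh.
template = "Description: AWS MLOps Framework. Version %%VERSION%%"
stamped = re.sub(r"%%VERSION%%", "v1.4.0", template)
print(stamped)  # → Description: AWS MLOps Framework. Version v1.4.0
```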

# Run 'cdk synth' for main templates to generate raw solution outputs
echo "cdk synth aws-mlops-single-account-framework --output=$staging_dist_dir"
cdk synth aws-mlops-single-account-framework --output=$staging_dist_dir
echo "cdk synth aws-mlops-multi-account-framework --output=$staging_dist_dir"
cdk synth aws-mlops-multi-account-framework --output=$staging_dist_dir
echo "cdk synth aws-mlops-single-account-framework --path-metadata false --version-reporting false --output=$staging_dist_dir"
cdk synth aws-mlops-single-account-framework --path-metadata false --version-reporting false --output=$staging_dist_dir
echo "cdk synth aws-mlops-multi-account-framework --path-metadata false --version-reporting false --output=$staging_dist_dir"
cdk synth aws-mlops-multi-account-framework --path-metadata false --version-reporting false --output=$staging_dist_dir

# Remove unnecessary output files
echo "cd $staging_dist_dir"
28 changes: 20 additions & 8 deletions source/app.py
@@ -56,26 +56,38 @@
batch_stack = BYOMBatchStack(
app,
"BYOMBatchStack",
description=(
f"({solution_id}byom-bt) - BYOM Batch Transform pipeline" f"in AWS MLOps Framework. Version {version}"
),
description=(f"({solution_id}byom-bt) - BYOM Batch Transform pipeline in AWS MLOps Framework. Version {version}"),
)

core.Aspects.of(batch_stack).add(AwsSDKConfigAspect(app, "SDKUserAgentBatch", solution_id, version))

model_monitor_stack = ModelMonitorStack(
data_quality_monitor_stack = ModelMonitorStack(
app,
"DataQualityModelMonitorStack",
monitoring_type="DataQuality",
description=(f"({solution_id}byom-dqmm) - DataQuality Model Monitor pipeline. Version {version}"),
)

core.Aspects.of(data_quality_monitor_stack).add(
AwsSDKConfigAspect(app, "SDKUserAgentDataMonitor", solution_id, version)
)

model_quality_monitor_stack = ModelMonitorStack(
app,
"ModelMonitorStack",
description=(f"({solution_id}byom-mm) - Model Monitor pipeline. Version {version}"),
"ModelQualityModelMonitorStack",
monitoring_type="ModelQuality",
description=(f"({solution_id}byom-mqmm) - ModelQuality Model Monitor pipeline. Version {version}"),
)

core.Aspects.of(model_monitor_stack).add(AwsSDKConfigAspect(app, "SDKUserAgentMonitor", solution_id, version))
core.Aspects.of(model_quality_monitor_stack).add(
AwsSDKConfigAspect(app, "SDKUserAgentModelMonitor", solution_id, version)
)


realtime_stack = BYOMRealtimePipelineStack(
app,
"BYOMRealtimePipelineStack",
description=(f"({solution_id}byom-rip) - BYOM Realtime Inference Pipleline. Version {version}"),
description=(f"({solution_id}byom-rip) - BYOM Realtime Inference Pipeline. Version {version}"),
)

core.Aspects.of(realtime_stack).add(AwsSDKConfigAspect(app, "SDKUserAgentRealtime", solution_id, version))
Binary file modified source/architecture-option-2.png
32 changes: 24 additions & 8 deletions source/lambdas/pipeline_orchestration/index.py
@@ -14,6 +14,8 @@
from json import JSONEncoder
import os
import datetime
from botocore.client import BaseClient
from typing import Dict, Any, List, Union
from shared.wrappers import BadRequest, api_exception_handler
from shared.logger import get_logger
from shared.helper import get_client
@@ -45,7 +47,7 @@ def default(self, obj):
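
The `default(self, obj)` override above follows the standard `json.JSONEncoder` extension pattern. A self-contained sketch of that pattern (the datetime handling is an assumption suggested by the file's `datetime` import, not a copy of the solution's encoder):

```python
import datetime
import json

class DateTimeEncoder(json.JSONEncoder):
    # json.dumps cannot serialize datetime objects natively; this override
    # converts them to ISO 8601 strings and defers everything else.
    def default(self, obj):
        if isinstance(obj, (datetime.date, datetime.datetime)):
            return obj.isoformat()
        return super().default(obj)

encoded = json.dumps({"ts": datetime.datetime(2021, 9, 28, 12, 0)}, cls=DateTimeEncoder)
print(encoded)  # → {"ts": "2021-09-28T12:00:00"}
```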


@api_exception_handler
def handler(event, context):
def handler(event: Dict[str, Any], context: Dict[str, Any]) -> Dict[str, Any]:
if "httpMethod" in event and event["httpMethod"] == "POST": # Lambda is being invoked from API Gateway
if event["path"] == "/provisionpipeline":
return provision_pipeline(json.loads(event["body"]))
@@ -57,12 +59,16 @@ def handler(event, context):
return provision_pipeline(event)
else:
raise BadRequest(
"Bad request format. Expected httpMethod or pipeline_type, recevied none. Check documentation "
"Bad request format. Expected httpMethod or pipeline_type, received none. Check documentation "
+ "for API & config formats."
)
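
The handler's dispatch logic reduces to a small routing table. A runnable sketch follows; the `/pipelinestatus` path is an assumption based on the `pipeline_status` function later in the file, and the return values are placeholders for the real function calls.

```python
from typing import Any, Dict

def route(event: Dict[str, Any]) -> str:
    # Mirrors the dispatch above: API Gateway invocations carry httpMethod
    # and path; direct invocations are identified by pipeline_type.
    if event.get("httpMethod") == "POST":
        if event.get("path") == "/provisionpipeline":
            return "provision_pipeline"
        if event.get("path") == "/pipelinestatus":  # assumed path
            return "pipeline_status"
    if "pipeline_type" in event:
        return "provision_pipeline"
    raise ValueError(
        "Bad request format. Expected httpMethod or pipeline_type, received none."
    )

print(route({"httpMethod": "POST", "path": "/provisionpipeline"}))  # → provision_pipeline
```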


def provision_pipeline(event, client=cloudformation_client, s3_client=s3_client):
def provision_pipeline(
event: Dict[str, Any],
client: BaseClient = cloudformation_client,
s3_client: BaseClient = s3_client,
) -> Dict[str, Any]:
"""
provision_pipeline takes the lambda event object and creates a cloudformation stack
@@ -109,7 +115,7 @@ def provision_pipeline(event, client=cloudformation_client, s3_client=s3_client)
codepipeline_params = get_codepipeline_params(
is_multi_account, provisioned_pipeline_stack_name, template_zip_name, template_file_name
)
# format the params (the format is the same for multi-accouunt parameters)
# format the params (the format is the same for multi-account parameters)
formatted_codepipeline_params = format_template_parameters(codepipeline_params, "True")
# create the codepipeline
stack_response = create_codepipeline_stack(
@@ -143,7 +149,12 @@ def provision_pipeline(event, client=cloudformation_client, s3_client=s3_client)
return response


def update_stack(codepipeline_stack_name, pipeline_template_url, template_parameters, client):
def update_stack(
codepipeline_stack_name: str,
pipeline_template_url: str,
template_parameters: List[Dict[str, str]],
client: BaseClient,
) -> Dict[str, str]:
try:
update_response = client.update_stack(
StackName=codepipeline_stack_name,
@@ -171,8 +182,11 @@ def update_stack(codepipeline_stack_name, pipeline_template_url, template_parame


def create_codepipeline_stack(
codepipeline_stack_name, pipeline_template_url, template_parameters, client=cloudformation_client
):
codepipeline_stack_name: str,
pipeline_template_url: str,
template_parameters: List[Dict[str, str]],
client: BaseClient = cloudformation_client,
) -> Dict[str, str]:
try:
stack_response = client.create_stack(
StackName=codepipeline_stack_name,
@@ -204,7 +218,9 @@ def create_codepipeline_stack(
raise e


def pipeline_status(event, cfn_client=cloudformation_client, cp_client=codepipeline_client):
def pipeline_status(
event: Dict[str, Any], cfn_client: BaseClient = cloudformation_client, cp_client: BaseClient = codepipeline_client
) -> Dict[str, Any]:
"""
pipeline_status takes the lambda event object and returns the status of codepipeline project that's
running the pipeline
