Aceaas and minikube updates (#51)

* Update ACEaaS pipeline picture

Signed-off-by: Trevor Dolby <[email protected]>

* Fix naming and add comments

Signed-off-by: Trevor Dolby <[email protected]>

* Update picture with new names

Signed-off-by: Trevor Dolby <[email protected]>

* Containers update

Signed-off-by: Trevor Dolby <[email protected]>

* Switch containers

Signed-off-by: Trevor Dolby <[email protected]>

* Libpath fix

Signed-off-by: Trevor Dolby <[email protected]>

* Change default image

Signed-off-by: Trevor Dolby <[email protected]>

* Add container version comments

Signed-off-by: Trevor Dolby <[email protected]>

* Backslash comments

Signed-off-by: Trevor Dolby <[email protected]>

* Tekton name generation

Signed-off-by: Trevor Dolby <[email protected]>

* Tekton dashboard

Signed-off-by: Trevor Dolby <[email protected]>

* Fix image link

Signed-off-by: Trevor Dolby <[email protected]>

* Update pipelines diagram

Signed-off-by: Trevor Dolby <[email protected]>

* Fix wording and diagrams

Signed-off-by: Trevor Dolby <[email protected]>

* Picture changes

Signed-off-by: Trevor Dolby <[email protected]>

* Minikube and CP4i updates

Signed-off-by: Trevor Dolby <[email protected]>

* Fix link

Signed-off-by: Trevor Dolby <[email protected]>

* Update pipeline

Signed-off-by: Trevor Dolby <[email protected]>

* Minikube fixes and doc updates

Signed-off-by: Trevor Dolby <[email protected]>

* Add links

* Update docs

Signed-off-by: Trevor Dolby <[email protected]>

* Doc and version fixup

Signed-off-by: Trevor Dolby <[email protected]>

* Remove extraneous typescript file

Signed-off-by: Trevor Dolby <[email protected]>

---------

Signed-off-by: Trevor Dolby <[email protected]>
trevor-dolby-at-ibm-com authored Apr 9, 2024
1 parent f07980e commit 075abc0
Showing 31 changed files with 698 additions and 184 deletions.
4 changes: 2 additions & 2 deletions Jenkinsfile
@@ -1,7 +1,7 @@
pipeline {
agent { docker {
/* image 'cp.icr.io/cp/appc/ace:12.0.11.0-r1' */
image 'ace-minimal:12.0.11.0-alpine'
image 'cp.icr.io/cp/appc/ace:12.0.11.0-r1'
/* image 'ace-minimal:12.0.11.0-alpine' */
args '-e LICENSE=accept --entrypoint ""'
} }
parameters {
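
The Jenkinsfile change above makes the entitled `cp.icr.io/cp/appc/ace:12.0.11.0-r1` image the default build agent, so the Jenkins host must be able to pull from the IBM entitled registry. A minimal sketch of checking that outside Jenkins, assuming an IBM Entitlement Key is available (the key variable below is a placeholder):

```bash
# Log in to the entitled registry; the username is "cp" and the password is the entitlement key.
# ENTITLEMENT_KEY is a placeholder for a key from the IBM container software library.
docker login cp.icr.io -u cp -p "${ENTITLEMENT_KEY}"

# Run a throwaway container with the same image and arguments the Jenkins agent uses,
# and print the ACE version to confirm the product commands are present.
docker run --rm -e LICENSE=accept --entrypoint "" \
  cp.icr.io/cp/appc/ace:12.0.11.0-r1 \
  bash -c '. /opt/ibm/ace-12/server/bin/mqsiprofile && mqsiservice -v'
```
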
2 changes: 1 addition & 1 deletion Jenkinsfile.windows
@@ -12,7 +12,7 @@ pipeline {
string(name: 'integrationServerName', defaultValue: 'default', description: 'Integration server name')
}
environment {
ACE_COMMAND = "C:\\Program Files\\IBM\\ACE\\12.0.10.0\\ace"
ACE_COMMAND = "C:\\Program Files\\IBM\\ACE\\12.0.11.0\\ace"
CT_JDBC = credentials('CT_JDBC')
}
stages {
60 changes: 36 additions & 24 deletions README.md
@@ -3,69 +3,81 @@
Demo pipeline for ACE to show how ACE solutions can be built in CI/CD pipelines using standard
tools. The main focus is on how to use existing ACE capabilities in a pipeline, with the application
being constructed to show pipeline-friendliness rather than being a "best practice" application.
As part of this, the pipeline scripts are stored in this repo along with the application source
to make the demo simpler, while in practice they would often be stored separately.

The overall goal is to deploy a REST application to an ACE integration server:
The overall goal is to deploy a REST HTTP application to an ACE integration server:

![Pipeline high-level](/demo-infrastructure/images/pipeline-high-level.png)

The application used to demonstrate the pipeline consists of a REST API that accepts JSON and interacts
with a database, with a supporting shared library containing a lot of the code. It is designed around
indexing different types of tea, storing the name and strength of the tea and assigning a unique integer
id to each type so that it can be retrieved later. Audit data is logged as XML for each operation performed.
with a database via JDBC, with a supporting shared library containing a lot of the code (hereafter
referred to as the "Tea REST application"). It is designed around indexing different types of tea, storing
the name and strength of the tea along with assigning a unique integer id to each type so that it can be
retrieved later. Audit data is logged as XML for each operation performed.

As this application exists to help demonstrate pipelines and how they work with ACE, there are some shortcuts
in the code that would not normally be present in a production-ready application: the database table is
created on-demand to make setup easier, the logging goes to the console instead of an audit service, etc.

Testing is split into “Unit Test” and "Component Test” categories, where "unit tests" are self-contained
and do not connect to external services (so they can run reliably anywhere) while the term “component test”
was used in the ACE product development pipeline to mean “unit tests that use external services”. See
[ACE unit and component tests](https://community.ibm.com/community/user/integration/blogs/trevor-dolby/2023/03/20/app-connect-enterprise-ace-unit-and-component-test)
for a discussion of the difference between test styles in integration.
## Recent changes

- Minikube added as the default "plain Kubernetes" option.
- ACE-as-a-Service added as a deploy target (see below).

## Technology and target options

This repo can be built in several different ways, and can deploy to different targets (see
[Getting started](#getting-started)) for suggestions on how to choose):
[Getting started](#getting-started) for suggestions on how to choose a target) from the same
source as shown in this diagram:

![Pipeline overview](/demo-infrastructure/images/pipelines-overview.jpg)

Testing is split into "Unit Test" and "Component Test" categories, where "unit tests" are self-contained
and do not connect to external services such as databases (so they can run reliably anywhere) while the
term "component test" was used in the ACE product development pipeline to mean "unit tests that use external
services (such as databases)". See
[ACE unit and component tests](https://community.ibm.com/community/user/integration/blogs/trevor-dolby/2023/03/20/app-connect-enterprise-ace-unit-and-component-test)
for a discussion of the difference between test styles in integration.

Pipeline technology options currently include:

- [GitHub Actions](https://docs.github.com/en/actions/learn-github-actions/understanding-github-actions)
for CI build and test before PRs are merged. This requires a GitHub instance that supports actions
(not all Enterprise variants do), and credit enough to run the actions. There is currently no
for CI build and test before pull requests (PRs) are merged. This requires a GitHub instance that supports
actions (not all Enterprise variants do), and credit enough to run the actions. There is currently no
component testing nor a deploy target (though these could be added) for these builds.
- [Tekton](https://tekton.dev/docs/concepts/overview/) can be used to build, test, and deploy the Tea
REST application to runtime infrastructure. Tekton is the basis for many Kubernetes application build
pipelines and also underpins RedHat OpenShift Pipelines.
REST application to ACE runtime infrastructure such as Kubernetes containers. Tekton is the basis for
many Kubernetes application build pipelines and also underpins RedHat OpenShift Pipelines.
- [Jenkins](https://www.jenkins.io/) can be used to build, test, and deploy the Tea REST application
to runtime infrastructure. Jenkins is widely used in many organizations for build and deployment.
to ACE runtime infrastructure such as integration nodes. Jenkins is widely used in many organizations
for build and deployment.

ACE deploy targets currently include:

- Kubernetes containers, with both standalone ACE containers and ACE certified containers (via the
ACE operator) as possible runtimes. [Minikube](https://minikube.sigs.k8s.io/docs/) (easily installed
ACE operator code) as possible runtimes. [Minikube](https://minikube.sigs.k8s.io/docs/) (easily installed
locally) and OpenShift can be used with the former, while the latter expects to deploy to the Cloud
Pak for Integration (CP4i). See [tekton/README.md#container-deploy-target](tekton/README.md#container-deploy-target)
for a description of the container deploy pipelines.
- [ACE-as-a-Service](https://www.ibm.com/docs/en/app-connect/12.0?topic=app-connect-enterprise-as-service)
(ACEaaS) running on AWS. This option requires an instance (which can be a trial instance) of ACEaaS to
be available but does not require any software to be installed and the flows run entirely in the cloud.
See [demo-infrastructure/README-aceaas-pipelines.md](demo-infrastructure/README-aceaas-pipelines.md)
(ACEaaS) running on Amazon Web Services (AWS). This option requires an instance (which can be a trial instance)
of ACEaaS to be available but does not require ACE servers to be managed directly (in virtual machines or containers)
as the flows run entirely in the cloud. See [demo-infrastructure/README-aceaas-pipelines.md](demo-infrastructure/README-aceaas-pipelines.md)
for an overview of the pipelines deploying to ACEaaS.
- An ACE integration node, using an existing ACE integration node.

As can be seen from the diagram above, not all deployment options have been configured for all of
the pipeline options, but more could be added as needed.
As can be seen from the diagram above, not all deployment targets have been configured for all of
the pipeline technology options, but more could be added as needed.

As well as multiple options for pipelines and deploy targets, multiple build tools can be used to
build and test the application in the pipeline and locally:
build the ACE flows, Java code, Maps, etc and test the application in the pipeline and locally:

- Standard ACE commands introduced at v12 (such as ibmint).
- Standard ACE commands introduced at v12 (such as ibmint) can be used to build, deploy, and test
the application.
- Maven can also be used, and was the default in the ACE v11 version of this repo.
- Gradle can be used to run builds and unit tests, but has not been enabled for component tests.
- The toolkit can build and run the application and tests.
- The toolkit can build and run the application and tests, and can also be used to check source into the GitHub repo.

## Getting started

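
The README text above lists the v12 `ibmint` commands as one of the build tool options; as an illustration, a local build-and-unit-test sequence might look like the following sketch (the project and test project names are illustrative placeholders rather than values taken from this repo):

```bash
# Make the ACE 12 commands (ibmint, IntegrationServer) available in this shell.
. /opt/ibm/ace-12/server/bin/mqsiprofile

# Compile the application, its shared library, and the unit tests from the checked-out
# workspace into a server work directory.
ibmint deploy --input-path . --output-work-directory /tmp/ace-server \
  --project TeaApplication --project TeaSharedLibrary --project TeaApplication_UnitTest

# Start the server with message flows stopped so only the tests run, then report the results.
IntegrationServer --work-dir /tmp/ace-server --start-msgflows false \
  --test-project TeaApplication_UnitTest
```
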
31 changes: 23 additions & 8 deletions demo-infrastructure/Jenkinsfile.aceaas
@@ -1,7 +1,8 @@
pipeline {
agent { docker {
/* See README-jenkins.md for image discussion */
image 'cp.icr.io/cp/appc/ace:12.0.11.0-r1'
/* image 'ace-minimal:12.0.11.0-alpine' */
/* image 'ace-minimal-build:12.0.11.0-alpine' */
args '-e LICENSE=accept --entrypoint ""'
} }
parameters {
@@ -12,6 +13,7 @@ pipeline {
string(name: 'serverName', defaultValue: '19af6446-6171-4641-8aba-9dcff8e1b6ff.c1ogj3sd0tgtu0lqde00.databases.appdomain.cloud', description: 'JDBC database host')
string(name: 'portNumber', defaultValue: '30699', description: 'JDBC database port')
string(name: 'deployPrefix', defaultValue: 'tdolby', description: 'ACEaaS artifact prefix')
/* Could put the endpoint in as a credential, but it's not really secret . . . */
string(name: 'APPCON_ENDPOINT', defaultValue: 'api.p-vir-c1.appconnect.automation.ibm.com', description: 'ACEaaS endpoint hostname')
booleanParam(name: 'DEPLOY_CONFIGURATION', defaultValue: false, description: 'Create policies, runtime, etc')
}
@@ -107,7 +109,7 @@ pipeline {
}
}

stage('Next stage deploy') {
stage('Push BAR file') {
steps {
sh "echo ${params.APPCON_ENDPOINT} > /tmp/APPCON_ENDPOINT"
sh "echo ${params.deployPrefix} > /tmp/deployPrefix"
@@ -116,9 +118,9 @@ pipeline {
# Set HOME to somewhere writable by Maven
export HOME=/tmp

export LICENSE=accept
. /opt/ibm/ace-12/server/bin/mqsiprofile
# Fix for ace-minimal-build and curl
unset LD_LIBRARY_PATH

set -e # Fail on error - this must be done after the profile in case the container has the profile loaded already

echo "########################################################################"
@@ -135,6 +137,10 @@ pipeline {
--data "{\\"apiKey\\": \\"${APPCON_API_KEY}\\"}" --output /tmp/token-output.txt
cat /tmp/token-output.txt | tr -d '{}"' | tr ',' '\n' | grep access_token | sed 's/access_token://g' > /tmp/APPCON_TOKEN

echo "########################################################################"
echo "# PUTting BAR file to ACE service"
echo "########################################################################" && echo

curl -X PUT https://`cat /tmp/APPCON_ENDPOINT`/api/v1/bar-files/`cat /tmp/deployPrefix`-tea-jenkins \
-H "x-ibm-instance-id: ${APPCON_INSTANCE_ID}" -H "Content-Type: application/octet-stream" \
-H "Accept: application/json" -H "X-IBM-Client-Id: ${APPCON_CLIENT_ID}" -H "authorization: Bearer `cat /tmp/APPCON_TOKEN`" \
@@ -150,7 +156,7 @@ pipeline {
}
}

stage('Create configuration') {
stage('Create Configurations and IR') {
when {
expression {
return params.DEPLOY_CONFIGURATION
@@ -168,7 +174,7 @@ pipeline {
export LICENSE=accept
. /opt/ibm/ace-12/server/bin/mqsiprofile

#set -e # Fail on error - this must be done after the profile in case the container has the profile loaded already
set -e # Fail on error - this must be done after the profile in case the container has the profile loaded already

echo ========================================================================
echo Creating `cat /tmp/deployPrefix`-jdbc-policies configuration
@@ -181,10 +187,14 @@ pipeline {
(cd /tmp && /opt/ibm/ace-12/common/jdk/bin/jar cvf /tmp/JDBCPolicies.zip JDBCPolicies)
cat /tmp/JDBCPolicies.zip | base64 -w 0 > /tmp/JDBCPolicies.zip.base64

# Fix for ace-minimal-build and curl
unset LD_LIBRARY_PATH

cp tekton/aceaas/create-configuration-template.json /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_NAME/`cat /tmp/deployPrefix`-jdbc-policies/g" /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_TYPE/policyproject/g" /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_DESCRIPTION/`cat /tmp/deployPrefix` JDBCPolicies project/g" /tmp/jdbc-policies-configuration.json
# Backslash issues with groovy scripting - the effect is to escape / characters in the base64 data to avoid issues with sed
sed -i "s/TEMPLATE_BASE64DATA/`cat /tmp/JDBCPolicies.zip.base64 | sed 's/\\//\\\\\\\\\\\\//g'`/g" /tmp/jdbc-policies-configuration.json
#cat /tmp/jdbc-policies-configuration.json

@@ -202,6 +212,7 @@ pipeline {
sed -i "s/TEMPLATE_NAME/`cat /tmp/deployPrefix`-jdbc-setdbparms/g" /tmp/jdbc-setdbparms-configuration.json
sed -i "s/TEMPLATE_TYPE/setdbparms/g" /tmp/jdbc-setdbparms-configuration.json
sed -i "s/TEMPLATE_DESCRIPTION/`cat /tmp/deployPrefix` JDBC credentials/g" /tmp/jdbc-setdbparms-configuration.json
# Backslash issues with groovy scripting - the effect is to escape / characters in the base64 data to avoid issues with sed
sed -i "s/TEMPLATE_BASE64DATA/`cat /tmp/jdbc-setdbparms.base64 | sed 's/\\//\\\\\\\\\\\\//g'`/g" /tmp/jdbc-setdbparms-configuration.json
#cat /tmp/jdbc-setdbparms-configuration.json

@@ -219,6 +230,7 @@ pipeline {
sed -i "s/TEMPLATE_NAME/`cat /tmp/deployPrefix`-default-policy-project/g" /tmp/default-policy-project-configuration.json
sed -i "s/TEMPLATE_TYPE/serverconf/g" /tmp/default-policy-project-configuration.json
sed -i "s/TEMPLATE_DESCRIPTION/`cat /tmp/deployPrefix` default policy project for JDBC/g" /tmp/default-policy-project-configuration.json
# Backslash issues with groovy scripting - the effect is to escape / characters in the base64 data to avoid issues with sed
sed -i "s/TEMPLATE_BASE64DATA/`cat /tmp/default-policy-project.base64 | sed 's/\\//\\\\\\\\\\\\//g'`/g" /tmp/default-policy-project-configuration.json
#cat /tmp/default-policy-project-configuration.json

@@ -228,10 +240,11 @@ pipeline {
--data-binary @/tmp/default-policy-project-configuration.json

echo ========================================================================
echo Creating IR JSON
echo Creating IntegrationRuntime JSON
echo ========================================================================
cp tekton/aceaas/create-integrationruntime-template.json /tmp/create-integrationruntime.json
sed -i "s/TEMPLATE_NAME/`cat /tmp/deployPrefix`-tea-jenkins-ir/g" /tmp/create-integrationruntime.json
# Backslash issues with groovy scripting - the effect is to escape / characters in the base64 data to avoid issues with sed
sed -i "s/TEMPLATE_BARURL/`cat /tmp/BARURL | sed 's/\\//\\\\\\\\\\\\//g'`/g" /tmp/create-integrationruntime.json
sed -i "s/TEMPLATE_POLICYPROJECT/`cat /tmp/deployPrefix`-jdbc-policies/g" /tmp/create-integrationruntime.json
sed -i "s/TEMPLATE_SERVERCONF/`cat /tmp/deployPrefix`-default-policy-project/g" /tmp/create-integrationruntime.json
@@ -254,5 +267,7 @@ pipeline {
APPCON_CLIENT_ID = credentials('APPCON_CLIENT_ID')
APPCON_CLIENT_SECRET = credentials('APPCON_CLIENT_SECRET')
APPCON_API_KEY = credentials('APPCON_API_KEY')
/* Could put the endpoint in as a credential, but it's not really secret . . . */
/* APPCON_ENDPOINT = credentials('APPCON_ENDPOINT') */
}
}
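
The "Push BAR file" stage in the diff above drives the ACEaaS public API directly with curl. Stripped of the Jenkins plumbing, the flow is roughly the sketch below; the token endpoint path and header set are assumptions based on the calls visible in the diff, and all credential values are placeholders:

```bash
# Instance endpoint and credentials; every value here is a placeholder.
APPCON_ENDPOINT=api.p-vir-c1.appconnect.automation.ibm.com
APPCON_INSTANCE_ID=<instance-id>
APPCON_CLIENT_ID=<client-id>
APPCON_CLIENT_SECRET=<client-secret>
APPCON_API_KEY=<api-key>
deployPrefix=tdolby

# Exchange the API key for a short-lived access token (endpoint path assumed).
TOKEN=$(curl -s -X POST "https://${APPCON_ENDPOINT}/api/v1/tokens" \
  -H "x-ibm-instance-id: ${APPCON_INSTANCE_ID}" \
  -H "X-IBM-Client-Id: ${APPCON_CLIENT_ID}" \
  -H "X-IBM-Client-Secret: ${APPCON_CLIENT_SECRET}" \
  -H "Content-Type: application/json" -H "Accept: application/json" \
  --data "{\"apiKey\": \"${APPCON_API_KEY}\"}" \
  | tr -d '{}"' | tr ',' '\n' | grep access_token | sed 's/access_token://g')

# PUT the built BAR file; the response contains the BAR URL used later when creating
# the IntegrationRuntime. The local BAR file name here is a placeholder.
curl -X PUT "https://${APPCON_ENDPOINT}/api/v1/bar-files/${deployPrefix}-tea-jenkins" \
  -H "x-ibm-instance-id: ${APPCON_INSTANCE_ID}" \
  -H "X-IBM-Client-Id: ${APPCON_CLIENT_ID}" \
  -H "authorization: Bearer ${TOKEN}" \
  -H "Content-Type: application/octet-stream" -H "Accept: application/json" \
  --data-binary @tea-application.bar
```
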
21 changes: 14 additions & 7 deletions demo-infrastructure/README-jenkins.md
@@ -31,16 +31,20 @@ For Windows, the ACE_COMMAND environment variable may need to be changed to match
version of ACE (currently set to 12.0.10). Container support is not required.

For Linux, the pipeline will use containers for the actual build steps, and this requires either
the `ace` container image from cp.icr.io or the ace-minimal build container image to be created
the `ace` container image from cp.icr.io or the `ace-minimal-build` container image to be created
first (for users without an IBM Entitlement Key). The use of a container to run ACE commands ensures
that the Jenkins environment (for example, Java level) does not affect ACE commands, and ensures
a consistent environment for building ACE artifacts. See the [ace-minimal-build](/demo-infrastructure/docker/ace-minimal-build)
directory for information on building the image, or [Obtaining an IBM App Connect Enterprise
server image](https://www.ibm.com/docs/en/app-connect/12.0?topic=cacerid-building-sample-app-connect-enterprise-image-using-docker#aceimages__title__1)
to download the `ace` image. The Jenkinsfile will need to be updated to use the correct image.

Once those values have been updated and containers built if needed, then the pipeline can be
constructed, but it may be a good idea to change "GitHub API usage" under "System
The ACE version does not have to match the exact modification level (12.0.X) of the deploy target
(integration node or ACEaaS) but keeping build containers up-to-date is a good idea in general in
order to benefit from fixes and new capabilities.

Once the Jenkinsfile values have been updated and containers built if needed, then the pipeline
can be constructed, but it may be a good idea to change "GitHub API usage" under "System
Configuration" -> "System" in the Jenkins settings as otherwise messages such as the
following may appear regularly:
```
@@ -151,17 +155,17 @@ curl -X POST --data '{"name": "Assam", "strength": 5}' http://localhost:7800/tea
## ACE-as-a-Service target

See [README-aceaas-pipelines.md](README-aceaas-pipelines.md) for a general overview. The
Jenkins pipeline for ACEaaS looks as follows, with the (optional) "Create configuration"
Jenkins pipeline for ACEaaS looks as follows, with the (optional) "Create Configurations and IR"
shown as running only for the initial build:

![Pipeline overview](/demo-infrastructure/images/jenkins-aceaas-pipeline.png)

Similar to the integration node pipeline, the following values should be changed in
[Jenkinsfile.aceaas](/demo-infrastructure/Jenkinsfile.aceaas):
[Jenkinsfile.aceaas](/demo-infrastructure/Jenkinsfile.aceaas) or set when running the build:

- deployPrefix, which is used as a prefix for the various configurations to avoid conflicts on shared services.
- APPCON_ENDPOINT, which is the API endpoint
- DEPLOY_CONFIGURATION, which defaults to `false` but should be set to `true` for the initial configuration creation.
- APPCON_ENDPOINT, which is the API endpoint.
- DEPLOY_CONFIGURATION, which enables the "Create Configurations and IR" stage and defaults to `false` but should be set to `true` for the initial configuration creation.

To create the pipeline (and following the Jenkins pipeline tour instructions), a "multibranch
pipeline" should be created and pointed at the github repo. This pipeline must refer to
@@ -177,6 +181,9 @@ for details on how to create the correct credentials, and then set the following
- APPCON_CLIENT_SECRET is the client secret created from the "Public API credentials" section of the ACEaaS dashboard
- APPCON_API_KEY is the API key created from the ACEaaS dashboard

The pipeline could be changed to store APPCON_ENDPOINT as a credential as well (similar to the
Tekton equivalent), but the URL is not secret so in the default case it is provided in the Jenkinsfile.

The pipeline should create the required configurations based on the JDBC credentials
and other values if the DEPLOY_CONFIGURATION is set to `true`; this should only be used
for the first pipeline run or after any change to the credentials (see the "ACEaaS API rate
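
The "Create Configurations and IR" stage described above uploads each configuration as base64 data inside a JSON template, and the repeated "Backslash issues with groovy scripting" comments in Jenkinsfile.aceaas exist because that base64 text is substituted with `sed`, so any `/` characters in it have to be escaped first (with the backslashes doubled again for Groovy). A standalone sketch of the same step outside Groovy, using the policy project packaging shown in the diff (deploy prefix assumed to be "tdolby"):

```bash
# Package the JDBC policy project and encode it as a single base64 line.
(cd /tmp && /opt/ibm/ace-12/common/jdk/bin/jar cvf /tmp/JDBCPolicies.zip JDBCPolicies)
base64 -w 0 /tmp/JDBCPolicies.zip > /tmp/JDBCPolicies.zip.base64

# Escape "/" characters so the base64 data can be used safely as a sed replacement string;
# in the Jenkinsfile the same escaping needs extra backslashes because Groovy consumes one level.
ESCAPED_DATA=$(sed 's/\//\\\//g' /tmp/JDBCPolicies.zip.base64)

# Fill in the configuration template used by the pipeline.
cp tekton/aceaas/create-configuration-template.json /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_NAME/tdolby-jdbc-policies/g" /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_TYPE/policyproject/g" /tmp/jdbc-policies-configuration.json
sed -i "s/TEMPLATE_BASE64DATA/${ESCAPED_DATA}/g" /tmp/jdbc-policies-configuration.json
```
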