Commit

Initial commit
Signed-off-by: Dale Lane <[email protected]>
dalelane committed Dec 11, 2023
1 parent 4b6bf72 commit bd81ad0
Showing 291 changed files with 19,858 additions and 0 deletions.
18 changes: 18 additions & 0 deletions .gitignore
@@ -0,0 +1,18 @@
# Maven
target/

# macOS
.DS_Store

# Eclipse
.classpath
.settings/
.project

# IntelliJ IDEA
.idea/
*.iml
*.iws

# Visual Studio Code
.vscode/
67 changes: 67 additions & 0 deletions README.md
@@ -0,0 +1,67 @@
# kafka-connect-xml-converter

A Kafka Connect plugin to make it easier to work with XML data in Kafka Connect pipelines.

## Contents

- `com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter`
    - a Kafka Connect converter for converting to/from XML strings
- `com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlTransformation`
    - a Kafka Connect transformation for converting Kafka Connect records to/from XML strings
- `com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlMQRecordBuilder`
    - an MQ Source Record builder for parsing MQ messages containing XML strings

## Configuration

Optional configuration that can be set when using the plugin to turn XML strings into Connect records:

| **Option** | **Default value** | **Notes** |
|------------------------|-------------------|------------------------------------------------------------------|
| `root.element.name` | `root` | The name of the root element in the XML document being parsed. |
| `xsd.schema.path` | | Location of a schema file to use to parse the XML string. |
| `xml.doc.flat.enable`  | `false`           | Set to `true` if the XML strings contain a single value (e.g. `<root>the message</root>`). |
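
As an illustration, with the default settings an XML string like the following (a hypothetical message, not taken from the project docs) would be parsed into a structured Connect record with `customer` and `total` fields:

```xml
<root>
    <customer>Joe Bloggs</customer>
    <total>12.99</total>
</root>
```

Setting `root.element.name` would only need to change if the outer element is named something other than `root`.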

Optional configuration that can be set when using the plugin to create XML strings from Connect records:

| **Option** | **Default value** | **Notes** |
|------------------------|-------------------|------------------------------------------------------------------|
| `root.element.name` | `root` | The name to use for the root element of the XML document being created. Only used when no name can be found within the schema of the Connect record. |

## Example uses

Use **`XmlConverter`** with Source Connectors to produce structured Connect records to Kafka topics as XML strings.

```
value.converter=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
value.converter.schemas.enable=false
```
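
For example, a structured Connect record with `id` and `type` fields (hypothetical data; the exact element names and ordering depend on the record's schema) would be written to the Kafka topic as something like:

```xml
<root>
    <id>ABCD1234</id>
    <type>Something impressive</type>
</root>
```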

Use **`XmlConverter`** with Source Connectors to produce Connect records to Kafka topics as XML strings with an embedded XSD schema (requires structured records).

```
value.converter=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
value.converter.schemas.enable=true
```

Use **`XmlTransformation`** with Sink Connectors to convert a Connect record containing an XML string into a structured Connect record.

```
transforms=xmlconvert
transforms.xmlconvert.type=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlTransformation
transforms.xmlconvert.converter.type=value
```

Use **`XmlConverter`** with the MQ Sink Connector to send non-XML Kafka messages to MQ queues as XML strings.

```
mq.message.builder=com.ibm.eventstreams.connect.mqsink.builders.ConverterMessageBuilder
mq.message.builder.value.converter=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
```

Use **`XmlMQRecordBuilder`** with the MQ Source Connector to convert XML strings from MQ queues into Connect records.

```
mq.record.builder=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlMQRecordBuilder
mq.record.builder.schemas.enable=true
mq.record.builder.xsd.schema.path=/location/of/mq-message-schema.xsd
```
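As a sketch of what `mq.record.builder.xsd.schema.path` could point at, here is a minimal hypothetical XSD for a simple two-field message (not taken from the project; a real schema would describe your own MQ message format):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <!-- root element name should match root.element.name (default: "root") -->
    <xs:element name="root">
        <xs:complexType>
            <xs:sequence>
                <xs:element name="id" type="xs:string"/>
                <xs:element name="quantity" type="xs:integer"/>
            </xs:sequence>
        </xs:complexType>
    </xs:element>
</xs:schema>
```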
Binary file added examples/diagrams/converter-datagen.png
Binary file added examples/diagrams/converter-json.png
Binary file added examples/diagrams/converter-stockprices.png
Binary file added examples/diagrams/recordbuilder-sink.png
Binary file added examples/diagrams/recordbuilder-source-avro.png
Binary file added examples/diagrams/recordbuilder-source-json.png
Binary file added examples/diagrams/sink-pipeline.png
Binary file added examples/diagrams/source-pipeline.png
Binary file added examples/diagrams/transformer-sink.png
Binary file added examples/diagrams/transformer-source.png
17 changes: 17 additions & 0 deletions examples/eventstreams/credentials.yaml
@@ -0,0 +1,17 @@
kind: Secret
apiVersion: v1
metadata:
  name: alphavantage
type: Opaque
data:
  apikey: UkVEQUNURUQ=

---

kind: Secret
apiVersion: v1
metadata:
  name: weather
type: Opaque
data:
  appid: UkVEQUNURUQ=
21 changes: 21 additions & 0 deletions examples/eventstreams/datagen-json.yaml
@@ -0,0 +1,21 @@
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: kafka-datagen-json
  labels:
    eventstreams.ibm.com/cluster: kafka-connect-cluster
spec:
  class: com.ibm.eventautomation.demos.loosehangerjeans.DatagenSourceConnector
  tasksMax: 1
  config:
    key.converter: org.apache.kafka.connect.storage.StringConverter
    key.converter.schemas.enable: false
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false

    topic.name.orders: ORDERS.NEW.JSON
    topic.name.cancellations: CANCELLATIONS.JSON
    topic.name.stockmovements: STOCK.MOVEMENT.JSON
    topic.name.badgeins: DOOR.BADGEIN.JSON
    topic.name.newcustomers: CUSTOMERS.NEW.JSON
    topic.name.sensorreadings: SENSOR.READINGS.JSON
21 changes: 21 additions & 0 deletions examples/eventstreams/datagen-xml.yaml
@@ -0,0 +1,21 @@
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: kafka-datagen-xml
  labels:
    eventstreams.ibm.com/cluster: kafka-connect-cluster
spec:
  class: com.ibm.eventautomation.demos.loosehangerjeans.DatagenSourceConnector
  tasksMax: 1
  config:
    key.converter: org.apache.kafka.connect.storage.StringConverter
    key.converter.schemas.enable: false
    value.converter: com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
    value.converter.schemas.enable: false

    topic.name.orders: ORDERS.NEW.XML
    topic.name.cancellations: CANCELLATIONS.XML
    topic.name.stockmovements: STOCK.MOVEMENT.XML
    topic.name.badgeins: DOOR.BADGEIN.XML
    topic.name.newcustomers: CUSTOMERS.NEW.XML
    topic.name.sensorreadings: SENSOR.READINGS.XML
11 changes: 11 additions & 0 deletions examples/eventstreams/example-api-request.json
@@ -0,0 +1,11 @@
{
  "product": {
    "id": "ABCD1234",
    "type": "Something impressive"
  },
  "value": 12.99,
  "customer": {
    "id": "XXXXYYYY",
    "name": "Joe Bloggs"
  }
}
46 changes: 46 additions & 0 deletions examples/eventstreams/http-sink.yaml
@@ -0,0 +1,46 @@
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: http-json-to-xml
  labels:
    eventstreams.ibm.com/cluster: kafka-connect-cluster
spec:
  class: io.aiven.kafka.connect.http.HttpSinkConnector
  tasksMax: 1
  config:
    # the Kafka topic to consume from
    topics: TEST.MESSAGES

    # transform the JSON message contents to match the web service API before converting to XML
    transforms: drop,insert,rename,xmlconvert
    # remove one of the properties
    transforms.drop.type: org.apache.kafka.connect.transforms.ReplaceField$Value
    transforms.drop.blacklist: customer
    # insert a new property
    transforms.insert.type: org.apache.kafka.connect.transforms.InsertField$Value
    transforms.insert.static.field: origin
    transforms.insert.static.value: demo
    # rename one of the properties
    transforms.rename.type: org.apache.kafka.connect.transforms.ReplaceField$Value
    transforms.rename.renames: value:cost
    # convert the results into an XML string
    transforms.xmlconvert.type: com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlTransformation
    transforms.xmlconvert.converter.type: value
    transforms.xmlconvert.root.element.name: request

    # web service to submit the XML payload to
    http.url: http://test-web-service-event-automation.apps.dalelane.cp.fyre.ibm.com/api/webhooks/001
    http.authorization.type: none
    http.headers.content.type: text/xml

    # format of the messages to produce
    mq.message.body.jms: true
    mq.message.builder: com.ibm.eventstreams.connect.mqsink.builders.ConverterMessageBuilder
    mq.message.builder.value.converter: com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
    mq.message.builder.value.converter.root.element.name: msg

    # format of the messages to consume
    key.converter: org.apache.kafka.connect.storage.StringConverter
    key.converter.schemas.enable: false
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false
122 changes: 122 additions & 0 deletions examples/eventstreams/kafka-connect.yaml
@@ -0,0 +1,122 @@
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnect
metadata:
  name: kafka-connect-cluster
  annotations:
    eventstreams.ibm.com/use-connector-resources: 'true'
spec:
  build:
    output:
      image: image-registry.openshift-image-registry.svc:5000/event-automation/xml-processing-demo:0.0.1
      type: docker
    plugins:
      - name: xml-converter
        artifacts:
          - type: jar
            url: https://github.com/IBM-messaging/kafka-connect-xml-converter/releases/download/0.1.0/kafka-connect-xml-converter-0.1.0-jar-with-dependencies.jar
      - name: datagen
        artifacts:
          - type: jar
            url: https://github.com/IBM/kafka-connect-loosehangerjeans-source/releases/download/0.0.4/kafka-connect-loosehangerjeans-source-0.0.4-jar-with-dependencies.jar
      - name: http
        artifacts:
          - type: zip
            url: https://github.com/Aiven-Open/http-connector-for-apache-kafka/releases/download/v0.7.0/http-connector-for-apache-kafka-0.7.0.zip
      - name: stock-prices
        artifacts:
          - type: jar
            url: https://github.com/dalelane/kafka-connect-stockprice-source/releases/download/v0.0.3/kafka-connect-stockprice-source-connector.jar
      - name: mq-source
        artifacts:
          - type: jar
            url: https://github.com/ibm-messaging/kafka-connect-mq-source/releases/download/v1.3.4/kafka-connect-mq-source-1.3.4-jar-with-dependencies.jar
      - name: mq-sink
        artifacts:
          - type: jar
            url: https://github.com/ibm-messaging/kafka-connect-mq-sink/releases/download/v1.5.2/kafka-connect-mq-sink-1.5.2-jar-with-dependencies.jar
      - name: weather
        artifacts:
          - artifact: camel-weather-kafka-connector
            group: org.apache.camel.kafkaconnector
            type: maven
            version: 0.11.5
      - name: apicurio
        artifacts:
          - artifact: apicurio-registry-serdes-avro-serde
            group: io.apicurio
            type: maven
            version: 2.5.5.Final
          - artifact: apicurio-registry-utils-converter
            group: io.apicurio
            type: maven
            version: 2.5.5.Final
      - name: confluent
        artifacts:
          - type: maven
            repository: https://packages.confluent.io/maven/
            group: io.confluent
            artifact: kafka-connect-avro-converter
            version: 7.5.1
  externalConfiguration:
    volumes:
      - name: weather
        secret:
          secretName: weather
      - name: alphavantage
        secret:
          secretName: alphavantage
      - configMap:
          name: xml-schemas
        name: xml-schemas
  config:
    client.id: kafka-connect-client
    group.id: kafka-connect
    config.storage.topic: connect-configs
    offset.storage.topic: connect-offsets
    status.storage.topic: connect-status
    config.providers: file
    config.providers.file.class: org.apache.kafka.common.config.provider.DirectoryConfigProvider
  bootstrapServers: 'my-kafka-cluster-kafka-bootstrap.event-automation.svc:9095'
  resources:
    limits:
      cpu: 2048m
      memory: 2Gi
    requests:
      cpu: 2048m
      memory: 2Gi
  authentication:
    passwordSecret:
      password: password
      secretName: kafka-connect-credentials
    type: scram-sha-512
    username: kafka-connect-credentials
  template:
    buildConfig:
      pullSecret: ibm-entitlement-key
    connectContainer:
      securityContext:
        allowPrivilegeEscalation: false
        capabilities:
          drop:
            - ALL
        privileged: false
        readOnlyRootFilesystem: true
        runAsNonRoot: true
    pod:
      imagePullSecrets: []
      metadata:
        annotations:
          cloudpakId: c8b82d189e7545f0892db9ef2731b90d
          productVersion: 11.2.5
          productID: 2a79e49111f44ec3acd89608e56138f5
          cloudpakName: IBM Cloud Pak for Integration
          productChargedContainers: kafka-connect-cluster-connect
          productCloudpakRatio: '2:1'
          productName: IBM Event Streams for Non Production
          eventstreams.production.type: CloudPakForIntegrationNonProduction
          productMetric: VIRTUAL_PROCESSOR_CORE
  tls:
    trustedCertificates:
      - certificate: ca.crt
        secretName: my-kafka-cluster-cluster-ca-cert
  replicas: 1
32 changes: 32 additions & 0 deletions examples/eventstreams/mq-json-to-xml.yaml
@@ -0,0 +1,32 @@
apiVersion: eventstreams.ibm.com/v1beta2
kind: KafkaConnector
metadata:
  name: mq-xml-to-json
  labels:
    eventstreams.ibm.com/cluster: kafka-connect-cluster
spec:
  class: com.ibm.eventstreams.connect.mqsink.MQSinkConnector
  tasksMax: 1
  config:
    # the Kafka topic to consume from
    topics: XML.MESSAGES.4

    # the MQ queue to put messages to
    mq.queue: XML.MESSAGES.4

    # connection details for the queue manager
    mq.queue.manager: MYQMGR
    mq.connection.name.list: queuemanager-ibm-mq(1414)
    mq.channel.name: KAFKA.SVRCONN

    # format of the messages to produce
    mq.message.body.jms: true
    mq.message.builder: com.ibm.eventstreams.connect.mqsink.builders.ConverterMessageBuilder
    mq.message.builder.value.converter: com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlConverter
    mq.message.builder.value.converter.root.element.name: msg

    # format of the messages to consume
    key.converter: org.apache.kafka.connect.storage.StringConverter
    key.converter.schemas.enable: false
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false
24 changes: 24 additions & 0 deletions examples/eventstreams/mq-message.xml
@@ -0,0 +1,24 @@
<?xml version="1.0" encoding="UTF-8" ?>
<ordermessage>
    <customer>
        <name>Helen Velazquez</name>
        <phone type="landline" number="0911 910 5491"/>
        <email>[email protected]</email>
        <address>3249 Hendrerit Av.</address>
        <postalZip>F2 1IX</postalZip>
        <region>Dunbartonshire</region>
    </customer>
    <product>
        <brand>Acme Inc</brand>
        <item>Awesome-ivator</item>
        <quantity>1</quantity>
    </product>
    <product>
        <brand>Globex</brand>
        <item>Widget</item>
        <quantity>2</quantity>
    </product>
    <order>
        <date>2023-11-05 22:11:00</date>
    </order>
</ordermessage>
