Spark Messaging is a message queue system that allows Vault Java SDK developers to send and receive messages from a durable queue. It enables loosely coupled, asynchronous integration within a Vault, between different Vaults, or between a Vault and an external system. This page describes example use cases of the integration between a Vault and an external third-party system and provides sample code that you can use with your Vault to step through the code in the Vault Java SDK Debugger.

Introduction

The vsdk-spark-external-aws-sample project contains an example of an External Connection integration using Spark Messaging that sends messages from a Vault to an external Amazon Web Services (AWS) SQS queue for processing. The project demonstrates:

  • How to create a sample third-party AWS application:
    • Amazon Simple Queue Service (SQS): Stores Spark messages in AWS for processing.
    • Amazon API Gateway: The point of entry from Vault into AWS.
    • AWS Lambda: Verifies, enqueues, and processes Spark messages.
  • How to configure the source Vault:
    • External Connection: Sends a simple message to AWS for processing.

Business Use Case

Loan Approval

Users request loans for items by creating Loan Approval records in Vault. They must enter basic information about the loan request such as the name of the person requiring the loan, the item, the amount, and the loan duration. Once saved, a request is sent to an external finance system within AWS for approval or rejection based on the specified loan criteria. The loan approval record within Vault is then updated with an approval decision and, if approved, with details of the loan.

Users can make subsequent loan approval requests for the same item via a Loan re-quote record action.

Setup

Prerequisites

  1. An AWS account. If you don't have one, sign up for an account at AWS Free Tier.
  2. A source Vault. You must have the Vault Owner security profile and the appropriate permissions to complete this project.

There are four main setup steps:

  1. Create AWS Application Services in the external system that will process the messages from Vault.
  2. Create a Connection record in the Vault that links it to AWS.
  3. Import Vault Packages to create necessary components and Spark Message Queues.
  4. Create a Queue object in the Vault for the Spark Messages.

The project contains a Vault Package File (VPK) in the "deploy-vpk" directory with the necessary objects, queues, and Vault Java SDK code.

Create AWS Application Services

Create AWS IAM Role

  1. In the Amazon IAM console, click Roles in the menu on the left-hand side of the page.
  2. Click Create role.
  3. Select AWS service as the trusted entity type and then select Lambda as the use case.
  4. Click Next: Permissions.
  5. Add the following permissions policies by searching for the policy name and clicking the check mark next to it when it appears (if you have multiple results, select the policy with a type of "AWS managed"):
    • AWSLambdaBasicExecutionRole
    • AWSLambdaSQSQueueExecutionRole
    • AmazonSQSFullAccess
  6. Click Next.
  7. For Role Name, enter SparkSampleAwsRole.
  8. Click Create role.

Create SQS Queue vsdk-spark-sample-external-queue

  1. In the Amazon SQS console, click Create New Queue.
  2. Configure the new Queue:
    • For Queue Type, choose Standard.
    • For Queue Name, enter vsdk-spark-sample-external-queue.
  3. Select Create Queue.
  4. With the vsdk-spark-sample-external-queue SQS queue selected, select the Access policy tab.
  5. Select Edit within the Access policy (Permissions) section.
  6. Under Access policy, launch the Policy generator.
  7. Within the Policy Generator, do the following:
    • For Type of Policy, select SQS Queue Policy.
    • For Effect, select Allow.
    • For Principal, enter a * (this selects everybody).
    • For Actions, choose the following:
      • DeleteMessage
      • GetQueueUrl
      • ReceiveMessage
      • SendMessage
    • Enter your specific ARN, which follows the format: arn:aws:sqs:${Region}:${Account}:${QueueName}
    • Click Add Statement
    • Click Generate Policy
    • Copy the generated policy, navigate back to your SQS Queue, paste the policy into the Access Policy field, and click Save.
  8. With your newly created SQS queue selected, copy the URL displayed in the Details section; you will need it later when configuring the Lambda function vsdkSparkSampleValidateAndEnqueueMessage.

Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage

  1. Clone or download the vSDK Spark AWS External Sample Maven project from GitHub and save it on your local machine.
  2. In the AWS Lambda console, click Create function.
  3. Set the following values:
    • Select Author from scratch.
    • For Name, enter vsdkSparkSampleValidateAndEnqueueMessage.
    • For Runtime, select Java 8.
    • Under the Change default execution role drop-down, select Use an existing role.
    • For Existing role, select SparkSampleAwsRole, the role created previously.
  4. Click Create function.
  5. In the Runtime settings section, click Edit and set the Handler to com.veeva.vault.LambdaHandler.
  6. Within the Code source section, select Upload from .zip or .jar file.
  7. Click Upload.
  8. From the vSDK Spark AWS External Sample project downloaded in step 1, select the \aws-lambda-samples\vsdkSparkSampleValidateAndEnqueueMessage.jar file.
  9. Click Save.
  10. Navigate to the Environment variables section of the Configuration tab and add the following:
    • Key: VAULT_SAMPLE_SQS_QUEUE_URL; Value: URL for the queue recorded in the Create SQS Queue section.
    • Key: VAULT_HOSTNAME; Value: {YourVault}, e.g. https://your-vault.veeva.com.
    • Key: VAULT_USER; Value: Your Vault Integration service account's username.
    • Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
    • Key: BUCKET_NAME; Value: The name of the bucket you created in the last section.
  11. In the General configuration section of the Configuration tab, set the Timeout to 0 mins and 20 sec.
  12. Click Save.
  13. Click Functions in the breadcrumb trail at the top of the page to return to the main functions page.
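
The handler class (step 5) and environment variables (step 10) configured above come together in a function that receives the API Gateway event, validates the Spark message, and places it on the SQS queue. The following is a minimal, hypothetical Java sketch of that shape; it is not the source of the packaged vsdkSparkSampleValidateAndEnqueueMessage.jar, and it only reads VAULT_SAMPLE_SQS_QUEUE_URL (the packaged function also uses the Vault-related variables, which this sketch omits).

  package com.veeva.vault.sample;

  import java.util.Map;

  import com.amazonaws.services.lambda.runtime.Context;
  import com.amazonaws.services.lambda.runtime.RequestHandler;
  import com.amazonaws.services.sqs.AmazonSQS;
  import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

  // Hypothetical sketch only; the real handler is com.veeva.vault.LambdaHandler
  // inside vsdkSparkSampleValidateAndEnqueueMessage.jar.
  public class SketchLambdaHandler implements RequestHandler<Map<String, Object>, String> {

      // Queue URL recorded in the "Create SQS Queue" section (step 10 above).
      private static final String QUEUE_URL = System.getenv("VAULT_SAMPLE_SQS_QUEUE_URL");

      private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

      @Override
      public String handleRequest(Map<String, Object> event, Context context) {
          // API Gateway (HTTP API, Lambda proxy integration) delivers the raw
          // Spark message payload in the "body" field of the event.
          Object body = event.get("body");
          if (body == null || body.toString().isEmpty()) {
              // The packaged function performs real validation; this sketch
              // only rejects empty payloads.
              return "{\"status\":\"rejected\"}";
          }

          // Enqueue the payload for the vsdkSparkSampleProcessMessage function.
          sqs.sendMessage(QUEUE_URL, body.toString());
          return "{\"status\":\"queued\"}";
      }
  }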

Create Lambda function vsdkSparkSampleProcessMessage

  1. While still in the AWS Lambda console, click Create function.
  2. Set the following values:
    • Select Author from scratch.
    • For Name, enter vsdkSparkSampleProcessMessage.
    • For Runtime, select Python 3.7.
    • Under the Change default execution role drop-down, select Use an existing role.
    • For Existing role, select SparkSampleAwsRole, the role created previously.
  3. Click Create function.
  4. Within the Code source section, select Upload from .zip file.
  5. Click Upload.
  6. From the vSDK Spark AWS External Sample project downloaded earlier, select the \aws-lambda-samples\vsdkSparkSampleProcessMessage.zip file.
  7. Click Save.
  8. Navigate to the Environment variables section of the Configuration tab and add the following:
    • Key: CLIENT_ID; Value: vsdk-sparksample-aws-process-message.
    • Key: VAULT_API_BURST_LIMIT_CUTOFF; Value: 200.
    • Key: VAULT_VQL_PAGE_LIMIT; Value: 500.
    • Key: VAULT_REST_API_BASE_URL; Value: https://{YourVault}/api/v22.2/.
    • Key: VAULT_USER; Value: Your Vault Integration service account's username.
    • Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
  9. In the General configuration section of the Configuration tab, set the Timeout to 0 mins and 30 sec.
  10. Click Save.
  11. In the Triggers section of the Configuration tab, select Add trigger.
  12. In the Trigger configuration, select the SQS Queue vsdk-spark-sample-external-queue.
  13. Make sure Activate trigger is checked and click Add.

Note: Once this trigger is enabled, background processing will run in your AWS environment and count towards your billable resources. To stop this at any time, disable the trigger and click Save.

Create API Gateway vsdk-spark-sample-external-queue

  1. Navigate to the API Gateway console, and click Create API.
  2. Select the HTTP API option and click Build.
  3. Create and configure integrations:
    • Click Add integration, choose Lambda from the drop-down, then choose the vsdkSparkSampleValidateAndEnqueueMessage lambda from the Lambda Function search box.
    • For API name, enter vSdkSparkSampleAPIGateway.
  4. Click Next and then select the POST option in the Method drop-down, using /message as the Resource Path.
  5. Click Next, then Next again, and then Create.
  6. Record the Invoke URL displayed in the Stages section, as this will be needed later in configuring Vault during the Create Connection step.

The creation of API Gateway vsdk-spark-sample-external-queue is now complete.
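
Before configuring Vault, you can optionally sanity-check the new route by sending a POST to the Invoke URL with /message appended. The snippet below is a minimal, hypothetical Java example using java.net.HttpURLConnection with an arbitrary test payload; since the vsdkSparkSampleValidateAndEnqueueMessage function validates incoming messages, an arbitrary body may be rejected, but any HTTP response confirms that the route reaches the Lambda.

  import java.io.OutputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;
  import java.nio.charset.StandardCharsets;

  public class GatewaySmokeTest {
      public static void main(String[] args) throws Exception {
          // Replace with the Invoke URL recorded from the Stages section.
          String invokeUrl = "https://{some-alpha-numeric-string}.execute-api.{region}.amazonaws.com";

          HttpURLConnection conn = (HttpURLConnection) new URL(invokeUrl + "/message").openConnection();
          conn.setRequestMethod("POST");
          conn.setRequestProperty("Content-Type", "application/json");
          conn.setDoOutput(true);

          // Arbitrary test payload; a real Spark message from Vault carries
          // attributes and items that the Lambda validates.
          byte[] body = "{\"test\":\"ping\"}".getBytes(StandardCharsets.UTF_8);
          try (OutputStream out = conn.getOutputStream()) {
              out.write(body);
          }

          // Any response code (even a rejection) shows the gateway route is
          // wired to the vsdkSparkSampleValidateAndEnqueueMessage Lambda.
          System.out.println("Response code: " + conn.getResponseCode());
      }
  }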

Create Vault Connection

The Connection object is used to create records that define connections between different Vaults or between a Vault and an external system.

In this use case, you create an External connection record that links the source Vault to the external AWS system.

In Vault
  1. Log in and navigate to Business Admin > Connections and click Create.
  2. Choose the External connection type. Select Continue.
  3. Enter AWS Queue Sample API Gateway in the Name field.
  4. Enter vsdk_aws_queue_sample_api_gateway in the API Name field.
  5. Enter your API Gateway Invoke URL in the URL field. This will be the value you were asked to record when creating the AWS API Gateway earlier and will look something like https://{some-alpha-numeric-string}.execute-api.{region}.amazonaws.com/. Important: make sure /message is not included at the end of the URL.
  6. Select Save.

The connection is now ready for use.

Import Vault Packages

You must deploy the VPKs to your Vault prior to debugging these use cases.

Retrieving the Project
  1. Locate the vSDK Spark AWS External Sample Maven project that you downloaded from GitHub in the Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage section.
  2. Run through the Getting Started guide to set up your deployment environment, if needed.
Deploying the Required Code and Components to your Vault
  1. In Vault, log in and navigate to Admin > Deployment > Inbound Packages and click Import:

    Deploy Vault components and code: Select the \deploy-vpk\vault\vsdk-spark-external-aws-sample-components.vpk file.

  2. From the Actions menu, select Review & Deploy. Vault displays a list of all components in the package.

  3. Follow the prompts to deploy the package. You will receive an email when the deployment is complete.

Once the package has been deployed, you should review the configuration and understand the normal behavior so you can observe the effects of the sample trigger code.

The package includes the following components.

Objects
  • Loan approval (vsdk_loan_approval__c)
Picklists
  • Loan Period (Months) (vsdk_loan_period_months1__c)
  • Approval Status (vsdk_approval_status1__c)
VAULT JAVA SDK - Record Triggers
  • com.veeva.vault.custom.triggers.vSdkSparkExternalAwsSampleTrigger - AFTER INSERT (vSdkAwsQueueSampleTrigger.java)
VAULT JAVA SDK - Record Actions
  • Loan re-quote (vSdkSparkExternalAwsSampleAction.java)
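
For orientation before deploying, the sketch below shows the general pattern an AFTER INSERT trigger such as vSdkSparkExternalAwsSampleTrigger follows: locate QueueService and put a Spark message on the outbound queue when a Loan approval record is created. This is an illustrative sketch, not the deployed source; the queue API name (vsdk_aws_queue_sample__c) and the message attribute are assumptions for this example, so treat the source in the sample project as authoritative.

  package com.veeva.vault.custom.triggers;

  import com.veeva.vault.sdk.api.core.ServiceLocator;
  import com.veeva.vault.sdk.api.core.ValueType;
  import com.veeva.vault.sdk.api.core.VaultCollections;
  import com.veeva.vault.sdk.api.data.RecordChange;
  import com.veeva.vault.sdk.api.data.RecordEvent;
  import com.veeva.vault.sdk.api.data.RecordTrigger;
  import com.veeva.vault.sdk.api.data.RecordTriggerContext;
  import com.veeva.vault.sdk.api.data.RecordTriggerInfo;
  import com.veeva.vault.sdk.api.queue.Message;
  import com.veeva.vault.sdk.api.queue.QueueService;

  // Illustrative sketch only; the trigger deployed by the VPK contains the real logic.
  @RecordTriggerInfo(object = "vsdk_loan_approval__c", events = RecordEvent.AFTER_INSERT)
  public class SketchLoanApprovalTrigger implements RecordTrigger {

      public void execute(RecordTriggerContext context) {
          QueueService queueService = ServiceLocator.locate(QueueService.class);

          for (RecordChange change : context.getRecordChanges()) {
              String recordId = change.getNew().getValue("id", ValueType.STRING);

              // "vsdk_aws_queue_sample__c" is the assumed API name of the outbound
              // queue created in the next section; the attribute key is illustrative.
              Message message = queueService.newMessage("vsdk_aws_queue_sample__c")
                      .setAttribute("object", "vsdk_loan_approval__c")
                      .setMessageItems(VaultCollections.asList(recordId));

              // Put the message on the queue; Vault delivers it to the
              // connected API Gateway endpoint.
              queueService.putMessage(message);
          }
      }
  }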

Create Vault Queue

Within Vault, you must configure a Spark queue to use Spark messaging. This queue handles the messages produced via the Vault Java SDK Message and QueueService interfaces.

In Vault
  1. Log in and navigate to Admin > Connections > Spark Queues and click Create.
  2. Set the following values:
    • For Label, enter vSDK AWS Queue Sample.
    • For Queue Type, select Outbound.
  3. Click Save.
  4. On the new Queue, scroll down to the Queue Connections section and click Create.
  5. Select the vsdk_aws_queue_sample_api_gateway and click Save.

Next Steps

  • Run the project describes how to run the sample.
  • Code logic explains in detail how the sample components work.