Vault Spark is a message queue system that allows developers to use the Vault Java SDK to send and receive messages from a durable queue. It can be used to enable loosely coupled, asynchronous integration within a vault, between different vaults, or between a vault and an external system. This page describes an example use case for integration between a vault and an external third-party system, along with sample code that you can use with your vault to step through the code in the Vault Java SDK Debugger.

Introduction

The vsdk-spark-external-aws-sample project contains an example of a Vault to External integration using Spark Messaging that sends messages from a vault to an external Amazon Web Services (AWS) SQS queue for processing. The project will step through:

  • Creation of a sample third-party AWS application
    • Amazon Simple Queue Service (SQS) - store Spark messages in AWS for processing
    • Amazon API Gateway - point of entry from Vault to AWS
    • AWS Lambda - verify, enqueue, and process Spark messages
  • Vault
    • External messaging - send a simple message from a source vault to AWS for processing
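
Before stepping through the setup, it helps to see what the Vault side of this pattern looks like in code: a Vault Java SDK record trigger puts a Message on an outbound Spark queue, and Vault delivers it to the connected endpoint. The class below is a minimal illustrative sketch, not the packaged trigger source; the class name and attribute names are assumptions, and the queue name vsdk_aws_queue_sample matches the queue you will create later in this guide.

    package com.veeva.vault.custom.triggers;

    import com.veeva.vault.sdk.api.core.ServiceLocator;
    import com.veeva.vault.sdk.api.core.ValueType;
    import com.veeva.vault.sdk.api.core.VaultCollections;
    import com.veeva.vault.sdk.api.data.RecordChange;
    import com.veeva.vault.sdk.api.data.RecordEvent;
    import com.veeva.vault.sdk.api.data.RecordTrigger;
    import com.veeva.vault.sdk.api.data.RecordTriggerContext;
    import com.veeva.vault.sdk.api.data.RecordTriggerInfo;
    import com.veeva.vault.sdk.api.queue.Message;
    import com.veeva.vault.sdk.api.queue.QueueService;

    // Illustrative sketch only: after a Loan Approval record is inserted, put a
    // Spark message on the outbound queue. Attribute names are placeholders.
    @RecordTriggerInfo(object = "vsdk_loan_approval__c", events = {RecordEvent.AFTER_INSERT})
    public class LoanApprovalMessageSketch implements RecordTrigger {

        public void execute(RecordTriggerContext recordTriggerContext) {
            QueueService queueService = ServiceLocator.locate(QueueService.class);

            for (RecordChange change : recordTriggerContext.getRecordChanges()) {
                String recordId = change.getNew().getValue("id", ValueType.STRING);

                // Build a message for the outbound queue created later in this guide.
                Message message = queueService.newMessage("vsdk_aws_queue_sample")
                        .setAttribute("object", "vsdk_loan_approval__c")
                        .setMessageItems(VaultCollections.asList(recordId));

                // Vault delivers the message to the queue's connection endpoint (the AWS API Gateway).
                queueService.putMessage(message);
            }
        }
    }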

Business Use Case

Loan Approval

A user needs to request a loan for an item, which is done within Vault by creating a Loan Approval record. Basic information about the loan request must be entered, such as the name of the person requiring the loan, the item, the amount, and the loan duration. Once saved, a request is sent to an external Finance System within AWS for approval or rejection based on the specified loan criteria. The loan approval record within Vault is then updated with an approval decision and, if approved, with details of the loan.

Subsequent loan approval requests can be made for the same item via a Loan re-quote record action.

General Guidelines

Setup

Prerequisites

  1. An AWS account. If you don't have one, sign up for an account at AWS Free Tier.

There are four main steps to set up this project for use.

  1. Create AWS Application Services in the external system that will process the messages from the vault.
  2. Create a Vault Connection record in the vault that links it to AWS.
  3. Import Vault Packages to create the necessary components and Spark Message Queues.
  4. Create a Vault Queue object in the vault for the Spark messages.

The project contains a Vault package (VPK) in the "deploy-vpk" directory with the necessary Objects, Queues, and Vault Java SDK code.

Create AWS Application Services

Create AWS IAM Role

  1. In the Amazon IAM console, click Roles in the menu on the left hand side of the page.
  2. Click Create role.
  3. Select AWS service as the type of trusted entity and then select the service Lambda.
  4. Click Next: Permissions.
  5. Add the following permissions policies by searching for the policy name and, when it appears, clicking the check mark next to it:
    • AWSLambdaBasicExecutionRole
    • AWSLambdaSQSQueueExecutionRole
    • AmazonS3FullAccess
    • AmazonSQSFullAccess
  6. Click Next: Tags.
  7. Click Next: Review.
  8. For Role Name, enter SparkSampleAwsRole.
  9. Click Create role.

Create SQS Queue vsdk-spark-sample-external-queue

  1. In the Amazon SQS console, click Create New Queue.
  2. Configure the new Queue:
    • For Queue Name, enter vsdk-spark-sample-external-queue.
    • For Region, choose your Region.
    • For type of queue, choose Standard Queue.
  3. Select Quick-Create Queue.
  4. With the vsdk-spark-sample-external-queue SQS queue selected, select the Permissions tab.
  5. Click Add a permission.
  6. Set the following values:
    • For Effect, check Allow.
    • For Principal, choose (click) the check mark next to Everybody (*).
    • For Actions, choose (click) the check mark next to the following:
      • DeleteMessage
      • GetQueueURL
      • ReceiveMessage
      • SendMessage
  7. Select Add Permission.
  8. With the vsdk-spark-sample-external-queue SQS queue selected, select the Details tab.
  9. Record the URL displayed in the Details tab, as this will be needed later in configuring AWS in the section Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage.

Create S3 Bucket vsdk-spark-external-aws-s3-bucket

  1. In the Amazon S3 console, click Create Bucket.
  2. Configure the s3 bucket:
    • For Bucket name, enter vsdk-spark-external-aws-s3-bucket.
    • For Region, choose your Region.
  3. Leave the other configuration settings at their defaults and click Create Bucket.
  4. Note down the bucket name as it will be needed for the next step.

Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage

First, you need to download the project:
  1. Clone or download the sample Maven project vSDK Spark AWS External Sample project from GitHub and save it on your local machine.
  2. In the AWS Lambda console, click Create function.
  3. Set the following values:
    • Check Author from scratch.
    • For Name, enter vsdkSparkSampleValidateAndEnqueueMessage.
    • For Runtime, select Java 8 (Corretto).
    • For Role, select Choose an existing role.
    • For Existing role, select SparkSampleAwsRole that was created earlier.
  4. Click Create function.
  5. Scroll to the Function code section and in the Code entry type dropdown, select Upload a .zip or .jar file.
  6. Click Upload.
  7. From the sample Maven project vSDK Spark AWS External Sample project downloaded in step 1, select the \aws-lambda-samples\vsdk-spark-external-aws-sample-validate-and-enque-message.jar file.
  8. Click Save.
  9. Scroll down to the Environment Variables section and add the following:
    • Key: VAULT_SAMPLE_SQS_QUEUE_URL; Value: URL for the queue recorded in the section Create SQS Queue vsdk-spark-sample-external-queue.
    • Key: VAULT_HOSTNAME; Value: {YourVault}, e.g. your-vault.veeva.com.
    • Key: VAULT_USER; Value: Your Vault Integration service account's username.
    • Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
    • Key: BUCKET_NAME; Value: The name of the S3 bucket you created in the previous section.
  10. Scroll down to the Basic settings section and set the Timeout to 0 mins and 20 sec.
  11. Click Save.
  12. Click Functions in the breadcrumb trail at the top of the page to return to the main functions page.
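
For orientation, the jar you just uploaded broadly does two things: it validates the Spark message that Vault POSTs through the API Gateway, and it enqueues the payload on the SQS queue for asynchronous processing. The class below is a simplified sketch of that flow written against the AWS SDK for Java; it is not the packaged source, the class name is an assumption, and it omits the Vault-side verification that the VAULT_HOSTNAME, VAULT_USER, and VAULT_PASSWORD variables support.

    package com.example.sparksample; // hypothetical package

    import java.util.Map;

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

    // Simplified sketch: accept the Spark message POSTed by Vault via API Gateway,
    // do minimal validation, and enqueue the raw body on SQS for later processing.
    // The packaged jar also uses the VAULT_* variables to verify the message
    // against Vault; that step is omitted here.
    public class ValidateAndEnqueueSketch implements RequestHandler<Map<String, Object>, String> {

        private static final String QUEUE_URL = System.getenv("VAULT_SAMPLE_SQS_QUEUE_URL");

        private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

        @Override
        public String handleRequest(Map<String, Object> event, Context context) {
            // API Gateway (HTTP API) delivers the POSTed payload in the "body" field.
            Object body = event.get("body");
            if (body == null || body.toString().isEmpty()) {
                return "{\"status\":\"rejected\",\"reason\":\"empty body\"}";
            }

            // Hand off to SQS; the vsdkSparkSampleProcessMessage function consumes from the queue.
            sqs.sendMessage(QUEUE_URL, body.toString());
            return "{\"status\":\"enqueued\"}";
        }
    }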

Create Lambda function vsdkSparkSampleProcessMessage

  1. Navigate to the AWS Lambda console and click Create function.
  2. Set the following values:
    • Check Author from scratch.
    • For Name, enter vsdkSparkSampleProcessMessage.
    • For Runtime, select Python 3.7.
    • For Role, select Choose an existing role.
    • For Existing role, select SparkSampleAwsRole that was created earlier.
  3. Click Create function.
  4. Scroll to the Function code section and in the Code entry type dropdown, select Upload a .zip file.
  5. Click Upload.
  6. From the sample Maven project vSDK Spark AWS External Sample project downloaded earlier, select the \aws-lambda-samples\vsdkSparkSampleProcessMessage.zip file.
  7. Click Save.
  8. Scroll down to the Environment Variables section and add the following:
    • Key: CLIENT_ID; Value: vsdk-sparksample-aws-process-message.
    • Key: VAULT_API_BURST_LIMIT_CUTOFF; Value: 200.
    • Key: VAULT_VQL_PAGE_LIMIT; Value: 500.
    • Key: VAULT_REST_API_BASE_URL; Value: https://{YourVault}/api/v18.3/.
    • Key: VAULT_USER; Value: Your Vault Integration service account's username.
    • Key: VAULT_PASSWORD; Value: Your Vault Integration service account's password.
  9. Scroll down to the Basic settings section and set the Timeout to 0 mins and 30 sec.
  10. Click Save.
  11. Scroll up to the Designer section and, in the Add triggers section on the left hand side of the screen, select SQS.
  12. In the Configure triggers section, select the SQS Queue vsdk-spark-sample-external-queue.
  13. Make sure Enable trigger is checked and click Add.

Note: Now that this trigger is enabled, background processing will be running in your AWS environment, which counts towards your billable resources. To stop this at any time, simply disable the trigger and click Save.
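
The packaged vsdkSparkSampleProcessMessage function (Python) consumes messages from the SQS queue, applies the loan criteria, and writes the decision back to the Loan Approval record through the Vault REST API, which is what the VAULT_REST_API_BASE_URL, VAULT_USER, and VAULT_PASSWORD variables are for. For consistency with the other examples, here is a hedged Java sketch of that Vault callback rather than the actual Python source; the class name, field name, and picklist value in the update body are placeholders, not the sample's actual API names.

    package com.example.sparksample; // hypothetical package

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;
    import java.nio.charset.StandardCharsets;

    // Hedged sketch of the Vault REST API callback: authenticate, then update the
    // Loan Approval record with a decision. Error handling and proper JSON parsing
    // are trimmed to keep the example short.
    public class VaultCallbackSketch {

        // e.g. https://{YourVault}/api/v18.3/ as configured in the environment variables above
        private static final String BASE_URL = System.getenv("VAULT_REST_API_BASE_URL");

        public static void main(String[] args) throws Exception {
            // 1. Authenticate: POST {base}auth with username/password; the JSON response
            //    contains a sessionId used as the Authorization header on later calls.
            String credentials = "username=" + URLEncoder.encode(System.getenv("VAULT_USER"), "UTF-8")
                    + "&password=" + URLEncoder.encode(System.getenv("VAULT_PASSWORD"), "UTF-8");
            String authJson = send("POST", BASE_URL + "auth", credentials,
                    "application/x-www-form-urlencoded", null);
            String sessionId = extract(authJson, "sessionId");

            // 2. Update the Loan Approval record. The field name and picklist value below
            //    are illustrative placeholders.
            String recordId = args[0];
            String update = "{\"approval_status__c\": \"approved__c\"}";
            send("PUT", BASE_URL + "vobjects/vsdk_loan_approval__c/" + recordId, update,
                    "application/json", sessionId);
        }

        private static String send(String method, String url, String body,
                                   String contentType, String sessionId) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod(method);
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", contentType);
            if (sessionId != null) {
                conn.setRequestProperty("Authorization", sessionId); // Vault accepts the sessionId here
            }
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }
            StringBuilder response = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    response.append(line);
                }
            }
            return response.toString();
        }

        // Naive value extraction, good enough for this sketch; use a JSON parser in real code.
        private static String extract(String json, String key) {
            int keyIndex = json.indexOf("\"" + key + "\"");
            int start = json.indexOf('"', json.indexOf(':', keyIndex) + 1) + 1;
            return json.substring(start, json.indexOf('"', start));
        }
    }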

Create API Gateway vsdk-spark-sample-external-queue

  1. Navigate to the API Gateway console and click Create API.
  2. Select the HTTP API option and click Build.
  3. Create and configure integrations:
    • Click Add integration and choose Lambda from the drop-down menu, then choose the vsdkSparkSampleValidateAndEnqueueMessage lambda from the Lambda Function search box.
    • For API name, enter vSdkSparkSampleAPIGateway.
  4. Click Next and then select the POST option in the Method drop-down and /message as the Resource Path.
  5. Click Next, then Next again, and then Create.
  6. Record the URL displayed next to Invoke URL at the top of the Stage Editor, as this will be needed later in configuring Vault in the step Create Connection.

The creation of API Gateway vsdk-spark-sample-external-queue is now complete.

Create Vault Connection

The Connection object is used to create records that define connections between different vaults or between a vault and an external system.

In this use case, we will create a Vault to External connection record to link a source vault to an external system.

In vault:
  1. Log in and navigate to Admin > Business Admin > Connections and click Create.
  2. Choose the External connection type. Select Continue.
  3. Enter AWS Queue Sample API Gateway in the Name field.
  4. Enter vsdk_aws_queue_sample_api_gateway in the API Name field.
  5. Enter your API Gateway Invoke URL in the URL field. This will be the value you were asked to record when creating the AWS API Gateway earlier and will look something like https://12abcdefg2.execute-api.eu-west-2.amazonaws.com. Important - make sure /message is not included at the end of the URL.
  6. Select Save.

The connection is now ready for use!

Import Vault Packages

The VPKs need to be deployed to your vault prior to debugging these use cases.

First, you need to retrieve the project:
  1. Retrieve the sample Maven project vSDK Spark AWS External Sample project that was previously downloaded from GitHub in section Create Lambda function vsdkSparkSampleValidateAndEnqueueMessage.
  2. Run through the Getting Started guide to set up your deployment environment.
Next, deploy the required code and components to your vault:
  1. In the vault, log in and navigate to Admin > Deployment > Inbound Packages and click Import:

    Deploy vault components and code: Select the \deploy-vpk\vault\vsdk-spark-external-aws-sample-components\vsdk-spark-external-aws-sample-components.vpk file.

  2. From the Actions menu, select Review & Deploy. Vault displays a list of all components in the package.

  3. Review the prompts to deploy the package. You will receive an email when the deployment is complete.

Once the package has been deployed, you will want to review the configuration and understand the normal behavior so you can observe the effects of the sample trigger code.

Below is a rundown of the included components.

Objects
  • Loan approval (vsdk_loan_approval__c)
Picklists
  • Loan Period (Months) (vsdk_loan_period_months1__c)
  • Approval Status (vsdk_approval_status1__c)
VAULT JAVA SDK - Record Triggers
  • com.veeva.vault.custom.triggers.vSdkSparkExternalAwsSampleTrigger - AFTER INSERT (vSdkAwsQueueSampleTrigger.java)
VAULT JAVA SDK - Record Actions
  • Loan re-quote (vSdkSparkExternalAwsSampleAction.java)
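
The Loan re-quote action listed above is what lets users request a new quote for an existing Loan Approval record. As a rough sketch of how a record action of this kind is declared (it is not the packaged vSdkSparkExternalAwsSampleAction.java source, the class name and attribute names are assumptions, and it simply reuses the same QueueService pattern as the trigger sketch in the Introduction):

    package com.veeva.vault.custom.actions;

    import com.veeva.vault.sdk.api.action.RecordAction;
    import com.veeva.vault.sdk.api.action.RecordActionContext;
    import com.veeva.vault.sdk.api.action.RecordActionInfo;
    import com.veeva.vault.sdk.api.core.ServiceLocator;
    import com.veeva.vault.sdk.api.core.ValueType;
    import com.veeva.vault.sdk.api.core.VaultCollections;
    import com.veeva.vault.sdk.api.data.Record;
    import com.veeva.vault.sdk.api.queue.Message;
    import com.veeva.vault.sdk.api.queue.QueueService;

    // Illustrative sketch of a "Loan re-quote" user action: for each selected Loan
    // Approval record, put another Spark message on the outbound queue so the external
    // finance system produces a new quote. Attribute names are placeholders.
    @RecordActionInfo(label = "Loan re-quote", object = "vsdk_loan_approval__c")
    public class LoanRequoteSketch implements RecordAction {

        public boolean isExecutable(RecordActionContext context) {
            return true;
        }

        public void execute(RecordActionContext context) {
            QueueService queueService = ServiceLocator.locate(QueueService.class);
            for (Record record : context.getRecords()) {
                Message message = queueService.newMessage("vsdk_aws_queue_sample")
                        .setAttribute("event", "loan_requote")
                        .setMessageItems(VaultCollections.asList(record.getValue("id", ValueType.STRING)));
                queueService.putMessage(message);
            }
        }
    }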

Create Vault Queue

A Queue needs to be configured to utilize the Spark messaging functionality. These queues will handle messages that are produced via the Vault Java SDK Message and QueueService.

In vault:
  1. Log in and navigate to Admin > Configuration > Queue Setup > Queues and click Create.
  2. Set the following values:
    • For Label, enter vSDK AWS Queue Sample.
    • For Name, enter vsdk_aws_queue_sample.
    • For Queue Type, select Outbound.
  3. Click Save.
  4. On the new Queue, scroll down to the Queue Connections section and click Create.
  5. Select the vsdk_aws_queue_sample_api_gateway connection and click Save.

Next Steps

  • Run the project provides details of how to run the project.
  • Code logic provides a detailed understanding of how the sample components work.