Part 1: CI/CD for Application Integration

This two-part article guides developers on how to develop, test, publish, and promote code in Application Integration.

  • Part 1 (this one) focuses on building a simple integration and tests, then using integrationcli to save the integration to a code repository. It also covers how to manually publish and promote the integration using integrationcli.
  • Part 2 explores automating the entire process with CI/CD tools like Cloud Deploy and GitHub Actions, in conjunction with integrationcli.

Introduction

Application Integration is an Integration-Platform-as-a-Service (iPaaS) solution in Google Cloud that offers a comprehensive set of core integration tools to connect and manage the multitude of applications and data required to support various business operations.

Application Integration is a low-code/no-code platform that allows developers to create integration flows. Integration flows are sequences of tasks or activities that connect and coordinate the exchange of data between different systems. It is a common practice to develop these assets and promote them to various environments just like source code.

The best practice for CI/CD with Application Integration is to use a separate GCP project for each phase of the SDLC. In this example, we will demonstrate how to automate the promotion of Application Integration and Integration Connectors artifacts between two SDLC phases: the development (dev) and qa environments.

image36.png

Prerequisites

  • Application Integration and Integration Connectors are enabled and provisioned. If you have not done so already, please follow this guide.
  • Basic working knowledge of Application Integration and Integration Connectors.
  • Necessary IAM permissions for Application Integration and Integration Connectors.
  • GitHub is used as the example source code repository. A GitHub account is necessary to complete the instructions.
  • Download gcloud and configure the default project:
gcloud config set project $project
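
To confirm the default project took effect, you can read the value back. The snippet below is guarded so it is harmless on machines where gcloud is not installed:

```shell
# Print the currently configured default project ("(unset)" if none, or a
# note if gcloud is not installed on this machine).
if command -v gcloud >/dev/null 2>&1; then
  out=$(gcloud config get-value project 2>/dev/null)
  [ -n "$out" ] || out="(unset)"
else
  out="gcloud not installed; run this in Cloud Shell"
fi
echo "$out"
```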

Principles

There are a few (opinionated) principles followed in this example:

  1. An integration flow developed in the development environment must be deployed unchanged in production. Treat the integration flow like one would source code.
  2. The portions of an integration flow that do change from one environment to another are externalized. Only those properties change between environments. For example, the HTTP URL configured for a REST Task.
  3. There must be traceability between what is stored in a source code repository and an integration flow version deployed in an environment.
  4. Maintain a single repository for each deployable service.
  5. Automate the deployment between environments.
  6. Test in a clone of the production environment.

Steps to Automate Application Integration Deployments

In this example you will see how to store integration and connector assets in a source code repository, promote those assets from one SDLC environment to another, and finally, how such deployments can be automated.

Sample Integration

We will build a sample integration with minimal complexity to demonstrate this use case. This Integration flow calls an API and publishes the response from the API to a Pub/Sub topic. This sample is meant to illustrate the use of REST and Connector tasks.

Create a topic

gcloud pubsub topics create mytopic

Create a connection for Pub/Sub

In the GCP console, navigate to “Integration Connectors” from the left menu and click the “+ CREATE NEW” button to create a new connector.

image39.png

Please change the Service Account and Project ID appropriately, and ensure the service account has privileges to publish to the Pub/Sub topic. Creating the connection may take a few minutes.
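
The publish privilege can be granted with gcloud. The service account name below is a placeholder; substitute the account you selected for the connection:

```shell
# Placeholder values -- replace with your project and the connection's service account.
PROJECT=my-dev-project
SA="connector-sa@${PROJECT}.iam.gserviceaccount.com"

# Allow the service account to publish to the topic created earlier.
# Guarded so the snippet is harmless where gcloud is not installed.
if command -v gcloud >/dev/null 2>&1; then
  gcloud pubsub topics add-iam-policy-binding mytopic \
    --member="serviceAccount:${SA}" \
    --role="roles/pubsub.publisher"
else
  echo "gcloud not installed; run this in Cloud Shell"
fi
```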

Create an Integration Flow as follows

1. Create a new Integration “sample” with description “Sample Integration for CI/CD” and select your appropriate region. Click “CREATE”.

image38.png

2. In the Integration designer, click “+ CREATE” in the Variables section to create some config variables. Config variables enable you to externalize configuration for an integration. With config variables, you can configure aspects of your integration such as connector details, authentication details, or URL endpoints that vary by environment. Make sure you select “Config Variable for Integration” in the Variable Type dropdown. Let’s create a variable called “URL”.

image11.png

3. Similar to the step above, create another config variable and call it “CONN_NAME”.

image19.png

4. Create one more config variable called “TOPIC”.

image10.png

5. Now, let’s create an “input” variable named “user”. Make sure you select “Input to Integration” from the Variable Type dropdown.

image29.png

6. Lastly, we create a variable named “userOutput”. Make sure you select “Output from Integration” from the Variable Type dropdown.

image20.png

 

Once all the variables are created (3 config, 1 input and 1 output variables), you should see them in the Variables section.

image22.png

7. Add an API Trigger and wire it to the REST task. In the REST Task config, select the URL config variable.

image26.png

In the REST Task, click the “+ ADD” next to “URL query string parameters”. Add “user” as the query string key and the variable “user” as the value.

image33.png

8. Add a Connectors task, but do not wire it yet. Select the Pub/Sub connection created previously.

image23.png
Select the connection.

image2.png

Click “NEXT”. Select “Actions” as the TYPE. Click “NEXT” again.

image14.png

Select “Publish Message” for “Action” and click “NEXT”.

image17.png

Click DONE. This should configure the Connector task.

In the connector task, change the “Connection Name” to use the “CONN_NAME” variable.

image21.png

9. Add a data mapper task and wire the tasks as shown below:

image18.png

10. Click the Data Mapping task and open the Data Mapper editor. Map the responseBody variable of the REST task to the input section and the “message” attribute of connectorInputPayload to the output section. Similarly, drag `CONFIG_TOPIC` to the input section and the “topic” attribute of connectorInputPayload to the output section, as shown in the diagram below.

image15.png

11. In the third row of the Mapper, drag responseBody from the left variable pane, then click “+” and select the “TO_JSON()” function. Click “+” again, select “GET_PROPERTY”, and type “args” in the Value field. Click “+” once more, select “GET_PROPERTY”, and type “user” in the Value field. Finally, drag and drop the “userOutput” variable to the output.

image32.png
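
Outside the designer, the mapping chain in this row (TO_JSON, then GET_PROPERTY("args"), then GET_PROPERTY("user")) amounts to pulling args.user out of the response JSON. A rough shell equivalent, using a trimmed illustrative response body rather than the task's exact output:

```shell
# Trimmed, illustrative REST response body (the real httpbin response has more fields).
responseBody='{"args":{"user":"john"},"url":"https://httpbin.org/get?user=john"}'

# Crude stand-in for GET_PROPERTY("args") -> GET_PROPERTY("user"):
# extract the value of the "user" key inside "args".
userOutput=$(printf '%s' "$responseBody" | sed -n 's/.*"user":"\([^"]*\)".*/\1/p')
echo "$userOutput"
```

This prints john, mirroring what the userOutput variable carries after the mapper runs.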

12. Click the Back button to return to the integration designer page. Test the integration to ensure the flow works. To test, click the “TEST” button and, in the input section, provide the following values:

user john
`CONFIG_URL` https://httpbin.org/get
`CONFIG_CONN_NAME` projects/$PROJECT/locations/$REGION/connections/pubsub
`CONFIG_TOPIC` projects/$PROJECT/topics/mytopic

NOTE: Replace PROJECT and REGION with your appropriate values.

image37.png

Now hit “Test integration”. You should get a successful response.

unnamed (2).png

13. Close the execution pane. In the Variable panel on the left, click “connectorOutputPayload” and select “View Details”.

image4.png

In the Variable panel, under Variable Type, change it from “None” to “Output from Integration” and click “SAVE”.

image34.png

Similarly set the “responseBody” variable type to “Output from Integration” and click “SAVE”.

image1.png

Now publish the integration with the same set of config variable values:

`CONFIG_URL` https://httpbin.org/get
`CONFIG_CONN_NAME` projects/$PROJECT/locations/$REGION/connections/pubsub
`CONFIG_TOPIC` projects/$PROJECT/topics/mytopic

NOTE: Replace PROJECT and REGION with your appropriate values.

Now hit “Publish integration”. Click “Test”.

Provide the user input value and then hit “TEST INTEGRATION”.

image24.png

You should get a successful response.

image12.png

In the new response, you will see the output variables printed: the response from the HTTP call, the message ID returned by Pub/Sub, and the “userOutput” variable extracted in the output.

In this example, the following details may change between environments:

  1. The URL used in the REST task
  2. The project where the Pub/Sub connection exists (for example, the CONN_NAME variable)
  3. The topic name
  4. The variable content (for example, the user variable)

Testing

You can test or publish an integration from the integration editor after you have added and configured the necessary triggers, tasks, and edge connections. When you test an integration using the Google Cloud console, the integration is executed in synchronous mode.
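
For reference, a published integration with an API Trigger can also be executed synchronously over REST. The snippet below is a sketch: the trigger id "api_trigger/sample_API_1" is an assumption, so copy the actual id from your API Trigger's configuration panel before using it.

```shell
# Placeholders -- replace with your project, region, and the trigger id shown
# in your API Trigger's configuration panel.
PROJECT=my-dev-project
REGION=us-central1
TRIGGER="api_trigger/sample_API_1"   # assumption: verify against your trigger

# Request body for a synchronous execution with the "user" input parameter.
body='{"triggerId": "'"${TRIGGER}"'", "inputParameters": {"user": {"stringValue": "john"}}}'

# Guarded so the snippet is harmless without gcloud credentials.
if command -v gcloud >/dev/null 2>&1; then
  curl -s -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d "$body" \
    "https://integrations.googleapis.com/v1/projects/${PROJECT}/locations/${REGION}/integrations/sample:execute"
else
  echo "gcloud not installed; run this in Cloud Shell"
fi
```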

Create unit tests

Now let’s create some test cases. First let’s create some “unit” test cases by mocking the REST endpoint response.

1. Click the “ENABLE EDITING” button to open the integration in edit mode.

2. Click the “TEST CASE” button and then select “+ Create a new test case”.

image3.png

3. In the “Create test case” dialog, provide “1_Unit-Tests” as the Test Name and “Unit test cases” as the description, then click “CREATE”. The integration will enter Test Case mode.

4. In Test Case mode, click the “API Trigger” task and provide the input variable for user, say “foo”.

image8.png

5. Now, click the “Call REST Endpoint” task in the integration designer and select “Mock Output” from the “Mock strategy type” dropdown.

image25.png

6. We will mock the response of the REST Task. Provide the following mock data, with the response returning user “foo”:

`Task_1_responseHeader` {"Content-Type":"application/json"}
`Task_1_responseBody` {"headers":{"host":"mock","user-agent":"Google-Cloud-Application-Integration"},"method":"GET","url":"/?user=foo","args":{"user":"foo"},"body":""}
`Task_1_responseStatus` 200 OK

image9.png

 

7. In the Assertion Strategy, click “+ ADD ASSERTION” and leave the default values.

image7.png

8. Now click the Data Mapping task in Test Case mode. Leave the Mock strategy type as “No mock”. In the “Assertion Strategy”, click “+ ADD ASSERTION”.

9. Select “Assert condition” as the type and enter “$user$ = $userOutput$” as the Condition.

image6.png

10. Finally, select the “Publish Message” Connector task in Test Case mode and create the following test assertion.

image7.png

11. Now click “EXIT TEST CASE MODE”.

12. Let's publish the integration by clicking the “PUBLISH” button if it is not already published. Provide the config variable values from above:

`CONFIG_URL` https://httpbin.org/get
`CONFIG_CONN_NAME` projects/$PROJECT/locations/$REGION/connections/pubsub
`CONFIG_TOPIC` projects/$PROJECT/topics/mytopic

NOTE: Replace PROJECT and REGION with your appropriate values

Execute unit test

1. Click “TEST CASE” and select “Open test cases”.

2. Click “1_Unit-Tests” to open it in Test Case mode.

3. Click “TEST”, then click “RUN TEST” to run the test cases. You should see a “Test execution passed” output.

image30.png

4. Click the “GO TO LOGS” link to open the Execution Logs and review the execution.

Create integration tests

Now let’s create some integration test cases.

1. Click the “ENABLE EDITING” button to open the integration in edit mode.

2. Click the “TEST CASE” button and then select “+ Create a new test case”

image3.png

3. In the “Create test case” dialog, provide “2_Integration-Tests” as the Test Name and “Integration test cases” as the description, then click “CREATE”. The integration will enter Test Case mode.

4. In Test Case mode, click the “API Trigger” task and provide the input variable for user, say “bar”.

image28.png

5. Now, click the “Call REST Endpoint” task in the integration designer. In the Assertion Strategy, add an assertion by clicking the “+ ADD ASSERTION” button. Select “Assert parameters” from the dropdown and select `Task_1_responseStatus`. Select “Equals” as the Operator and “200 OK” as the value.

image5.png

6. Now click the Data Mapping task and click “+ ADD ASSERTION”. Select the “Assert condition” type and enter “$user$ = $userOutput$” as the condition.

image6.png

7. Finally, click the “Publish Message” task and click “+ ADD ASSERTION”. Select the “Assert condition” type and enter “exists($`Task_2_connectorOutputPayload`$)” as the condition.

image40.png

8. Now click “EXIT TEST CASE MODE”.
9. Let's publish the integration by clicking the “PUBLISH” button if it is not already published. Provide the config variable values from above:

`CONFIG_URL` https://httpbin.org/get
`CONFIG_CONN_NAME` projects/$PROJECT/locations/$REGION/connections/pubsub
`CONFIG_TOPIC` projects/$PROJECT/topics/mytopic

NOTE: Replace PROJECT and REGION with your appropriate values

Execute integration test

1. Click “TEST CASE” and select “Open test cases”.

2. Click “2_Integration-Tests” to open it in Test Case mode.

3. Click “TEST”, then click “RUN TEST” to run the test cases. You should see a “Test execution passed” output.

image30.png

4. Click the “GO TO LOGS” link to open the Execution Logs and review the execution.

Prepare GitHub

Create a new repository. It is recommended to use a separate repository for each integration flow.

mkdir app-integration-cicd-demo && cd app-integration-cicd-demo

git init
git checkout -b feature/cicd

In this step, the feature/cicd branch is created. The recommended folder structure for Integration and Connectors artifacts is as follows:

├── cloudbuild.yaml #the Cloud Build deployment file
├── <env>
│   ├── connectors
│   │   └── <connector-name>.json #one file per connector; the connector name is the file name
│   ├── config-variables
│   │   └── <integration-name>-config.json #one file per integration
│   ├── authconfigs
│   │   └── <authconfig-name>.json #one file per authconfig; the authconfig name is the file name
│   ├── endpoints
│   │   └── <endpoint-name>.json #one file per endpoint attachment; the endpoint attachment name is the file name
│   ├── zones
│   │   └── <zone-name>.json #one file per managed zone; the managed zone name is the file name
│   ├── sfdcinstances
│   │   └── <instance-name>.json #one file per sfdc instance; the sfdc instance name is the file name
│   ├── sfdcchannels
│   │   └── <instance-name_channel-name>.json #one file per sfdc channel; the file name combines the sfdc instance name and the channel name
│   ├── overrides
│   │   └── overrides.json #always named overrides.json; the only file in this folder
│   ├── tests
│   │   └── <test-name>.json #one file per test case; the test name is the file name
│   └── test-configs
│       └── <test-name>.json #config parameters for the test cases; file names must match those in the tests folder
└── src
    └── <integration-name>.json #the only file in this folder; the integration name is the file name
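
If you want to lay this structure out by hand (integrationcli's scaffold command, introduced in the next section, generates it for you), a portable shell sketch for one environment looks like:

```shell
# Create the recommended layout for one environment; "dev" is an example name.
ENV=dev
for d in connectors config-variables authconfigs endpoints zones \
         sfdcinstances sfdcchannels overrides tests test-configs; do
  mkdir -p "${ENV}/${d}"
done
mkdir -p src

# List what was created.
find "${ENV}" src -type d | sort
```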

Introduction to integrationcli

integrationcli is a tool that lets you manage (create, delete, get, and list) integrations and connections in Application Integration, Integration Connectors, and Apigee Integration/Connector APIs. This example uses the tool to automate deployments. You can see other examples and options here.

Install the CLI with the following command:

curl -L https://raw.githubusercontent.com/GoogleCloudPlatform/application-integration-management-toolkit/main/downloadLatest.sh | sh -

Set integrationcli preferences:

token=$(gcloud auth print-access-token)
project=<set DEV project here>
region=<set DEV region here>

integrationcli prefs set -p $project -r $region -t $token

Create a scaffold for the Integration

integrationcli integrations scaffold -n sample --latest -f . -e dev

Here, “-n sample” is the name of the integration; “-e dev” is the environment name, which creates a directory under which the files are stored; “-f” is the folder in which to generate the artifacts; and “--latest” (default true) scaffolds the version with the highest snapshot number in SNAPSHOT state, falling back to the highest snapshot in DRAFT state if none is found. This command downloads the integration, the connections used by the integration, authconfigs, sfdc instances, channels, and tests into the folder structure mentioned previously.

Overrides file

Integration flows contain values that can change between environments. For example,

  • The Connector task needs to change as you migrate from one project to another.
  • The REST task and Cloud Function tasks may have different values between environments.

integrationcli makes it easy to generate an overrides file for values that typically change between environments.

The file should look like this (./app-integration-cicd-demo/dev/overrides/overrides.json):

{
   "connection_overrides":[
      {
         "taskId":"2",
         "task":"GenericConnectorTask",
         "parameters":{
            "connectionName":"pubsub"
         }
      }
   ],
   "integration_overrides":{
      "databasePersistencePolicy":"DATABASE_PERSISTENCE_POLICY_UNSPECIFIED",
      "enableVariableMasking":false,
      "cloudLoggingDetails":{
         "cloudLoggingSeverity":"CLOUD_LOGGING_SEVERITY_UNSPECIFIED",
         "enableCloudLogging":false
      }
   }
}

The values shown in the file, such as connectionName, can be overridden when the integration is promoted to other environments. Other overrides not automatically captured by the CLI, such as retries or default values, may be added to the file at this time.
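
Since a malformed overrides file only surfaces at apply time, it is worth validating the JSON before committing. A sketch using python3's built-in json.tool (python3 assumed available), run here against a temp copy purely for illustration:

```shell
# Write a minimal overrides fragment to a temp file just to demonstrate validation;
# in practice point json.tool at dev/overrides/overrides.json.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
{"connection_overrides":[{"taskId":"2","task":"GenericConnectorTask","parameters":{"connectionName":"pubsub"}}]}
EOF

# json.tool exits non-zero on malformed JSON.
result=$(python3 -m json.tool "$tmp" >/dev/null 2>&1 && echo valid || echo invalid)
echo "$result"
rm -f "$tmp"
```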

Check in all the content to the feature/cicd branch.

git add --all
git commit -m 'first draft'
git push -u origin feature/cicd

The folder structure will look like this:

├── dev
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   ├── test-configs
│   │   ├── 1_Unit-Tests.json
│   │   └── 2_Integration-Tests.json
│   └── tests
│       ├── 1_Unit-Tests.json
│       └── 2_Integration-Tests.json
└── src
    └── sample.json

Let’s now prepare this integration for the next higher environment, say “qa”. Copy the “dev” directory to “qa”:

mkdir qa && cp -r dev/* qa
├── dev
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   ├── test-configs
│   │   ├── 1_Unit-Tests.json
│   │   └── 2_Integration-Tests.json
│   └── tests
│       ├── 1_Unit-Tests.json
│       └── 2_Integration-Tests.json
├── qa
│   ├── config-variables
│   │   └── sample-config.json
│   ├── connectors
│   │   └── pubsub.json
│   ├── overrides
│   │   └── overrides.json
│   ├── test-configs
│   │   ├── 1_Unit-Tests.json
│   │   └── 2_Integration-Tests.json
│   └── tests
│       ├── 1_Unit-Tests.json
│       └── 2_Integration-Tests.json
└── src
    └── sample.json

In the newly copied “qa” directory, make the qa-specific changes to the sample-config.json, overrides.json, and pubsub.json files. In this example, we will change the config variable values for the qa environment.

overrides.json fragment:

"integration_overrides": {
	"databasePersistencePolicy": "DATABASE_PERSISTENCE_ASYNC",
	"enableVariableMasking": true,
	"cloudLoggingDetails": {
		"cloudLoggingSeverity": "INFO",
		"enableCloudLogging": true
	}
}

sample-config.json:

{
  "CONFIG_CONN_NAME": "projects/$PROJECT/locations/$REGION/connections/pubsub",
  "CONFIG_URL": "https://mocktarget.apigee.net/echo",
  "CONFIG_TOPIC": "projects/$PROJECT/topics/mytopic"
}

Replace the PROJECT and REGION values with your QA project and region.
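
Substituting the placeholders can be scripted rather than done by hand. The snippet below demonstrates on a scratch copy of the file, writing to a new path to stay portable across GNU and BSD sed:

```shell
# Placeholder QA values.
PROJECT=my-qa-project
REGION=us-central1

# Scratch copy of sample-config.json purely for demonstration; in practice
# run the sed against qa/config-variables/sample-config.json.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{
  "CONFIG_CONN_NAME": "projects/$PROJECT/locations/$REGION/connections/pubsub",
  "CONFIG_URL": "https://mocktarget.apigee.net/echo",
  "CONFIG_TOPIC": "projects/$PROJECT/topics/mytopic"
}
EOF

# Substitute the placeholders, writing to a sibling file.
sed -e "s|\$PROJECT|${PROJECT}|g" -e "s|\$REGION|${REGION}|g" "$cfg" > "${cfg}.out"
cat "${cfg}.out"
```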

We can modify the input values of the test cases programmatically by updating the values in the test-configs directory. For example, update dev/test-configs/1_Unit-Tests.json to:

{
	"inputParameters": {
		"user": {
			"stringValue": "foo"
		}
	}
}

NOTE: If you want the value “foo” to be something else, make sure you also update the mock data in your test case, since it is set to return “foo”. The test case will fail if the two values don't match.

And dev/test-configs/2_Integration-Tests.json to:

{
	"inputParameters": {
		"user": {
			"stringValue": "bar"
		}
	}
}

And qa/test-configs/1_Unit-Tests.json to:

{
	"inputParameters": {
		"user": {
			"stringValue": "foo"
		}
	}
}

NOTE: If you want the value “foo” to be something else, make sure you also update the mock data in your test case, since it is set to return “foo”. The test case will fail if the two values don't match.

And qa/test-configs/2_Integration-Tests.json to:

{
	"inputParameters": {
		"user": {
			"stringValue": "bar"
		}
	}
}

Commit all the content to the feature/cicd branch.

git add .
git commit -m 'adding qa configurations'
git push

Manual deployments

In this section, we will demonstrate how to deploy the code manually using the integrationcli tool.

The artifacts in the qa folder can now be deployed to the QA environment (GCP project). Set integrationcli preferences for the QA environment:

token=$(gcloud auth print-access-token)
project=<set QA project here>
region=<set QA region here>

integrationcli prefs set -p $project -r $region -t $token

integrationcli can be used to apply the changes generated by scaffold. This automatically provisions all the necessary entities in the right order.

With manual testing

You can run the apply command to provision all the necessary configurations of the integration.

integrationcli integrations apply -f . -e qa --wait=true -g

NOTE: This command may take a while. Once it completes successfully, you should see the “sample” integration in your QA project, and the details of the integration should reflect the changes you made in the overrides file.

image27.png

Now let's test the integration. Click the “TEST” button and provide a value for the user input field. You will notice that the response now comes from the new QA URL (mocktarget.apigee.net).

image13.png

You can also execute the test cases in this integration, which were imported as part of the apply command. Follow the same steps as earlier to execute the tests.

With automated testing

Now let's run the same apply command with the “--run-tests” argument, which automatically runs the test cases we created:

integrationcli integrations apply -f . -e qa --wait=true -g --run-tests

This triggers the test cases with the input values from the test-configs directory and executes all the test cases from the tests directory. You can use these commands to set up your code promotion process, or use tools like Cloud Deploy or GitHub Actions for the promotion, which is covered in Part 2.
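
Taken together, promoting to any environment boils down to re-pointing the CLI at the target project and re-running apply. A sketch of that sequence as a shell function (the project id in the example call is a placeholder):

```shell
# Promote the checked-in artifacts to one environment.
# Usage: promote <env-folder> <gcp-project> <region>
promote() {
  env="$1"; project="$2"; region="$3"
  integrationcli prefs set -p "$project" -r "$region" -t "$(gcloud auth print-access-token)"
  integrationcli integrations apply -f . -e "$env" --wait=true -g --run-tests
}

# Example (placeholder project id):
# promote qa my-qa-project us-central1
```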

Next steps

Congratulations! In this post, you developed an integration with tests, scaffolded it into a source code repository, and used the integrationcli tool to publish and promote the code to higher environments. In Part 2 of this article, you will see how to use tools like Cloud Deploy and GitHub Actions to do the code promotion along with automated testing.

Version history
Last update:
02-28-2025 03:25 PM