Adding self-service capabilities to your Landing Zone

In this article, I evaluate how Application Integration can help you grow your Google Cloud Landing Zone with data and increase efficiency through self-service. After introducing Google Cloud’s Application Integration, we illustrate the concept with a “new project request” process and store the collected data in a central landing zone metadata platform. The data in this platform serves as a central overview and opens up opportunities for new automations to manage and expand foundation services such as project governance, security, FinOps and much more.

Disclaimer: I work at Google in the cloud team. Opinions are my own and not the views of my current employer.

Introducing Application Integration

As organizations grow their application landscape to run business operations, it is no longer sustainable to implement point-to-point integrations with custom code. Cloud adoption and the rise of low-code and Gen AI applications will only accelerate this growth and complexity. This is where Application Integration comes into the picture.

Application Integration is an Integration-Platform-as-a-Service (iPaaS) solution in Google Cloud. It allows you to connect applications with point-and-click configurations instead of code. The visual interface and large set of connectors make it simple to automate business processes and workflows quickly. It comes with many out-of-the-box triggers, connections and tasks, which make it easy to set up and expand as needed.

0_vW8IKHXTzbsF5w46.png

You can find more information on the product landing page.

Requesting a new project

In the blog post Growing your Google Cloud Landing zone with data, I evaluate how data can enrich a landing zone and set you up for a modern, scalable, future-proof and user-centric solution. Typically, organizations set up a dedicated team to manage the landing zone and enable the different workload teams. This can be achieved by organizing automated services, orchestrating them centrally and storing the necessary data in a landing zone metadata platform.

User story: As an IT Team Lead, I want to request a new GCP project, so that my team can start their development work.

User journey:

  • The user navigates to a request form and fills out the necessary information.
  • After clicking submit, the form triggers an automated workflow to register the request and take any next action that you defined in this process.
  • The request data is stored in the foundation’s data store so that it can be used for insights and other automation services.

To capture the request details, we will use a Google Form, but many options are possible here: a no-code solution such as AppSheet, a low-code solution such as Streamlit, a custom solution, or a SaaS solution (for instance Confluence, ServiceNow, etc.). The integration connectors allow you to use the service of your choice and connect to it.

  • Create a Google Form to capture the request details.
  • We will capture a couple of fields for illustration purposes only: Requestor, Technical Owner, Business Owner and Project Name.
  • It is advised to define the information you want to collect using fields as well as structured / templated inputs. The form should not become overly complicated, and new inputs should not cause regressions or require process changes every time the form evolves.

0_XpnulGd3YSEjRRJA.png

Learn more about Google Forms here.

The next step is to create the integration in Google Cloud. Get started by following the guides in the documentation; the linked pages describe the detailed steps.

  • Create a new integration ‘project-request’
  • On the canvas, we start by defining the trigger for the automation. For this use case, we will use a webhook trigger. Make sure to create a service account with the required IAM roles.
  • After configuring the webhook trigger, an event listener endpoint is available in the event subscription details. This is the endpoint to call upon submission of the user input (the Google Form created above).
  • The next step is therefore to add an Apps Script to the form that calls this trigger.
    - Open the Google Form in edit mode
    - From the menu, select the script editor and add the following code, setting the event listener endpoint in the POST_URL variable and any required authentication parameters.
var POST_URL = ""; // event listener endpoint of the webhook trigger

function onSubmit(e) {
  var form = FormApp.getActiveForm();
  var allResponses = form.getResponses();
  var latestResponse = allResponses[allResponses.length - 1];
  var response = latestResponse.getItemResponses();
  var payload = { "eventContent": "formSubmitted", "event": {} };

  // Turn each question/answer pair into a payload field,
  // e.g. "Project Name" becomes payload.event.ProjectName.
  for (var i = 0; i < response.length; i++) {
    var question = response[i].getItem().getTitle().replace(/\s/g, '');
    var answer = response[i].getResponse();
    payload.event[question] = answer;
  }

  var options = {
    "method": "POST",
    "contentType": "application/json",
    "payload": JSON.stringify(payload),
  };
  // Basic authentication for the webhook trigger: base64-encode the
  // credentials you configured for the trigger.
  options.headers = { "Authorization": "Basic " + Utilities.base64Encode("") };

  var result = UrlFetchApp.fetch(POST_URL, options);
  console.log(JSON.stringify(payload));
  console.log(result.getResponseCode());
}
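With the example fields above, the script posts a payload like the following to the webhook trigger (all values are illustrative):

{
  "eventContent": "formSubmitted",
  "event": {
    "Requestor": "jane.doe@example.com",
    "TechnicalOwner": "john.smith@example.com",
    "BusinessOwner": "ann.jones@example.com",
    "ProjectName": "team-sandbox"
  }
}

This is the JSON object that the integration receives as input and that we will unpack in the next steps.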
  • The next step is to create a new trigger that executes the script. Go to the triggers menu, select the onSubmit function with the “On form submit” event type and save the configuration. Now, each time the form is submitted, the code is executed and the integration trigger is called (a programmatic alternative is sketched below the screenshot).

0_IBwKb4z0I2MUh8n6.png
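If you prefer code over the triggers UI, the same trigger can be created programmatically with the Apps Script ScriptApp service. A minimal sketch, assuming the handler above is named onSubmit (run it once from the script editor):

// One-time setup: installs an "on form submit" trigger for this form
// that calls the onSubmit handler defined above.
function createSubmitTrigger() {
  ScriptApp.newTrigger('onSubmit')
      .forForm(FormApp.getActiveForm())
      .onFormSubmit()
      .create();
}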

  • At this stage, you can test the connectivity between the Google Form and the integration.
  • With the connectivity test successful, let’s design the automation flow.
Register the request

For the first version, let’s register the request in a BigQuery table and send out a confirmation email.

0_OtJbTnG8Mszklsac.png

  • The Data Mapping task allows us to extract the input values from the JSON object into variables. You could also restructure the provided input record and use the JSON schema for inserting the data into BigQuery (see example here). In this example, we will use a custom query to insert the data into BigQuery.

0_mts12aF-cBfO-mQJ.png

  • To insert the data into BigQuery, we will make use of the BigQuery Connector.
    - Connect to or create a dataset to store the data. Make sure to grant the necessary IAM roles to the service account that the connector will use.
    - Create the target table for storing the data. For this example, we only store the input data, but it is advised to add more details for tracking purposes, such as the request date and status fields, as needed.
    - In the connector task, specify the insert query in the task input (custom query) and bind the variables to it (a sketch follows after this list).
  • Finally, add the Send Email task to the canvas to send a confirmation message for receiving the request.
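As a minimal sketch, assuming a dataset named landing_zone (all table and column names are illustrative), the target table and the custom insert query could look as follows; each ? placeholder is bound to one of the variables extracted in the Data Mapping task:

CREATE TABLE landing_zone.project_requests (
  requestor       STRING,
  technical_owner STRING,
  business_owner  STRING,
  project_name    STRING,
  request_date    TIMESTAMP,
  status          STRING
);

INSERT INTO landing_zone.project_requests
  (requestor, technical_owner, business_owner, project_name, request_date, status)
VALUES
  (?, ?, ?, ?, CURRENT_TIMESTAMP(), 'NEW');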

0_Y9tMRiJKedno8c8w.png

From now on, new features and steps can be added as needed.

Add an approval step

For our next version, let’s add an approval task. This allows us to configure an approval-based integration to control the flow of the process. When control reaches the Approval task, execution is halted and all tasks after the Approval task are suspended. The integration resumes execution only when a user manually approves or rejects the approval request.

  • Insert the approval step and configure settings such as the recipient, message and expiration. Consider updating the confirmation message according to the new workflow. Note that in our case we have only configured the approval flow; in a real environment, the rejection flow needs to be handled as well.
  • Design the logic of your automation process by making use of edges and edge conditions. While edges connect the elements (steps) in your process, edge conditions let you specify what must be met for control of the integration to pass to the task connected by the edge. Which steps need to be executed in sequence, and which can run in parallel?
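For example, the edge that continues the flow after the approval could carry a condition on a status variable set earlier in the process, such as $RequestStatus$ = “APPROVED” (the variable name is illustrative); the rejection branch would use the inverse condition.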

0_1ErxOCBZlr-EJFs3.png

0_jlkWtZM2aQ_JJtQN.png

 0_B84CMw6W0mMYrNGQ.png

Add a Generative AI step

Finally, we will add a call to Gemini to illustrate how our automated process integrates with Generative AI. For instance, let’s have Generative AI create a more personalized email message. Of course, you can consider any type of use case here for your own workflows.

  • Create a Vertex AI Integration Connector
  • Add the integration connector to the canvas and configure it to use the aiplatform.projects.location.endpoints.generateContent action. For additional guidance, have a look at the following video walkthrough / GitHub. Be careful not to use the Vertex AI - Predict task from the tasks menu, since it uses a different endpoint.

0_70bnYhwx8BI6d1RT.png

Adding a data mapping task before the connector helps to build the appropriate input parameters for the Gemini model.

  • Bind the value projects/$project_id/locations/$location/publishers/google/models/$model_id to the connectorInputPayload path parameter parameters.model.

Screenshot 2025-02-06 12.00.13 PM.png

  • Bind a prompt payload to the connectorInputPayload request body field RequestBody.contents.

Screenshot 2025-02-06 12.04.48 PM.png

Screenshot 2025-02-06 12.06.05 PM.png

  • Note that you can create a fully “rendered” JSON value with the help of the Resolve Template function:

Screenshot 2025-02-06 12.06.47 PM.png
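As a sketch, the rendered JSON value bound to RequestBody.contents could look like the following, matching the Gemini generateContent request structure (the prompt text and the $ProjectName$ variable reference are illustrative):

[
  {
    "role": "user",
    "parts": [
      {
        "text": "Write a short, friendly confirmation email for the new project request $ProjectName$."
      }
    ]
  }
]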

Adding a JavaScript task after the connector helps to extract the output from the Gemini model and make it available downstream as a parameter:

function executeScript(event) {
  // Read the raw output of the connector task ("Task_7" refers to the
  // Vertex AI connector task on the canvas; adjust to your task number).
  var input = event.getParameter("`Task_7_connectorOutputPayload`");
  var obj = JSON.parse(input[0]["ResponseBody"]);
  // Extract the generated text from the first candidate.
  var output = obj['candidates'][0]['content']['parts'][0]['text'];
  // Expose the result as an integration variable for downstream tasks.
  event.setParameter("`GeminiOutput`", output);
}
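The GeminiOutput variable can then be bound to, for instance, the body of the Send Email task, so that the requestor receives the personalized message generated by Gemini.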

More information about the Gemini API structure can be found here.

And much more…

Now that we have illustrated the basic concept of automation processes using Application Integration, it is your turn to get started and ideate on how you can bring this to life for your organization. Interested in the technical capabilities? Have a look at the documentation and start experimenting.

Tips for success

The following tips help you successfully design and implement automation workflows that improve efficiency, reduce errors, and free up your team to focus on more valuable tasks. Remember that this is a journey, and each step taken will bring you closer to a more automated and efficient future.

  • Start with a simple process: Don’t try to automate everything at once. Make it relevant for your organization and look for opportunities in existing challenges.
  • Be prepared to iterate: Automation is an ongoing process. Be ready to make changes and improvements as you learn more.
  • Invest in (meta)data management and governance from the start: Effective governance is essential for maintaining control and ensuring your automation initiatives align with your overall business objectives.
  • Involve stakeholders: Get input from the people who are currently doing the manual work.
  • Communicate effectively: Keep everyone informed about the progress of the automation initiative.

Read more in my blog post Architecting for Automation: A Holistic Approach to Drive Efficiency and Innovation in the Cloud.
