Customize the Gemini SDK with Apigee, Model Armor & SDP PII Prompt Masking

This article and sample show how Apigee can proxy and customize the Gemini SDK, adding monitoring and security policies to an organization's LLM usage, including Model Armor to mask PII data that users may inadvertently add to prompts. This can be done in three calls, so let's dig in.

You can find the source code here on GitHub: Apigee Gemini SDK Custom Endpoint with Model Armor & SDP Prompt Masking

Basic setup

This sample sets up the following architecture, proxying the Gemini SDK's calls to the Gemini API so that additional security and other checks can be applied to the model communication.

[Architecture diagram: Gemini SDK calls proxied through Apigee to the Gemini API]

The advantage of this approach is that Model Armor, plus additional logging, metrics, etc., is consistently applied for all users of the SDK with no additional effort. This makes offering AI services in an organization much more secure and scalable than if every client had to implement this themselves. It also covers any backend model automatically, with no additional effort for clients.

Sensitive Data Protection (SDP) configuration

In order to mask the user's prompts, we need to configure SDP with the infoTypes that should be identified, and optionally how they should be masked. This configuration can be done with two commands that create the inspection and de-identification templates.

# create sdp regional inspection template
curl -X POST "https://dlp.googleapis.com/v2/projects/$PROJECT_ID/locations/$REGION/inspectTemplates" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "x-goog-user-project: $PROJECT_ID" \
-H "Content-Type: application/json" \
--data-binary @- << EOF
{
  "templateId": "ma-template1",
  "inspectTemplate":{
    "displayName":"Personal info de-identify",
    "description":"Scans for personal info.",
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"PHONE_NUMBER"
        },
        {
          "name":"US_TOLLFREE_PHONE_NUMBER"
        },
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"IP_ADDRESS"
        },
        {
          "name":"STREET_ADDRESS"
        },
        {
          "name":"GENERIC_ID"
        }
      ],
      "minLikelihood":"POSSIBLE",
      "includeQuote":true
    }
  }
}
EOF
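
To confirm the template was created, you can fetch it back as a quick sanity check (the resource name below assumes the templateId from the request above):

# verify the inspection template exists
curl -X GET "https://dlp.googleapis.com/v2/projects/$PROJECT_ID/locations/$REGION/inspectTemplates/ma-template1" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "x-goog-user-project: $PROJECT_ID"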

# create sdp de-identify and masking template
curl -X POST "https://dlp.googleapis.com/v2/projects/$PROJECT_ID/locations/$REGION/deidentifyTemplates" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "x-goog-user-project: $PROJECT_ID" \
-H "Content-Type: application/json" \
--data-binary @- << EOF
{
  "templateId": "ma-template1",
  "deidentifyTemplate":{
    "displayName":"Personal info masker",
    "description":"De-identifies personal data with a series of asterisks.",
    "deidentifyConfig":{
      "infoTypeTransformations":{
        "transformations":[
          {
            "infoTypes":[
			        {
			          "name":"PHONE_NUMBER"
			        },
			        {
			          "name":"US_TOLLFREE_PHONE_NUMBER"
			        },
			        {
			          "name":"EMAIL_ADDRESS"
			        },
			        {
			          "name":"IP_ADDRESS"
			        },
			        {
			          "name":"STREET_ADDRESS"
			        },
			        {
			          "name":"GENERIC_ID"
			        }
            ],
            "primitiveTransformation":{
              "characterMaskConfig":{
                "charactersToIgnore":[
                  {
                    "charactersToSkip":"@"
                  }
                ],
                "maskingCharacter":"*"
              }
            }
          }
        ]
      }
    }
  }
}
EOF
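
Optionally, you can test the two templates directly against SDP's content:deidentify endpoint before wiring them into Apigee; the sketch below assumes the template names created above and sends a sample prompt to verify the masking:

# optional: test the templates directly with a sample prompt
curl -X POST "https://dlp.googleapis.com/v2/projects/$PROJECT_ID/locations/$REGION/content:deidentify" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "x-goog-user-project: $PROJECT_ID" \
-H "Content-Type: application/json" \
--data-binary @- << EOF
{
  "item": { "value": "My email is test@example.com" },
  "inspectTemplateName": "projects/$PROJECT_ID/locations/$REGION/inspectTemplates/ma-template1",
  "deidentifyTemplateName": "projects/$PROJECT_ID/locations/$REGION/deidentifyTemplates/ma-template1"
}
EOF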

Apigee proxy configuration

Now we can import an Apigee proxy for the Gemini SDK that calls Model Armor & SDP, and adds logging, metrics & analytics to all of the LLM calls that go through the SDK. This is done transparently for the user - the SDK functions just like before, only with the added security, monitoring & other policies applied. For the import, our last call uses the apigeecli tool, which can automate anything in Apigee through simple CLI calls. The -sa flag passes a service account that has the roles/modelarmor.user role, which the proxy needs to apply the masking policy. With this single call we import the Apigee proxy along with the Model Armor security policies and the additional features.

apigeecli apis create bundle -f apiproxy --name VertexAIProxy-v1 \
  -o $PROJECT_ID -e $APIGEE_ENV \
  -sa genai-service@$PROJECT_ID.iam.gserviceaccount.com \
  --ovr -t $(gcloud auth print-access-token)
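
If the genai-service service account doesn't exist yet, it can be created and granted the required role beforehand; a minimal sketch with gcloud (adjust to your project's IAM setup):

# create the service account and grant it the Model Armor user role
gcloud iam service-accounts create genai-service --project $PROJECT_ID
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member "serviceAccount:genai-service@$PROJECT_ID.iam.gserviceaccount.com" \
  --role "roles/modelarmor.user"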

Client configuration

After the Apigee proxy deployment is done as documented in the repo, the app can set the proxy endpoint as the baseUrl when creating the GoogleGenAI object from the Google Gemini SDK. In this case the project, region and baseUrl are passed in as environment variables.

import { GoogleGenAI } from "@google/genai";

// create google gen ai object, pointing its base URL at the Apigee proxy
const ai = new GoogleGenAI({
  vertexai: true,
  project: process.env.PROJECT_ID,
  location: process.env.REGION,
  httpOptions: {
    baseUrl: process.env.APIGEE_ENDPOINT + "/v1/vertexai",
    headers: {
      "EnableModelArmor": "true"
    }
  }
});
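
With the proxy endpoint set as the baseUrl, requests go through Apigee transparently; for example, a sketch assuming a Gemini model that is available in your project:

// send a prompt through the Apigee proxy, exactly as with the standard endpoint
const response = await ai.models.generateContent({
  model: "gemini-2.0-flash", // example model name - use one enabled in your project
  contents: "My email is test@example.com, can you help me reset my password?"
});
console.log(response.text);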

Additionally, a custom header EnableModelArmor was added to easily turn the Model Armor masking on and off for demo purposes.

After starting the app as detailed in the repo, you can ask the model questions about your email or physical address, tax ID, telephone number, and so on.

[Screenshot: chat app prompt containing personal data]

If you start a debug trace session in Apigee, you can verify that the proxy is masking the prompts before sending them to the Gemini API.

[Screenshot: Apigee debug trace showing the masked prompt]

So, for example, the prompt "My email is test@example.com, can you help me reset my password?" will be masked to "My email is ****@************, can you help me reset my password?", and only the masked prompt is forwarded to the LLM.

Advantages

The advantage of this approach is that it is completely transparent for SDK users - all they have to do is set a different baseUrl, and they can safely use organizational LLM models in their apps without coding against a new API. The Apigee proxy can also be offered through an API portal or other channels, with the SDK being an easy way for developers to get immediate access and authenticate with their GCP credentials (although other auth methods such as API keys could easily be added at the proxy level).

Ideas or suggestions to build this demo further out with more features? Let us know in the comments.
