
Apigee as an AI Gateway to Access Bedrock

Dear Team members,

Has anyone used Apigee as an AI gateway to integrate with the various LLM models exposed by AWS Bedrock, e.g. Llama, Titan, Anthropic?

I am trying to understand whether Apigee has out-of-the-box policies through which I can provide credentials, a model ID, and a prompt, and the proxy returns the model's response. This is something I see Kong offers out of the box.
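
For illustration, something like the following is what I have in mind. The proxy URL, path, API key header, and payload shape are purely hypothetical, not an existing Apigee feature:

# Hypothetical illustration of the client call I have in mind; the proxy URL,
# path, API key header, and payload shape are assumptions, not an Apigee feature.
import requests

APIGEE_PROXY_URL = "https://my-org-my-env.apigee.net/bedrock/invoke"  # hypothetical proxy endpoint

payload = {
    "modelId": "anthropic.claude-v2",  # Bedrock model identifier
    "prompt": "Summarise this support ticket in two sentences.",
}

resp = requests.post(
    APIGEE_PROXY_URL,
    json=payload,
    headers={"x-apikey": "<APIGEE_APP_KEY>"},  # AWS credentials would stay inside the gateway
    timeout=30,
)
resp.raise_for_status()
print(resp.json())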

Kind Regards

Arijit

Solved

3 REPLIES

Hi @Arijit_apigee - Great question!


We have a detailed blog post on this very topic (link here). Among other things, it covers how Apigee can act as a gateway between your LLM application and models.

We also have detailed code samples and guides here for you to get hands-on.
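
For context, the upstream call that an Apigee proxy would ultimately front is Bedrock's InvokeModel API. A rough boto3 sketch is below; the model ID and request body are illustrative only (each Bedrock model family expects its own schema) and this is not lifted from the samples:

# Rough sketch of the upstream Bedrock call an Apigee proxy could front.
# Model ID and body shape are illustrative; Llama, Titan, and Anthropic models
# each expect a different request schema.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "\n\nHuman: What is an AI gateway?\n\nAssistant:",
    "max_tokens_to_sample": 256,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # example only
    body=body,
    contentType="application/json",
    accept="application/json",
)

print(json.loads(response["body"].read()))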

Hope this helps,

Mark.

Hi Mark @markjkelly ,

This is helpful and a good place to start. I see there are a lot of GitHub code samples implementing certain capabilities. Are these officially supported by Apigee? My impression is that the capabilities are achieved by plumbing together various niche Apigee policies.

Kind Regards

Arijit

Hi @Arijit_apigee - The samples themselves are intended to be a guide or reference and are supported through this community.