Dear Team members,
Has anyone used Apigee as an AI gateway to integrate with the various LLMs exposed by AWS Bedrock, e.g. Llama, Titan, Anthropic?
I am trying to understand whether Apigee has out-of-the-box policies through which I can provide credentials, a model ID, and a prompt, and the proxy returns the model's response. Kong, for example, offers something like this OOB.
Kind Regards
Arijit
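To make the question concrete: each Bedrock model family expects a different request body for the `InvokeModel` API, which is part of what a gateway layer would normalize. A minimal sketch, assuming the documented Bedrock request schemas for these families (this is illustrative Python, not Apigee policy code; the helper name is made up):

```python
import json

def build_bedrock_body(model_id: str, prompt: str, max_tokens: int = 256) -> str:
    """Build the model-family-specific JSON body for Bedrock's InvokeModel API.

    Each family on Bedrock uses a different schema, which is one reason a
    gateway (Apigee, Kong, ...) is useful for presenting a single interface.
    Schemas below follow the Bedrock documentation for these families.
    """
    if model_id.startswith("anthropic."):
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": prompt}],
        }
    elif model_id.startswith("amazon.titan"):
        body = {
            "inputText": prompt,
            "textGenerationConfig": {"maxTokenCount": max_tokens},
        }
    elif model_id.startswith("meta.llama"):
        body = {"prompt": prompt, "max_gen_len": max_tokens}
    else:
        raise ValueError(f"Unrecognized model family: {model_id}")
    return json.dumps(body)

# The same prompt yields a different payload per model family.
print(build_bedrock_body("anthropic.claude-3-haiku-20240307-v1:0", "Hello"))
print(build_bedrock_body("amazon.titan-text-express-v1", "Hello"))
```

A gateway doing this translation server-side is essentially what the question is asking Apigee to provide out of the box.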
Hi @Arijit_apigee - Great question!
We have a detailed blog post on this very topic (link here). Among other things it covers how Apigee can act as a gateway between your LLM application and models.
We also have detailed code samples and guides here for you to get hands on.
Hope this helps,
Mark.
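In Apigee terms, "acting as a gateway" typically means chaining standard policies in a proxy rather than using a single dedicated LLM policy. A hedged sketch of one such step, an AssignMessage policy routing to Bedrock's `InvokeModel` path (the policy name and the `x-model-id` header are illustrative assumptions; request signing is not shown):

```
<AssignMessage name="AM-SetBedrockRequest">
  <Set>
    <Headers>
      <Header name="Content-Type">application/json</Header>
    </Headers>
    <!-- Bedrock runtime invoke path: /model/{modelId}/invoke -->
    <Path>/model/{request.header.x-model-id}/invoke</Path>
  </Set>
</AssignMessage>
```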
Hi Mark @markjkelly ,
This is helpful and a good starting point. I see there are a lot of GitHub code samples implementing certain capabilities. Are these officially supported by Apigee? My impression is that the capabilities are achieved by plumbing together various niche Apigee policies.
Kind Regards
Arijit
Hi @Arijit_apigee - The samples themselves are intended to be a guide or reference and are supported through this community.