# Amazon Bedrock

Configure Amazon Bedrock as an LLM provider in agentgateway.
## Before you begin
- Set up an agentgateway proxy.
- Make sure that your AWS credentials have access to the Bedrock models that you want to use. Alternatively, you can use an Amazon Bedrock API key.
## Set up access to Amazon Bedrock

- Store your credentials to access the Amazon Bedrock API in a Kubernetes secret.
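  A minimal sketch of creating such a secret, assuming access key authentication. The secret name `bedrock-secret` matches the `secretRef` used later in this guide; the key names (`accessKey`, `secretKey`, `sessionToken`) are assumptions based on common kgateway conventions, so check the API reference for the exact format that your version expects.

  ```shell
  # Store AWS credentials in a secret in the same namespace as the Backend.
  # The key names below are an assumption; verify them against the API reference.
  kubectl create secret generic bedrock-secret -n gloo-system \
    --from-literal=accessKey="$AWS_ACCESS_KEY_ID" \
    --from-literal=secretKey="$AWS_SECRET_ACCESS_KEY" \
    --from-literal=sessionToken="$AWS_SESSION_TOKEN"
  ```

  If you use temporary credentials, remember that the session token expires and the secret must be refreshed.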
- Create a Backend resource to configure an LLM provider that references the AI API key secret.

  ```yaml
  kubectl apply -f- <<EOF
  apiVersion: gateway.kgateway.dev/v1alpha1
  kind: Backend
  metadata:
    name: bedrock
    namespace: gloo-system
  spec:
    type: AI
    ai:
      llm:
        bedrock:
          model: "amazon.nova-micro-v1:0"
          region: us-east-1
          auth:
            type: Secret
            secretRef:
              name: bedrock-secret
  EOF
  ```

  Review the following table to understand this configuration. For more information, see the API reference.
  | Setting | Description |
  | --- | --- |
  | `type` | Set to `AI` to configure this Backend for an AI provider. |
  | `ai` | Define the AI backend configuration. The example uses Amazon Bedrock (`spec.ai.llm.bedrock`). |
  | `model` | The model to use to generate responses. In this example, you use the `amazon.nova-micro-v1:0` model. Keep in mind that some models support cross-region inference. These models begin with a `us.` prefix, such as `us.anthropic.claude-sonnet-4-20250514-v1:0`. For more models, see the AWS Bedrock docs. |
  | `region` | The AWS region where your Bedrock model is deployed. Multiple regions are not supported. |
  | `auth` | Provide the credentials to use to access the Amazon Bedrock API. The example refers to the secret that you previously created. To use IRSA, omit the `auth` settings. |

- Create an HTTPRoute resource to route requests through your agentgateway proxy to the Bedrock Backend. Note that Gloo Gateway automatically rewrites the endpoint that you set up (such as `/bedrock`) to the appropriate chat completion endpoint of the LLM provider, based on the LLM provider that you set up in the Backend resource.

  ```yaml
  kubectl apply -f- <<EOF
  apiVersion: gateway.networking.k8s.io/v1
  kind: HTTPRoute
  metadata:
    name: bedrock
    namespace: gloo-system
  spec:
    parentRefs:
    - name: agentgateway-proxy
      namespace: gloo-system
    rules:
    - matches:
      - path:
          type: PathPrefix
          value: /bedrock
      backendRefs:
      - name: bedrock
        namespace: gloo-system
        group: gateway.kgateway.dev
        kind: Backend
  EOF
  ```

- Send a request to the LLM provider API. Verify that the request succeeds and that you get back a response from the chat completion API.
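  A sketch of such a request, assuming your gateway address is exported in an `INGRESS_GW_ADDRESS` environment variable and the proxy listens on port 8080 (both are assumptions; substitute the address and port of your agentgateway proxy). The `/bedrock` path matches the HTTPRoute that you created, and the request body uses the OpenAI-style chat completion format that the gateway translates for the provider:

  ```shell
  # Send a chat completion request through the agentgateway proxy.
  # INGRESS_GW_ADDRESS and port 8080 are placeholders for your environment.
  curl "http://$INGRESS_GW_ADDRESS:8080/bedrock" \
    -H "Content-Type: application/json" \
    -d '{
      "messages": [
        {
          "role": "user",
          "content": "Explain the benefits of an AI gateway in my Kubernetes cluster in two sentences."
        }
      ]
    }'
  ```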
  Example output:

  ```json
  {
    "metrics": { "latencyMs": 2097 },
    "output": {
      "message": {
        "content": [
          {
            "text": "\nAn AI gateway in your Kubernetes cluster can enhance performance, scalability, and security while simplifying complex operations. It provides a centralized entry point for AI workloads, automates deployment and management, and ensures high availability."
          }
        ],
        "role": "assistant"
      }
    },
    "stopReason": "end_turn",
    "usage": { "inputTokens": 60, "outputTokens": 47, "totalTokens": 107 }
  }
  ```