Configure Anthropic via Amazon Bedrock for Cribl Copilot
Use this topic to connect Anthropic Claude models via Amazon Bedrock as a Bring Your Own Model (BYOM) for Cribl Copilot. This page focuses on:
- What you need from the Anthropic via Amazon Bedrock side.
- How to fill in the ID, Description, and API Key fields in the Cribl UI.
For the generic flow and prerequisites to open AI Settings, start the Custom AI provider modal, and switch back to Cribl-managed large language models (LLMs), see Configure Custom AI Providers.
Prerequisites
In addition to the general prerequisites for configuring your own LLM, you need:
- An AWS account with Amazon Bedrock enabled in at least one Region.
- Access to at least one Anthropic Claude model via Bedrock (for example, Claude 3 Sonnet or Haiku).
- An API key that your organization exposes specifically for Anthropic via Bedrock access (often via an internal API gateway or proxy).
- On-prem only: Network connectivity from the Leader to the endpoint your organization uses for Anthropic via Bedrock (public internet, VPC endpoint, or corporate proxy).
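For on-prem deployments, you can sanity-check connectivity from the Leader before opening the Cribl UI. This is a minimal sketch that assumes the default public Bedrock runtime endpoint; substitute your VPC endpoint or corporate-proxy URL if your organization fronts Bedrock with one, and note that the Region shown is a placeholder.

```shell
# Hypothetical Region; replace with the Region your organization uses.
REGION="us-east-1"
ENDPOINT="https://bedrock-runtime.${REGION}.amazonaws.com"

# From the Leader, confirm the endpoint is reachable over TLS.
# Any HTTP status (even 403/404) proves network connectivity;
# 000 means the Leader could not reach the endpoint at all.
if command -v curl >/dev/null; then
  STATUS=$(curl -sS -o /dev/null -w '%{http_code}' --max-time 10 "${ENDPOINT}" || true)
else
  STATUS="curl not installed"
fi
echo "Response from ${ENDPOINT}: ${STATUS}"
```

A 403 or 404 here is expected without credentials; the point is only to rule out firewall or DNS problems before you configure the provider.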
Step 1: Open the Custom Provider Modal
Navigate to your AI Settings to start the configuration:
- Select Use Custom AI Providers (or Try it!) to open the configuration modal.
- Anthropic via Amazon Bedrock is the default selected provider type.
Step 2: Provide ID and Description
These fields control how the provider is identified and displayed within your Cribl environment.
- ID: Enter a short, unique identifier (for example, anthropic-bedrock-prod). Avoid putting secrets or keys in this field.
- Description: Enter a human-readable label for the provider card (for example, Anthropic Claude via Amazon Bedrock - us-east-1).
Step 3: Provide the API Key
The API Key is the single credential the Cribl AI Service uses to authenticate when routing supported Copilot requests to Anthropic via Bedrock. The key is stored securely in the Cribl secret store and is never shown to end users.
- API Key: Paste the API key or bearer token provided by your AWS team.
Depending on your environment setup, this key is one of the following:
- Amazon Bedrock API key: A native AWS bearer token. This is the modern, streamlined way to access Bedrock without managing IAM users.
- Gateway/proxy key: A custom token generated by the internal API gateway (such as a MuleSoft or Apigee layer) that fronts your AWS environment.
- Combined AWS credential: In some specialized configurations, this might be a formatted string combining your credentials, though a dedicated API key is preferred.
If you are unsure where to get this key, ask the team that manages AI providers and Bedrock access to issue or confirm the correct key for:
- Cribl.Cloud (if you are configuring a Cribl.Cloud Workspace), or
- The on-prem Leader (if you are configuring an on-prem deployment).
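If you were issued a native Amazon Bedrock API key, you can check that it works before pasting it into Cribl. This sketch assumes the key is accepted as a bearer token against the Bedrock ListFoundationModels call; a gateway/proxy key would instead be tested against your gateway's own URL. The Region and key values are placeholders.

```shell
# Hypothetical placeholders; substitute your Region and real key.
REGION="us-east-1"
BEDROCK_API_KEY="replace-with-your-key"

# ListFoundationModels with bearer auth: HTTP 200 means the key is
# accepted; 403 usually means an invalid key or missing model access.
if command -v curl >/dev/null; then
  STATUS=$(curl -sS -o /dev/null -w '%{http_code}' --max-time 10 \
    -H "Authorization: Bearer ${BEDROCK_API_KEY}" \
    "https://bedrock.${REGION}.amazonaws.com/foundation-models" || true)
else
  STATUS="curl not installed"
fi
echo "Key check returned HTTP ${STATUS}"
```

If this check fails but the key is supposed to be valid, confirm with your AWS team which endpoint and auth scheme your organization expects.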
Step 4: Test and Save the Configuration
Select Test Connection in the modal.
- If the test succeeds, you will see a success indicator.
- If it fails, verify:
- The API Key is correct and active.
- The key has the correct permissions to access Anthropic Claude models via Bedrock.
- On-prem: The Leader can reach the configured endpoint (no firewall or proxy issues).
When all fields are valid and the test succeeds, select Save.
After you save:
- A Custom AI provider card appears at the top of AI settings, showing your ID/Description and Anthropic via Amazon Bedrock as the provider type.
- Supported Copilot capabilities in that workspace begin using Anthropic via Bedrock as their AI backend.
Step 5: Verify Copilot Behavior
To confirm that Anthropic via Bedrock is correctly configured:
- In AI Settings, confirm that the Custom AI provider card lists Anthropic via Amazon Bedrock.
- Use a Copilot capability that supports custom providers (for example, the KQL assistant or Git commit message suggestions).
- Verify that:
- Requests succeed without provider-related errors.
- Latency and behavior align with your expectations for the Anthropic models and Regions you use.
If you see failures or unexpected behavior:
- Re-run Test Connection in the modal and review any error messages.
- Confirm with your AWS/AI team that:
- The key is valid and not rate-limited.
- Anthropic Claude models are enabled and accessible in the intended Region(s).
- For on-prem deployments: Network rules allow the Leader to reach the Anthropic/Bedrock gateway or endpoint.
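Your AWS/AI team can confirm Claude model availability with the AWS CLI. This sketch assumes AWS credentials are already configured on the host (recent CLI versions also honor a Bedrock API key exported as AWS_BEARER_TOKEN_BEDROCK); the Region is a placeholder.

```shell
REGION="us-east-1"   # hypothetical; use your intended Region

if command -v aws >/dev/null; then
  # Lists the Anthropic model IDs visible to this account in the Region.
  # An empty list (or AccessDenied) means model access still needs to be
  # requested in the Amazon Bedrock console.
  aws bedrock list-foundation-models \
    --by-provider anthropic \
    --region "${REGION}" \
    --query 'modelSummaries[].modelId' \
    --output text || echo "Call failed: check credentials and Region"
else
  echo "AWS CLI not installed on this host"
fi
```

Run this once per Region you plan to use, since Bedrock model access is granted per Region.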
Change or Stop Using Anthropic via Bedrock
- To update the Anthropic provider (for example, after a key rotation), see Edit an Existing Custom AI Provider.
- To stop using Anthropic via Bedrock, see Stop Using a Custom AI Provider.
- To use your own LLM again later, use the instructions in this topic to reopen the modal, choose Anthropic via Amazon Bedrock, and enter your configuration details.