feat: update promptCache documentation to include bedrock (anthropic models) (#442)

Author: Dustin Healy
Date: 2025-11-25 06:23:29 -08:00
Committed by: GitHub
parent 8b7de0cc6c
commit 39d7844bd7
2 changed files with 5 additions and 4 deletions


@@ -512,7 +512,8 @@ preset:
 > **Note:** Each parameter below includes a note on which endpoints support it.
 > **OpenAI / AzureOpenAI / Custom** typically support `temperature`, `presence_penalty`, `frequency_penalty`, `stop`, `top_p`, `max_tokens`.
-> **Google / Anthropic** typically support `topP`, `topK`, `maxOutputTokens`, `promptCache` (Anthropic only).
+> **Google / Anthropic** typically support `topP`, `topK`, `maxOutputTokens`.
+> **Anthropic / Bedrock (Anthropic models)** support `promptCache`.
 > **Bedrock** supports `region`, `maxTokens`, and a few others.
 #### model
@@ -709,7 +710,7 @@ preset:
 #### promptCache
-> **Supported by:** `anthropic`
+> **Supported by:** `anthropic`, `bedrock` (Anthropic models)
 > (Toggle Anthropics “prompt-caching” feature)
 <OptionTable
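The hunk above documents `promptCache` as a boolean preset option. A minimal sketch of how such a toggle might look in a preset config, assuming a simple key/value schema; the `endpoint` and `model` values are hypothetical examples, not taken from the diff:

```yaml
preset:
  endpoint: "anthropic"        # hypothetical example value
  model: "claude-3-5-sonnet"   # hypothetical example value
  promptCache: true            # toggle Anthropic's prompt-caching on
```

Setting `promptCache: false` would disable the feature for that preset.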


@@ -182,14 +182,14 @@ reasoning_effort, reasoning_summary, verbosity, useResponsesApi, web_search, dis
 topP, topK, maxOutputTokens, thinking, thinkingBudget, web_search
 ```
-**Anthropic Specific:**
+**Anthropic, Bedrock (Anthropic models):**
 Set this to `true` or `false` to toggle the "prompt-caching":
 ```bash
 promptCache
 ```
-More info: https://www.anthropic.com/news/prompt-caching
+More info: https://www.anthropic.com/news/prompt-caching, https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-caching.html#prompt-caching-get-started
 **Bedrock:**
 ```bash