clientOptions: Overridable Anthropic ClientOptions.
model: Model name to use.
modelName: Model name to use (alias for model).
Optional anthropicApiKey: Anthropic API key.
Optional anthropicApiUrl: Anthropic API URL.
Optional apiKey: Anthropic API key.
Optional invocationKwargs: Holds any additional parameters that are valid to pass to anthropic.messages and that are not explicitly specified on this class.
Optional maxTokens: A maximum number of tokens to generate before stopping.
Optional maxTokensToSample: A maximum number of tokens to generate before stopping (alias for maxTokens).
Optional stopSequences: A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog agent.
Optional streamUsage: Whether or not to include token usage data in streamed chunks.
Optional streaming: Whether to stream the results or not.
Optional temperature: Amount of randomness injected into the response. Ranges from 0 to 1. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks.
Optional topK: Only sample from the top K options for each subsequent token. Used to remove "long tail" low-probability responses. Defaults to -1, which disables it.
Optional topP: Performs nucleus sampling, in which we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches the probability specified by topP. Defaults to -1, which disables it. Note that you should alter either temperature or topP, but not both.
Input to the AnthropicChat (ChatAnthropic) class.
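As a minimal sketch of how these fields fit together, assuming they are passed to the ChatAnthropic constructor exported by the @langchain/anthropic package (the model name, header, and metadata values below are placeholders, not documented defaults):

```typescript
import { ChatAnthropic } from "@langchain/anthropic";

// Field names mirror the options documented above; concrete values are assumptions.
const chat = new ChatAnthropic({
  model: "claude-3-5-sonnet-20240620", // Model name to use (placeholder)
  temperature: 0.2,                    // closer to 0 for analytical tasks
  maxTokens: 1024,                     // stop after at most 1024 generated tokens
  stopSequences: ["\n\nHuman:"],       // stop at the next dialog turn cue
  apiKey: process.env.ANTHROPIC_API_KEY,
  // Overridable Anthropic ClientOptions, forwarded to the underlying SDK client.
  clientOptions: {
    defaultHeaders: { "X-Example-Header": "demo" }, // hypothetical header
  },
  // Extra parameters passed through to anthropic.messages that are not
  // explicitly specified on this class.
  invocationKwargs: {
    metadata: { user_id: "example-user" }, // hypothetical metadata
  },
});

// Top-level await assumes an ES module context.
const response = await chat.invoke("Explain nucleus sampling in one sentence.");
console.log(response.content);
```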
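The streaming-related options can be sketched the same way; this assumes chat.stream yields message chunks and that, with streamUsage enabled, token usage arrives on a usage_metadata field of each chunk (an assumption about the current @langchain/core message type):

```typescript
import { ChatAnthropic } from "@langchain/anthropic";

const chat = new ChatAnthropic({
  model: "claude-3-5-sonnet-20240620", // placeholder model name
  streaming: true,   // stream the results rather than waiting for the full reply
  streamUsage: true, // include token usage data in streamed chunks
  temperature: 0.9,  // closer to 1 for creative, generative output
});

const stream = await chat.stream("Write a two-line poem about the tide.");
for await (const chunk of stream) {
  // chunk.content is a plain string for text-only responses.
  if (typeof chunk.content === "string") process.stdout.write(chunk.content);
  // Usage data, when present, is attached to the chunk (assumed field name).
  if (chunk.usage_metadata) console.log("\nusage:", chunk.usage_metadata);
}
```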