Interface PromptModelInferenceConfiguration.Builder
- All Superinterfaces:
Buildable, CopyableBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>, SdkBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>, SdkPojo
- Enclosing class:
PromptModelInferenceConfiguration
@Mutable
@NotThreadSafe
public static interface PromptModelInferenceConfiguration.Builder
extends SdkPojo, CopyableBuilder<PromptModelInferenceConfiguration.Builder,PromptModelInferenceConfiguration>
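A minimal usage sketch for this builder (assuming the Bedrock Agent model package, software.amazon.awssdk.services.bedrockagent.model; adjust the import for the service actually in use, and treat all values as illustrative only):

    import software.amazon.awssdk.services.bedrockagent.model.PromptModelInferenceConfiguration;

    public class InferenceConfigExample {
        public static void main(String[] args) {
            // Each setter returns the builder, so calls can be chained and
            // finished with build() to produce the immutable configuration.
            PromptModelInferenceConfiguration config =
                    PromptModelInferenceConfiguration.builder()
                            .maxTokens(512)              // cap the length of the response
                            .temperature(0.7f)           // moderate randomness
                            .topP(0.9f)                  // consider the top 90% of candidates
                            .stopSequences("###", "END") // stop generating at these markers
                            .build();

            System.out.println(config);
        }
    }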
-
Method Summary
All methods below return PromptModelInferenceConfiguration.Builder.
- maxTokens(Integer maxTokens): The maximum number of tokens to return in the response.
- stopSequences(String... stopSequences): A list of strings that define sequences after which the model will stop generating.
- stopSequences(Collection<String> stopSequences): A list of strings that define sequences after which the model will stop generating.
- temperature(Float temperature): Controls the randomness of the response.
- topP(Float topP): The percentage of most-likely candidates that the model considers for the next token.
Methods inherited from interface software.amazon.awssdk.utils.builder.CopyableBuilder
copy
Methods inherited from interface software.amazon.awssdk.utils.builder.SdkBuilder
applyMutation, build
Methods inherited from interface software.amazon.awssdk.core.SdkPojo
equalsBySdkFields, sdkFieldNameToField, sdkFields
-
Method Details
-
maxTokens
PromptModelInferenceConfiguration.Builder maxTokens(Integer maxTokens)
The maximum number of tokens to return in the response.
- Parameters:
maxTokens - The maximum number of tokens to return in the response.
- Returns:
Returns a reference to this object so that method calls can be chained together.
-
stopSequences
PromptModelInferenceConfiguration.Builder stopSequences(String... stopSequences)
A list of strings that define sequences after which the model will stop generating.
- Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
- Returns:
Returns a reference to this object so that method calls can be chained together.
-
stopSequences
PromptModelInferenceConfiguration.Builder stopSequences(Collection<String> stopSequences)
A list of strings that define sequences after which the model will stop generating.
- Parameters:
stopSequences - A list of strings that define sequences after which the model will stop generating.
- Returns:
Returns a reference to this object so that method calls can be chained together.
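The two overloads accept the same stop markers in different forms; a short sketch contrasting them (illustrative values, same assumed package as above):

    import java.util.List;
    import software.amazon.awssdk.services.bedrockagent.model.PromptModelInferenceConfiguration;

    public class StopSequencesExample {
        public static void main(String[] args) {
            // Varargs overload: convenient when the stop markers are known up front.
            PromptModelInferenceConfiguration fromVarargs =
                    PromptModelInferenceConfiguration.builder()
                            .stopSequences("Human:", "Assistant:")
                            .build();

            // Collection overload: useful when the markers are assembled at runtime.
            List<String> stops = List.of("Human:", "Assistant:");
            PromptModelInferenceConfiguration fromCollection =
                    PromptModelInferenceConfiguration.builder()
                            .stopSequences(stops)
                            .build();

            System.out.println(fromVarargs.equals(fromCollection)); // true: same configuration
        }
    }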
-
temperature
PromptModelInferenceConfiguration.Builder temperature(Float temperature)
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
- Parameters:
temperature - Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
- Returns:
Returns a reference to this object so that method calls can be chained together.
-
topP
PromptModelInferenceConfiguration.Builder topP(Float topP)
The percentage of most-likely candidates that the model considers for the next token.
- Parameters:
topP - The percentage of most-likely candidates that the model considers for the next token.
- Returns:
Returns a reference to this object so that method calls can be chained together.
-
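Because the builder is a CopyableBuilder, an existing configuration can be turned back into a builder and only selected fields changed; a sketch of that round trip (toBuilder() is the standard AWS SDK model method, values illustrative):

    import software.amazon.awssdk.services.bedrockagent.model.PromptModelInferenceConfiguration;

    public class CopyExample {
        public static void main(String[] args) {
            PromptModelInferenceConfiguration base =
                    PromptModelInferenceConfiguration.builder()
                            .temperature(0.2f)  // mostly predictable output
                            .topP(0.9f)
                            .build();

            // toBuilder() returns a builder pre-populated with the existing values,
            // so only the fields set afterwards differ from the original.
            PromptModelInferenceConfiguration exploratory =
                    base.toBuilder()
                            .temperature(0.9f)  // more varied, "surprising" output
                            .build();

            System.out.println(base);
            System.out.println(exploratory);
        }
    }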