Class GetModelInvocationJobResponse
- All Implemented Interfaces:
SdkPojo, ToCopyableBuilder<GetModelInvocationJobResponse.Builder,GetModelInvocationJobResponse>
-
Nested Class Summary
Nested Classes:
- static interface GetModelInvocationJobResponse.Builder
Method Summary
- static GetModelInvocationJobResponse.Builder builder()
- final String clientRequestToken() – A unique, case-sensitive identifier to ensure that the API request completes no more than one time.
- final Instant endTime() – The time at which the batch inference job ended.
- final boolean equals(Object obj)
- final boolean equalsBySdkFields(Object obj) – Indicates whether some other object is "equal to" this one by SDK fields.
- final <T> Optional<T> getValueForField(String fieldName, Class<T> clazz) – Used to retrieve the value of a field from any class that extends SdkResponse.
- final int hashCode()
- final ModelInvocationJobInputDataConfig inputDataConfig() – Details about the location of the input to the batch inference job.
- final String jobArn() – The Amazon Resource Name (ARN) of the batch inference job.
- final Instant jobExpirationTime() – The time at which the batch inference job is set to time out, or the time at which it timed out.
- final String jobName() – The name of the batch inference job.
- final Instant lastModifiedTime() – The time at which the batch inference job was last modified.
- final String message() – If the batch inference job failed, this field contains a message describing why the job failed.
- final String modelId() – The unique identifier of the foundation model used for model inference.
- final ModelInvocationJobOutputDataConfig outputDataConfig() – Details about the location of the output of the batch inference job.
- final String roleArn() – The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference.
- static Class<? extends GetModelInvocationJobResponse.Builder> serializableBuilderClass()
- final ModelInvocationJobStatus status() – The status of the batch inference job.
- final String statusAsString() – The status of the batch inference job.
- final Instant submitTime() – The time at which the batch inference job was submitted.
- final Integer timeoutDurationInHours() – The number of hours after which the batch inference job was set to time out.
- GetModelInvocationJobResponse.Builder toBuilder() – Take this object and create a builder that contains all of the current property values of this object.
- final String toString() – Returns a string representation of this object.
- final VpcConfig vpcConfig() – The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job.
Methods inherited from class software.amazon.awssdk.services.bedrock.model.BedrockResponse:
responseMetadata
Methods inherited from class software.amazon.awssdk.core.SdkResponse:
sdkHttpResponse
Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder:
copy
-
Method Details
-
jobArn
The Amazon Resource Name (ARN) of the batch inference job.
- Returns:
- The Amazon Resource Name (ARN) of the batch inference job.
-
jobName
The name of the batch inference job.
- Returns:
- The name of the batch inference job.
-
modelId
The unique identifier of the foundation model used for model inference.
- Returns:
- The unique identifier of the foundation model used for model inference.
-
clientRequestToken
A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
- Returns:
- A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
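As a minimal sketch of how this field might be inspected on a retrieved job (the job ARN is a hypothetical placeholder, and the jobIdentifier parameter and the consumer-builder overload of getModelInvocationJob are assumptions about the Bedrock client's shape):

```java
import software.amazon.awssdk.services.bedrock.BedrockClient;
import software.amazon.awssdk.services.bedrock.model.GetModelInvocationJobResponse;

public class ClientRequestTokenExample {
    public static void main(String[] args) {
        // Hypothetical job ARN; substitute the ARN returned when the job was created.
        String jobArn = "arn:aws:bedrock:us-east-1:111122223333:model-invocation-job/example";

        try (BedrockClient bedrock = BedrockClient.create()) {
            GetModelInvocationJobResponse job =
                    bedrock.getModelInvocationJob(r -> r.jobIdentifier(jobArn));

            // The token echoes whatever was supplied (or generated) when the job was created.
            // If a creation request re-uses this token, Amazon Bedrock ignores the repeated
            // request instead of starting another job, and no error is returned.
            System.out.println("clientRequestToken: " + job.clientRequestToken());
        }
    }
}
```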
-
roleArn
The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
- Returns:
- The Amazon Resource Name (ARN) of the service role with permissions to carry out and manage batch inference. You can use the console to create a default service role or follow the steps at Create a service role for batch inference.
-
status
The status of the batch inference job.
The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
If the service returns an enum value that is not available in the current SDK version, status will return ModelInvocationJobStatus.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from statusAsString().
- Returns:
- The status of the batch inference job. The following statuses are possible:
  - Submitted – This job has been submitted to a queue for validation.
  - Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
    - Your IAM service role has access to the Amazon S3 buckets containing your files.
    - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
    - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
  - Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
  - Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
  - InProgress – This job has begun. You can start viewing the results in the output S3 location.
  - Completed – This job has successfully completed. View the output files in the output S3 location.
  - PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
  - Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
  - Stopped – This job was stopped by a user.
  - Stopping – This job is being stopped by a user.
- See Also:
- ModelInvocationJobStatus
-
statusAsString
The status of the batch inference job.
The following statuses are possible:
- Submitted – This job has been submitted to a queue for validation.
- Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
  - Your IAM service role has access to the Amazon S3 buckets containing your files.
  - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
  - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
- Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
- Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
- InProgress – This job has begun. You can start viewing the results in the output S3 location.
- Completed – This job has successfully completed. View the output files in the output S3 location.
- PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
- Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
- Stopped – This job was stopped by a user.
- Stopping – This job is being stopped by a user.
If the service returns an enum value that is not available in the current SDK version, status will return ModelInvocationJobStatus.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from statusAsString().
- Returns:
- The status of the batch inference job. The following statuses are possible:
  - Submitted – This job has been submitted to a queue for validation.
  - Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:
    - Your IAM service role has access to the Amazon S3 buckets containing your files.
    - Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.
    - Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.
  - Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.
  - Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.
  - InProgress – This job has begun. You can start viewing the results in the output S3 location.
  - Completed – This job has successfully completed. View the output files in the output S3 location.
  - PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.
  - Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.
  - Stopped – This job was stopped by a user.
  - Stopping – This job is being stopped by a user.
- See Also:
- ModelInvocationJobStatus
-
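As a hedged illustration of the fallback described above, the hypothetical helper below (not part of the SDK) checks whether a retrieved job has reached a terminal status; the enum constant names assume the SDK's usual UPPER_SNAKE_CASE mapping of the status values listed here.

```java
import software.amazon.awssdk.services.bedrock.model.GetModelInvocationJobResponse;
import software.amazon.awssdk.services.bedrock.model.ModelInvocationJobStatus;

public final class JobStatusHelper {
    /** Returns true if the job has reached a state from which it will not progress further. */
    public static boolean isTerminal(GetModelInvocationJobResponse job) {
        ModelInvocationJobStatus status = job.status();
        if (status == ModelInvocationJobStatus.UNKNOWN_TO_SDK_VERSION) {
            // The service returned a status this SDK version does not model yet;
            // fall back to the raw string for logging or custom handling.
            System.out.println("Unrecognized status: " + job.statusAsString());
            return false;
        }
        switch (status) {
            case COMPLETED:
            case PARTIALLY_COMPLETED:
            case FAILED:
            case STOPPED:
            case EXPIRED:
                return true;
            default:
                return false;
        }
    }

    private JobStatusHelper() {
    }
}
```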
message
If the batch inference job failed, this field contains a message describing why the job failed.
- Returns:
- If the batch inference job failed, this field contains a message describing why the job failed.
-
submitTime
The time at which the batch inference job was submitted.
- Returns:
- The time at which the batch inference job was submitted.
-
lastModifiedTime
The time at which the batch inference job was last modified.
- Returns:
- The time at which the batch inference job was last modified.
-
endTime
The time at which the batch inference job ended.
- Returns:
- The time at which the batch inference job ended.
-
inputDataConfig
Details about the location of the input to the batch inference job.
- Returns:
- Details about the location of the input to the batch inference job.
-
outputDataConfig
Details about the location of the output of the batch inference job.
- Returns:
- Details about the location of the output of the batch inference job.
-
vpcConfig
The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
- Returns:
- The configuration of the Virtual Private Cloud (VPC) for the data in the batch inference job. For more information, see Protect batch inference jobs using a VPC.
-
timeoutDurationInHours
The number of hours after which the batch inference job was set to time out.
- Returns:
- The number of hours after which the batch inference job was set to time out.
-
jobExpirationTime
The time at which the batch inference job is set to time out, or the time at which it timed out.
- Returns:
- The time at which the batch inference job is set to time out, or the time at which it timed out.
-
toBuilder
Description copied from interface: ToCopyableBuilder
Take this object and create a builder that contains all of the current property values of this object.
- Specified by:
toBuilder in interface ToCopyableBuilder<GetModelInvocationJobResponse.Builder,GetModelInvocationJobResponse>
- Specified by:
toBuilder in class AwsResponse
- Returns:
- a builder for type T
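A short sketch of the copy-and-modify pattern this enables; the jobName override is purely illustrative and assumes the builder exposes setters mirroring the accessors on this class.

```java
import software.amazon.awssdk.services.bedrock.model.GetModelInvocationJobResponse;

public final class ToBuilderExample {
    /**
     * toBuilder() seeds a builder with every current property value, so a modified
     * copy can be created without re-specifying the fields that stay the same.
     */
    public static GetModelInvocationJobResponse withJobName(GetModelInvocationJobResponse original,
                                                            String newName) {
        return original.toBuilder()
                .jobName(newName) // only this field differs from the original
                .build();
    }

    private ToBuilderExample() {
    }
}
```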
-
builder
-
serializableBuilderClass
-
hashCode
public final int hashCode()
- Overrides:
hashCode in class AwsResponse
-
equals
- Overrides:
equals in class AwsResponse
-
equalsBySdkFields
Description copied from interface: SdkPojo
Indicates whether some other object is "equal to" this one by SDK fields. An SDK field is a modeled, non-inherited field in an SdkPojo class, and is generated based on a service model.
If an SdkPojo class does not have any inherited fields, equalsBySdkFields and equals are essentially the same.
- Specified by:
equalsBySdkFields in interface SdkPojo
- Parameters:
obj - the object to be compared with
- Returns:
- true if the other object is equal to this object by SDK fields, false otherwise.
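A brief sketch of where this is useful: comparing two responses by their modeled fields only. Fields inherited from the response base classes (such as response metadata) are not SDK fields of this class, so they do not affect the result.

```java
import software.amazon.awssdk.services.bedrock.model.GetModelInvocationJobResponse;

public final class SdkFieldEqualityExample {
    /**
     * Compares two GetModelInvocationJob responses by their modeled fields only,
     * so two calls that return the same job state compare as equal even if
     * non-modeled, inherited state differs between them.
     */
    public static boolean sameJobState(GetModelInvocationJobResponse a,
                                       GetModelInvocationJobResponse b) {
        return a.equalsBySdkFields(b);
    }

    private SdkFieldEqualityExample() {
    }
}
```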
-
toString
-
getValueForField
Description copied from class: SdkResponse
Used to retrieve the value of a field from any class that extends SdkResponse. The field name specified should match the member name from the corresponding service-2.json model specified in the codegen-resources folder for a given service. The class specifies what class to cast the returned value to. If the returned value is also a modeled class, the SdkResponse.getValueForField(String, Class) method will again be available.
- Overrides:
getValueForField in class SdkResponse
- Parameters:
fieldName - The name of the member to be retrieved.
clazz - The class to cast the returned object to.
- Returns:
- Optional containing the casted return value
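A small sketch of this reflective-style access; the member name "jobArn" is assumed to match the accessor name, following the service-model naming described above.

```java
import java.util.Optional;

import software.amazon.awssdk.services.bedrock.model.GetModelInvocationJobResponse;

public final class GetValueForFieldExample {
    /**
     * Looks a member up by its modeled name, which is useful in generic code that
     * does not know the concrete response type at compile time.
     */
    public static Optional<String> jobArnOf(GetModelInvocationJobResponse response) {
        return response.getValueForField("jobArn", String.class);
    }

    private GetValueForFieldExample() {
    }
}
```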
-
sdkFields
-
sdkFieldNameToField
- Specified by:
sdkFieldNameToField in interface SdkPojo
- Returns:
- The mapping between the field name and its corresponding field.
-