Interface ListModelInvocationJobsRequest.Builder

  • Method Details

    • submitTimeAfter

      ListModelInvocationJobsRequest.Builder submitTimeAfter(Instant submitTimeAfter)

      Specify a time to filter for batch inference jobs that were submitted after the time you specify.

      Parameters:
      submitTimeAfter - Specify a time to filter for batch inference jobs that were submitted after the time you specify.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • submitTimeBefore

      ListModelInvocationJobsRequest.Builder submitTimeBefore(Instant submitTimeBefore)

      Specify a time to filter for batch inference jobs that were submitted before the time you specify.

      Parameters:
      submitTimeBefore - Specify a time to filter for batch inference jobs that were submitted before the time you specify.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
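      For example, a minimal sketch of how the two submit-time filters might be combined on a single request. The class name, the client construction, and the seven-day window are illustrative assumptions, not part of this interface:

        import java.time.Duration;
        import java.time.Instant;
        import software.amazon.awssdk.services.bedrock.BedrockClient;
        import software.amazon.awssdk.services.bedrock.model.ListModelInvocationJobsRequest;
        import software.amazon.awssdk.services.bedrock.model.ListModelInvocationJobsResponse;

        public class ListRecentBatchJobs {
            public static void main(String[] args) {
                // List batch inference jobs submitted within the last seven days.
                try (BedrockClient bedrock = BedrockClient.create()) {
                    ListModelInvocationJobsRequest request = ListModelInvocationJobsRequest.builder()
                            .submitTimeAfter(Instant.now().minus(Duration.ofDays(7)))
                            .submitTimeBefore(Instant.now())
                            .build();
                    ListModelInvocationJobsResponse response = bedrock.listModelInvocationJobs(request);
                    System.out.println(response);
                }
            }
        }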
    • statusEquals

      ListModelInvocationJobsRequest.Builder statusEquals(String statusEquals)

      Specify a status to filter for batch inference jobs whose statuses match the string you specify.

      The following statuses are possible:

      • Submitted – This job has been submitted to a queue for validation.

      • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

        • Your IAM service role has access to the Amazon S3 buckets containing your files.

        • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

        • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

      • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

      • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

      • InProgress – This job has begun. You can start viewing the results in the output S3 location.

      • Completed – This job has successfully completed. View the output files in the output S3 location.

      • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

      • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

      • Stopped – This job was stopped by a user.

      • Stopping – This job is being stopped by a user.

      Parameters:
      statusEquals - Specify a status to filter for batch inference jobs whose statuses match the string you specify.

      The following statuses are possible:

      • Submitted – This job has been submitted to a queue for validation.

      • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

        • Your IAM service role has access to the Amazon S3 buckets containing your files.

        • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

        • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

      • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

      • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

      • InProgress – This job has begun. You can start viewing the results in the output S3 location.

      • Completed – This job has successfully completed. View the output files in the output S3 location.

      • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

      • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

      • Stopped – This job was stopped by a user.

      • Stopping – This job is being stopped by a user.

      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
    • statusEquals

      ListModelInvocationJobsRequest.Builder statusEquals(ModelInvocationJobStatus statusEquals)

      Specify a status to filter for batch inference jobs whose statuses match the string you specify.

      The following statuses are possible:

      • Submitted – This job has been submitted to a queue for validation.

      • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

        • Your IAM service role has access to the Amazon S3 buckets containing your files.

        • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

        • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

      • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

      • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

      • InProgress – This job has begun. You can start viewing the results in the output S3 location.

      • Completed – This job has successfully completed. View the output files in the output S3 location.

      • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

      • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

      • Stopped – This job was stopped by a user.

      • Stopping – This job is being stopped by a user.

      Parameters:
      statusEquals - Specify a status to filter for batch inference jobs whose statuses match the string you specify.

      The following statuses are possible:

      • Submitted – This job has been submitted to a queue for validation.

      • Validating – This job is being validated for the requirements described in Format and upload your batch inference data. The criteria include the following:

        • Your IAM service role has access to the Amazon S3 buckets containing your files.

        • Your files are .jsonl files and each individual record is a JSON object in the correct format. Note that validation doesn't check if the modelInput value matches the request body for the model.

        • Your files fulfill the requirements for file size and number of records. For more information, see Quotas for Amazon Bedrock.

      • Scheduled – This job has been validated and is now in a queue. The job will automatically start when it reaches its turn.

      • Expired – This job timed out because it was scheduled but didn't begin before the set timeout duration. Submit a new job request.

      • InProgress – This job has begun. You can start viewing the results in the output S3 location.

      • Completed – This job has successfully completed. View the output files in the output S3 location.

      • PartiallyCompleted – This job has partially completed. Not all of your records could be processed in time. View the output files in the output S3 location.

      • Failed – This job has failed. Check the failure message for any further details. For further assistance, reach out to the Amazon Web Services Support Center.

      • Stopped – This job was stopped by a user.

      • Stopping – This job is being stopped by a user.

      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
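      As a sketch, a status filter might look like the following. It assumes a configured BedrockClient named bedrock, as in the earlier example, and the typed overload shown second assumes the ModelInvocationJobStatus enum that the SDK's paired String/enum setters suggest:

        // Filter using the status string; valid values are the statuses listed above.
        ListModelInvocationJobsRequest inProgress = ListModelInvocationJobsRequest.builder()
                .statusEquals("InProgress")
                .build();
        bedrock.listModelInvocationJobs(inProgress);

        // The same style of filter through the typed overload (ModelInvocationJobStatus is assumed here).
        ListModelInvocationJobsRequest completed = ListModelInvocationJobsRequest.builder()
                .statusEquals(ModelInvocationJobStatus.COMPLETED)
                .build();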
    • nameContains

      ListModelInvocationJobsRequest.Builder nameContains(String nameContains)

      Specify a string to filter for batch inference jobs whose names contain the string.

      Parameters:
      nameContains - Specify a string to filter for batch inference jobs whose names contain the string.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
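      A short sketch; the substring "nightly-batch" is a hypothetical job-name fragment:

        // Match any batch inference job whose name contains the given substring.
        ListModelInvocationJobsRequest byName = ListModelInvocationJobsRequest.builder()
                .nameContains("nightly-batch")
                .build();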
    • maxResults

      ListModelInvocationJobsRequest.Builder maxResults(Integer maxResults)

      The maximum number of results to return. If there are more results than the number that you specify, a nextToken value is returned. Use the nextToken in a request to return the next batch of results.

      Parameters:
      maxResults - The maximum number of results to return. If there are more results than the number that you specify, a nextToken value is returned. Use the nextToken in a request to return the next batch of results.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • nextToken

      ListModelInvocationJobsRequest.Builder nextToken(String nextToken)

      If a previous ListModelInvocationJobs request returned more results than the value you specified in the maxResults field, the response includes a nextToken value. To see the next batch of results, send that nextToken value in another request.

      Parameters:
      nextToken - If a previous ListModelInvocationJobs request returned more results than the value you specified in the maxResults field, the response includes a nextToken value. To see the next batch of results, send that nextToken value in another request.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
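      A sketch of manual pagination that combines maxResults and nextToken; the page size of 50 and the processing step are placeholders, and a configured BedrockClient named bedrock is assumed as before:

        String nextToken = null;
        do {
            ListModelInvocationJobsRequest.Builder page = ListModelInvocationJobsRequest.builder()
                    .maxResults(50);                      // placeholder page size
            if (nextToken != null) {
                page.nextToken(nextToken);                // continue from the previous response
            }
            ListModelInvocationJobsResponse result = bedrock.listModelInvocationJobs(page.build());
            // ... process the jobs in this page ...
            nextToken = result.nextToken();               // null when there are no more results
        } while (nextToken != null);

      The SDK also generates paginator helpers for list operations, which can usually replace a loop like this one.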
    • sortBy

      ListModelInvocationJobsRequest.Builder sortBy(String sortBy)

      An attribute by which to sort the results.

      Parameters:
      sortBy - An attribute by which to sort the results.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
    • sortBy

      ListModelInvocationJobsRequest.Builder sortBy(SortJobsBy sortBy)

      An attribute by which to sort the results.

      Parameters:
      sortBy - An attribute by which to sort the results.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
    • sortOrder

      ListModelInvocationJobsRequest.Builder sortOrder(String sortOrder)

      Specifies whether to sort the results by ascending or descending order.

      Parameters:
      sortOrder - Specifies whether to sort the results by ascending or descending order.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
    • sortOrder

      ListModelInvocationJobsRequest.Builder sortOrder(SortOrder sortOrder)

      Specifies whether to sort the results by ascending or descending order.

      Parameters:
      sortOrder - Specifies whether to sort the results by ascending or descending order.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also:
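      A sketch combining the two sort settings; the values "CreationTime" and "Descending" are assumptions about the strings these fields accept:

        // Sort so that the most recently created jobs come first.
        ListModelInvocationJobsRequest newestFirst = ListModelInvocationJobsRequest.builder()
                .sortBy("CreationTime")
                .sortOrder("Descending")
                .build();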
    • overrideConfiguration

      ListModelInvocationJobsRequest.Builder overrideConfiguration(AwsRequestOverrideConfiguration overrideConfiguration)
      Description copied from interface: AwsRequest.Builder
      Add an optional request override configuration.
      Specified by:
      overrideConfiguration in interface AwsRequest.Builder
      Parameters:
      overrideConfiguration - The override configuration.
      Returns:
      This object for method chaining.
    • overrideConfiguration

      ListModelInvocationJobsRequest.Builder overrideConfiguration(Consumer<AwsRequestOverrideConfiguration.Builder> builderConsumer)

      Description copied from interface: AwsRequest.Builder
      Add an optional request override configuration.
      Specified by:
      overrideConfiguration in interface AwsRequest.Builder
      Parameters:
      builderConsumer - A Consumer to which an empty AwsRequestOverrideConfiguration.Builder will be given.
      Returns:
      This object for method chaining.
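      A sketch of the Consumer-based overload, using a per-request API call timeout as an illustrative override; java.time.Duration and a configured BedrockClient named bedrock are assumed as before:

        // Apply a tighter timeout to this one request without changing the client's defaults.
        ListModelInvocationJobsRequest withTimeout = ListModelInvocationJobsRequest.builder()
                .statusEquals("Completed")
                .overrideConfiguration(o -> o.apiCallTimeout(Duration.ofSeconds(30)))
                .build();
        bedrock.listModelInvocationJobs(withTimeout);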