Interface BatchInferenceJob.Builder

  • Method Details

    • jobName

      BatchInferenceJob.Builder jobName(String jobName)

      The name of the batch inference job.

      Parameters:
      jobName - The name of the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
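      As a hedged illustration of the chaining described above, the sketch below builds a BatchInferenceJob from the SDK's standard static builder() factory; the job name and ARNs are hypothetical placeholders.

      BatchInferenceJob job = BatchInferenceJob.builder()
              .jobName("my-batch-job")                       // hypothetical name
              .batchInferenceJobArn("arn:aws:personalize:us-east-1:123456789012:batch-inference-job/my-batch-job")
              .solutionVersionArn("arn:aws:personalize:us-east-1:123456789012:solution/my-solution/1")
              .numResults(25)                                // numResults as documented below
              .build();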
    • batchInferenceJobArn

      BatchInferenceJob.Builder batchInferenceJobArn(String batchInferenceJobArn)

      The Amazon Resource Name (ARN) of the batch inference job.

      Parameters:
      batchInferenceJobArn - The Amazon Resource Name (ARN) of the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • filterArn

      BatchInferenceJob.Builder filterArn(String filterArn)

      The ARN of the filter used on the batch inference job.

      Parameters:
      filterArn - The ARN of the filter used on the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • failureReason

      BatchInferenceJob.Builder failureReason(String failureReason)

      If the batch inference job failed, the reason for the failure.

      Parameters:
      failureReason - If the batch inference job failed, the reason for the failure.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • solutionVersionArn

      BatchInferenceJob.Builder solutionVersionArn(String solutionVersionArn)

      The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.

      Parameters:
      solutionVersionArn - The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • numResults

      BatchInferenceJob.Builder numResults(Integer numResults)

      The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.

      Parameters:
      numResults - The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • jobInput

      BatchInferenceJob.Builder jobInput(BatchInferenceJobInput jobInput)

      The Amazon S3 path that leads to the input data used to generate the batch inference job.

      Parameters:
      jobInput - The Amazon S3 path that leads to the input data used to generate the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • jobInput

      default BatchInferenceJob.Builder jobInput(Consumer<BatchInferenceJobInput.Builder> jobInput)

      The Amazon S3 path that leads to the input data used to generate the batch inference job.

      This is a convenience method that creates an instance of the BatchInferenceJobInput.Builder, avoiding the need to create one manually via BatchInferenceJobInput.builder().

      When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to jobInput(BatchInferenceJobInput).

      Parameters:
      jobInput - a consumer that will call methods on BatchInferenceJobInput.Builder
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: jobInput(BatchInferenceJobInput)
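      For illustration, the two hedged sketches below are equivalent ways to supply jobInput (builder is a BatchInferenceJob.Builder instance); they assume BatchInferenceJobInput exposes an s3DataSource member of type S3DataConfig with a path, and the S3 location is a hypothetical placeholder.

      // Explicit builder style
      builder.jobInput(BatchInferenceJobInput.builder()
              .s3DataSource(S3DataConfig.builder()
                      .path("s3://amzn-s3-demo-bucket/batch-input/users.json")   // hypothetical location
                      .build())
              .build());

      // Consumer style: the SDK creates the BatchInferenceJobInput.Builder and calls build() for you
      builder.jobInput(input -> input.s3DataSource(
              s3 -> s3.path("s3://amzn-s3-demo-bucket/batch-input/users.json")));

      The same Consumer-style shortcut applies to the other convenience overloads on this builder (jobOutput, batchInferenceJobConfig, and themeGenerationConfig below).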
    • jobOutput

      BatchInferenceJob.Builder jobOutput(BatchInferenceJobOutput jobOutput)

      The Amazon S3 bucket that contains the output data generated by the batch inference job.

      Parameters:
      jobOutput - The Amazon S3 bucket that contains the output data generated by the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • jobOutput

      default BatchInferenceJob.Builder jobOutput(Consumer<BatchInferenceJobOutput.Builder> jobOutput)

      The Amazon S3 bucket that contains the output data generated by the batch inference job.

      This is a convenience method that creates an instance of the BatchInferenceJobOutput.Builder, avoiding the need to create one manually via BatchInferenceJobOutput.builder().

      When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to jobOutput(BatchInferenceJobOutput).

      Parameters:
      jobOutput - a consumer that will call methods on BatchInferenceJobOutput.Builder
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: jobOutput(BatchInferenceJobOutput)
    • batchInferenceJobConfig

      BatchInferenceJob.Builder batchInferenceJobConfig(BatchInferenceJobConfig batchInferenceJobConfig)

      A string to string map of the configuration details of a batch inference job.

      Parameters:
      batchInferenceJobConfig - A string to string map of the configuration details of a batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • batchInferenceJobConfig

      default BatchInferenceJob.Builder batchInferenceJobConfig(Consumer<BatchInferenceJobConfig.Builder> batchInferenceJobConfig)

      A string to string map of the configuration details of a batch inference job.

      This is a convenience method that creates an instance of the BatchInferenceJobConfig.Builder, avoiding the need to create one manually via BatchInferenceJobConfig.builder().

      When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to batchInferenceJobConfig(BatchInferenceJobConfig).

      Parameters:
      batchInferenceJobConfig - a consumer that will call methods on BatchInferenceJobConfig.Builder
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: batchInferenceJobConfig(BatchInferenceJobConfig)
    • roleArn

      BatchInferenceJob.Builder roleArn(String roleArn)

      The ARN of the Amazon Identity and Access Management (IAM) role that requested the batch inference job.

      Parameters:
      roleArn - The ARN of the Amazon Identity and Access Management (IAM) role that requested the batch inference job.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • batchInferenceJobMode

      BatchInferenceJob.Builder batchInferenceJobMode(String batchInferenceJobMode)

      The job's mode.

      Parameters:
      batchInferenceJobMode - The job's mode.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: BatchInferenceJobMode
    • batchInferenceJobMode

      BatchInferenceJob.Builder batchInferenceJobMode(BatchInferenceJobMode batchInferenceJobMode)

      The job's mode.

      Parameters:
      batchInferenceJobMode - The job's mode.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: BatchInferenceJobMode
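      A brief sketch of the two overloads above (builder is a BatchInferenceJob.Builder instance); it assumes the BatchInferenceJobMode enum defines a THEME_GENERATION constant, which the theme generation settings below suggest.

      // Raw string value: values not modeled by this SDK version are passed through as-is
      builder.batchInferenceJobMode("THEME_GENERATION");

      // Type-safe enum overload (assumed constant name)
      builder.batchInferenceJobMode(BatchInferenceJobMode.THEME_GENERATION);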
    • themeGenerationConfig

      BatchInferenceJob.Builder themeGenerationConfig(ThemeGenerationConfig themeGenerationConfig)

      The job's theme generation settings.

      Parameters:
      themeGenerationConfig - The job's theme generation settings.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • themeGenerationConfig

      default BatchInferenceJob.Builder themeGenerationConfig(Consumer<ThemeGenerationConfig.Builder> themeGenerationConfig)

      The job's theme generation settings.

      This is a convenience method that creates an instance of the ThemeGenerationConfig.Builder, avoiding the need to create one manually via ThemeGenerationConfig.builder().

      When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to themeGenerationConfig(ThemeGenerationConfig).

      Parameters:
      themeGenerationConfig - a consumer that will call methods on ThemeGenerationConfig.Builder
      Returns:
      Returns a reference to this object so that method calls can be chained together.
      See Also: themeGenerationConfig(ThemeGenerationConfig)
    • status

      BatchInferenceJob.Builder status(String status)

      The status of the batch inference job. The status is one of the following values:

      • PENDING

      • IN PROGRESS

      • ACTIVE

      • CREATE FAILED

      Parameters:
      status - The status of the batch inference job. The status is one of the following values:

      • PENDING

      • IN PROGRESS

      • ACTIVE

      • CREATE FAILED

      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • creationDateTime

      BatchInferenceJob.Builder creationDateTime(Instant creationDateTime)

      The time at which the batch inference job was created.

      Parameters:
      creationDateTime - The time at which the batch inference job was created.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • lastUpdatedDateTime

      BatchInferenceJob.Builder lastUpdatedDateTime(Instant lastUpdatedDateTime)

      The time at which the batch inference job was last updated.

      Parameters:
      lastUpdatedDateTime - The time at which the batch inference job was last updated.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
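      Rounding out the date fields, a minimal sketch using java.time.Instant, which both setters above accept; the status value comes from the list documented above, and the timestamps are illustrative.

      BatchInferenceJob job = BatchInferenceJob.builder()
              .jobName("my-batch-job")
              .status("ACTIVE")
              .creationDateTime(Instant.parse("2024-01-15T10:00:00Z"))   // illustrative creation time
              .lastUpdatedDateTime(Instant.now())                        // illustrative last-update time
              .build();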