DatasetExportJob

Describes a job that exports a dataset to an Amazon S3 bucket. For more information, see CreateDatasetExportJob.

A dataset export job can be in one of the following states:

  • CREATE PENDING > CREATE IN_PROGRESS > ACTIVE -or- CREATE FAILED
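For orientation, the following Kotlin sketch builds an instance through the Builder DSL listed under Types below. It assumes the usual AWS SDK for Kotlin conventions (a companion invoke operator taking a Builder lambda) plus Personalize property and enum names such as jobName, datasetArn, roleArn, and IngestionMode; the ARNs and values are hypothetical placeholders, and in practice the SDK populates this type for you in DescribeDatasetExportJob responses.

import aws.sdk.kotlin.services.personalize.model.DatasetExportJob
import aws.sdk.kotlin.services.personalize.model.IngestionMode

// Sketch only: builds a DatasetExportJob locally via the companion's DSL builder.
// Property and enum names are assumed from the Amazon Personalize API shape;
// the ARNs and the job name are hypothetical placeholders.
fun main() {
    val job = DatasetExportJob {
        jobName = "export-interactions"
        datasetArn = "arn:aws:personalize:us-east-1:111122223333:dataset/movies/INTERACTIONS"
        roleArn = "arn:aws:iam::111122223333:role/PersonalizeS3ExportRole"
        ingestionMode = IngestionMode.All   // export both BULK and PUT data
        status = "CREATE PENDING"
    }
    println(job)   // value-style toString(); see Functions below
}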

Types

class Builder
object Companion

Properties

creationDateTime

The creation date and time (in Unix time) of the dataset export job.

datasetArn

The Amazon Resource Name (ARN) of the dataset to export.

datasetExportJobArn

The Amazon Resource Name (ARN) of the dataset export job.

failureReason

If a dataset export job fails, provides the reason why.

ingestionMode

The data to export, based on how you imported the data. You can choose to export BULK data that you imported using a dataset import job, PUT data that you imported incrementally (using the console, PutEvents, PutUsers and PutItems operations), or ALL for both types. The default value is PUT.

jobName

The name of the export job.

jobOutput

The path to the Amazon S3 bucket where the job's output is stored.

lastUpdatedDateTime

The date and time (in Unix time) the status of the dataset export job was last updated.

roleArn

The Amazon Resource Name (ARN) of the IAM service role that has permissions to add data to your output Amazon S3 bucket.

status

The status of the dataset export job (a polling sketch follows this list).
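As a usage sketch for the properties above, the snippet below polls a dataset export job until it is ACTIVE or CREATE FAILED and then reads lastUpdatedDateTime, jobOutput, and failureReason. It assumes the standard AWS SDK for Kotlin request/response shapes for DescribeDatasetExportJob and the Personalize output layout (an s3DataDestination path inside jobOutput); the 30-second wait is an arbitrary choice, and the caller supplies the client and job ARN.

import aws.sdk.kotlin.services.personalize.PersonalizeClient
import aws.sdk.kotlin.services.personalize.model.DescribeDatasetExportJobRequest
import kotlinx.coroutines.delay

// Sketch only: waits for an export job to finish and prints where the output went.
// Field names (status, failureReason, jobOutput.s3DataDestination.path) are assumed
// from the Amazon Personalize API shape.
suspend fun waitForExport(personalize: PersonalizeClient, jobArn: String) {
    val request = DescribeDatasetExportJobRequest { datasetExportJobArn = jobArn }
    while (true) {
        val job = personalize.describeDatasetExportJob(request).datasetExportJob
            ?: error("response contained no dataset export job")
        when (job.status) {
            "ACTIVE" -> {
                println("Export finished at ${job.lastUpdatedDateTime}")
                println("Output path: ${job.jobOutput?.s3DataDestination?.path}")
                return
            }
            "CREATE FAILED" -> error("Export failed: ${job.failureReason}")
            else -> delay(30_000)   // CREATE PENDING or CREATE IN_PROGRESS
        }
    }
}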

Functions

inline fun copy(block: DatasetExportJob.Builder.() -> Unit = {}): DatasetExportJob
open operator override fun equals(other: Any?): Boolean
open override fun hashCode(): Int
open override fun toString(): String
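To round out the functions above, here is a small sketch of copy together with the value-based equals and toString. The builder-DSL construction and property names are assumptions carried over from the earlier sketches, and the job names are hypothetical.

import aws.sdk.kotlin.services.personalize.model.DatasetExportJob

fun main() {
    // Sketch only: start from a minimal instance with hypothetical values, then
    // derive a modified copy; the original is immutable and stays unchanged.
    val original = DatasetExportJob {
        jobName = "export-interactions"
        status = "ACTIVE"
    }
    val renamed = original.copy { jobName = "export-interactions-v2" }

    // equals()/hashCode() compare by value, so only the overridden property differs.
    check(renamed != original)
    check(renamed.status == original.status)
    println(renamed)   // toString() renders the property values
}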