Class KafkaSettings
- All Implemented Interfaces:
Serializable, SdkPojo, ToCopyableBuilder&lt;KafkaSettings.Builder, KafkaSettings&gt;
Provides information that describes an Apache Kafka endpoint. This information includes the output format of records applied to the endpoint and details of transaction and control table data information.
-
Nested Class Summary
-
Method Summary
- final String broker()
  A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance.
- static KafkaSettings.Builder builder()
- final boolean equals(Object obj)
- final boolean equalsBySdkFields(Object obj)
  Indicates whether some other object is "equal to" this one by SDK fields.
- final &lt;T&gt; Optional&lt;T&gt; getValueForField(String fieldName, Class&lt;T&gt; clazz)
- final int hashCode()
- final Boolean includeControlDetails()
  Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output.
- final Boolean includeNullAndEmpty()
  Include NULL and empty columns for records migrated to the endpoint.
- final Boolean includePartitionValue()
  Shows the partition value within the Kafka message output unless the partition type is schema-table-type.
- final Boolean includeTableAlterOperations()
  Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column.
- final Boolean includeTransactionDetails()
  Provides detailed transaction information from the source database.
- final MessageFormatValue messageFormat()
  The output format for the records created on the endpoint.
- final String messageFormatAsString()
  The output format for the records created on the endpoint.
- final Integer messageMaxBytes()
  The maximum size in bytes for records created on the endpoint. The default is 1,000,000.
- final Boolean noHexPrefix()
  Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format.
- final Boolean partitionIncludeSchemaTable()
  Prefixes schema and table names to partition values, when the partition type is primary-key-type.
- final KafkaSaslMechanism saslMechanism()
  For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default.
- final String saslMechanismAsString()
  For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default.
- final String saslPassword()
  The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
- final String saslUsername()
  The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
- final KafkaSecurityProtocol securityProtocol()
  Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS).
- final String securityProtocolAsString()
  Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS).
- static Class&lt;? extends KafkaSettings.Builder&gt; serializableBuilderClass()
- final String sslCaCertificateArn()
  The Amazon Resource Name (ARN) for the private certificate authority (CA) certificate that DMS uses to securely connect to your Kafka target endpoint.
- final String sslClientCertificateArn()
  The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.
- final String sslClientKeyArn()
  The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.
- final String sslClientKeyPassword()
  The password for the client private key used to securely connect to a Kafka target endpoint.
- final KafkaSslEndpointIdentificationAlgorithm sslEndpointIdentificationAlgorithm()
  Sets hostname verification for the certificate.
- final String sslEndpointIdentificationAlgorithmAsString()
  Sets hostname verification for the certificate.
- final KafkaSettings.Builder toBuilder()
  Take this object and create a builder that contains all of the current property values of this object.
- final String topic()
  The topic to which you migrate the data.
- final String toString()
  Returns a string representation of this object.

Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder:
- copy
-
Method Details
-
broker
A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.
- Returns:
- A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.
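The getters on this page each correspond to a setter on KafkaSettings.Builder. A minimal construction sketch, assuming the AWS SDK for Java v2 databasemigration module is on the classpath (the broker address and topic name below are placeholders, not values from this page):

```java
import software.amazon.awssdk.services.databasemigration.model.KafkaSettings;
import software.amazon.awssdk.services.databasemigration.model.MessageFormatValue;

public class KafkaSettingsExample {
    public static void main(String[] args) {
        // KafkaSettings is immutable; all configuration goes through the builder.
        KafkaSettings settings = KafkaSettings.builder()
                .broker("ec2-12-345-678-901.compute-1.amazonaws.com:2345")
                .topic("dms-migration-topic")
                .messageFormat(MessageFormatValue.JSON)
                .build();

        System.out.println(settings.broker());
        System.out.println(settings.topic());
    }
}
```

The built object is typically passed to an endpoint request (for example, CreateEndpointRequest.builder().kafkaSettings(settings)) rather than used on its own.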
-
topic
The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
- Returns:
- The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.
-
messageFormat
The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
If the service returns an enum value that is not available in the current SDK version, messageFormat will return MessageFormatValue.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from messageFormatAsString().
- Returns:
- The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
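The UNKNOWN_TO_SDK_VERSION fallback described above can be illustrated with a simplified stand-in enum; this is not the SDK's generated code, only a sketch of the pattern it follows:

```java
import java.util.Arrays;

enum MessageFormat {
    JSON("json"), JSON_UNFORMATTED("json-unformatted"), UNKNOWN_TO_SDK_VERSION(null);

    private final String raw;
    MessageFormat(String raw) { this.raw = raw; }

    // Mirrors the SDK pattern: an unrecognized raw value maps to the sentinel
    // constant instead of throwing, so an older SDK tolerates values the
    // service added after the SDK was released.
    static MessageFormat fromValue(String value) {
        return Arrays.stream(values())
                .filter(m -> m.raw != null && m.raw.equals(value))
                .findFirst()
                .orElse(UNKNOWN_TO_SDK_VERSION);
    }
}

public class EnumFallbackDemo {
    public static void main(String[] args) {
        System.out.println(MessageFormat.fromValue("json"));         // JSON
        System.out.println(MessageFormat.fromValue("avro-compact")); // UNKNOWN_TO_SDK_VERSION
    }
}
```

This is why the AsString variants exist: when the enum getter returns the sentinel, the raw service value is still recoverable from the string accessor.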
-
messageFormatAsString
The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
If the service returns an enum value that is not available in the current SDK version, messageFormat will return MessageFormatValue.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from messageFormatAsString().
- Returns:
- The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).
-
includeTransactionDetails
Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.
- Returns:
- Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.
-
includePartitionValue
Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.
- Returns:
- Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.
-
partitionIncludeSchemaTable
Prefixes schema and table names to partition values, when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.
- Returns:
- Prefixes schema and table names to partition values, when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.
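The distribution effect described above can be sketched in plain Java. This is illustrative only: Kafka's default partitioner hashes keys with murmur2, not String.hashCode, and the schema/table names below are made up:

```java
import java.util.HashSet;
import java.util.Set;

public class PartitionSpreadDemo {
    // Illustrative stand-in for a hash partitioner: map a record key to one
    // of `partitions` buckets.
    static int partitionFor(String key, int partitions) {
        return Math.floorMod(key.hashCode(), partitions);
    }

    public static void main(String[] args) {
        int partitions = 16;
        Set<Integer> bare = new HashSet<>();
        Set<Integer> prefixed = new HashSet<>();
        for (int t = 1; t <= 100; t++) {
            String pk = "42"; // the same primary-key value occurs in every table
            bare.add(partitionFor(pk, partitions));
            prefixed.add(partitionFor("sbtest" + t + ".table" + t + "." + pk, partitions));
        }
        // Without the schema/table prefix, every record carries the same key
        // and lands on a single partition; prefixing spreads them out.
        System.out.println("bare keys hit " + bare.size() + " partition(s)");
        System.out.println("prefixed keys hit " + prefixed.size() + " partition(s)");
    }
}
```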
-
includeTableAlterOperations
Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.
- Returns:
- Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.
-
includeControlDetails
Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.
- Returns:
- Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.
-
messageMaxBytes
The maximum size in bytes for records created on the endpoint. The default is 1,000,000.
- Returns:
- The maximum size in bytes for records created on the endpoint. The default is 1,000,000.
-
includeNullAndEmpty
Include NULL and empty columns for records migrated to the endpoint. The default is false.
- Returns:
- Include NULL and empty columns for records migrated to the endpoint. The default is false.
-
securityProtocol
Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
If the service returns an enum value that is not available in the current SDK version, securityProtocol will return KafkaSecurityProtocol.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from securityProtocolAsString().
- Returns:
- Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
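A sketch of configuring sasl-ssl with its required companion settings, assuming the databasemigration module is on the classpath (the user name and password are placeholders; real credentials should come from a secrets store, never source code):

```java
import software.amazon.awssdk.services.databasemigration.model.KafkaSaslMechanism;
import software.amazon.awssdk.services.databasemigration.model.KafkaSecurityProtocol;
import software.amazon.awssdk.services.databasemigration.model.KafkaSettings;

public class KafkaSaslSslExample {
    public static void main(String[] args) {
        // sasl-ssl requires SaslUsername and SaslPassword to be set.
        KafkaSettings settings = KafkaSettings.builder()
                .securityProtocol(KafkaSecurityProtocol.SASL_SSL)
                .saslUsername("dms-user")
                .saslPassword("example-password") // placeholder credential
                .saslMechanism(KafkaSaslMechanism.SCRAM_SHA_512)
                .build();

        System.out.println(settings.securityProtocolAsString());
    }
}
```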
-
securityProtocolAsString
Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
If the service returns an enum value that is not available in the current SDK version, securityProtocol will return KafkaSecurityProtocol.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from securityProtocolAsString().
- Returns:
- Set secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
-
sslClientCertificateArn
The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.
- Returns:
- The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.
-
sslClientKeyArn
The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.
- Returns:
- The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.
-
sslClientKeyPassword
The password for the client private key used to securely connect to a Kafka target endpoint.
- Returns:
- The password for the client private key used to securely connect to a Kafka target endpoint.
-
sslCaCertificateArn
The Amazon Resource Name (ARN) for the private certificate authority (CA) certificate that DMS uses to securely connect to your Kafka target endpoint.
- Returns:
- The Amazon Resource Name (ARN) for the private certificate authority (CA) certificate that DMS uses to securely connect to your Kafka target endpoint.
-
saslUsername
The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
- Returns:
- The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
-
saslPassword
The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
- Returns:
- The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.
-
noHexPrefix
Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.
- Returns:
- Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.
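The formatting difference this flag controls can be sketched locally; this is an illustration of the '0x' prefix behavior, not DMS's actual serialization code:

```java
public class HexPrefixDemo {
    // Render raw bytes as hex, with or without the '0x' prefix that
    // NoHexPrefix suppresses on the DMS side.
    static String toHex(byte[] raw, boolean noHexPrefix) {
        StringBuilder sb = new StringBuilder(noHexPrefix ? "" : "0x");
        for (byte b : raw) {
            sb.append(String.format("%02x", b & 0xff));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] raw = {(byte) 0xDE, (byte) 0xAD};
        System.out.println(toHex(raw, false)); // 0xdead
        System.out.println(toHex(raw, true));  // dead
    }
}
```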
-
saslMechanism
For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default. DMS versions 3.5.0 and later also support the PLAIN mechanism. To use the PLAIN mechanism, set this parameter to PLAIN.
If the service returns an enum value that is not available in the current SDK version, saslMechanism will return KafkaSaslMechanism.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from saslMechanismAsString().
- Returns:
- For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default. DMS versions 3.5.0 and later also support the PLAIN mechanism. To use the PLAIN mechanism, set this parameter to PLAIN.
-
saslMechanismAsString
For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default. DMS versions 3.5.0 and later also support the PLAIN mechanism. To use the PLAIN mechanism, set this parameter to PLAIN.
If the service returns an enum value that is not available in the current SDK version, saslMechanism will return KafkaSaslMechanism.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from saslMechanismAsString().
- Returns:
- For SASL/SSL authentication, DMS supports the SCRAM-SHA-512 mechanism by default. DMS versions 3.5.0 and later also support the PLAIN mechanism. To use the PLAIN mechanism, set this parameter to PLAIN.
-
sslEndpointIdentificationAlgorithm
Sets hostname verification for the certificate. This setting is supported in DMS version 3.5.1 and later.
If the service returns an enum value that is not available in the current SDK version, sslEndpointIdentificationAlgorithm will return KafkaSslEndpointIdentificationAlgorithm.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from sslEndpointIdentificationAlgorithmAsString().
- Returns:
- Sets hostname verification for the certificate. This setting is supported in DMS version 3.5.1 and later.
-
sslEndpointIdentificationAlgorithmAsString
Sets hostname verification for the certificate. This setting is supported in DMS version 3.5.1 and later.
If the service returns an enum value that is not available in the current SDK version, sslEndpointIdentificationAlgorithm will return KafkaSslEndpointIdentificationAlgorithm.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from sslEndpointIdentificationAlgorithmAsString().
- Returns:
- Sets hostname verification for the certificate. This setting is supported in DMS version 3.5.1 and later.
-
toBuilder
Description copied from interface: ToCopyableBuilder
Take this object and create a builder that contains all of the current property values of this object.
- Specified by:
toBuilder in interface ToCopyableBuilder&lt;KafkaSettings.Builder, KafkaSettings&gt;
- Returns:
- a builder for type T
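The copy-and-modify round trip this enables can be sketched as follows, assuming the databasemigration module is on the classpath (topic and broker values are placeholders):

```java
import software.amazon.awssdk.services.databasemigration.model.KafkaSettings;

public class ToBuilderExample {
    public static void main(String[] args) {
        KafkaSettings original = KafkaSettings.builder()
                .broker("broker-1.example.com:9092")
                .topic("topic-v1")
                .build();

        // toBuilder() seeds a new builder with every current property value,
        // so only the fields you override change in the copy.
        KafkaSettings modified = original.toBuilder()
                .topic("topic-v2")
                .build();

        System.out.println(modified.broker()); // unchanged from original
        System.out.println(modified.topic());  // overridden
    }
}
```

This is the idiomatic way to derive a variant of an immutable SDK model object without re-stating all of its fields.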
-
builder
-
serializableBuilderClass
-
hashCode
public final int hashCode() -
equals
-
equalsBySdkFields
Description copied from interface: SdkPojo
Indicates whether some other object is "equal to" this one by SDK fields. An SDK field is a modeled, non-inherited field in an SdkPojo class, and is generated based on a service model.
If an SdkPojo class does not have any inherited fields, equalsBySdkFields and equals are essentially the same.
- Specified by:
equalsBySdkFields in interface SdkPojo
- Parameters:
obj - the object to be compared with
- Returns:
- true if the other object is equal to this object by SDK fields, false otherwise.
-
toString
Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value. -
getValueForField
-
sdkFields
-