
Status

Current state: "Adopted"

Discussion thread: here

JIRA: here 

Adopted: 1.1.0

Please keep the discussion on the mailing list rather than commenting on the wiki (wiki discussions get unwieldy fast).

Motivation

The internal.key.converter and internal.value.converter were originally exposed as configs because:

  1. they are actually pluggable 
  2. providing a default would require relying on the JsonConverter always being available; until we had classloader isolation, it was possible that it would be removed for compatibility reasons.

However, this has ultimately caused far more trouble and confusion than it is worth.

Public Interfaces

WorkerConfig.java
 
public static final String INTERNAL_KEY_CONVERTER_CLASS_CONFIG = "internal.key.converter";
public static final String INTERNAL_KEY_CONVERTER_CLASS_DOC =
        "Converter class used to convert between Kafka Connect format and the serialized form that is written to Kafka." +
                " This controls the format of the keys in messages written to or read from Kafka, and since this is" +
                " independent of connectors it allows any connector to work with any serialization format." +
                " Examples of common formats include JSON and Avro." +
                " This setting controls the format used for internal bookkeeping data used by the framework, such as" +
                " configs and offsets, so users can typically use any functioning Converter implementation.";

public static final String INTERNAL_VALUE_CONVERTER_CLASS_CONFIG = "internal.value.converter";
public static final String INTERNAL_VALUE_CONVERTER_CLASS_DOC =
        "Converter class used to convert between Kafka Connect format and the serialized form that is written to Kafka." +
                " This controls the format of the values in messages written to or read from Kafka, and since this is" +
                " independent of connectors it allows any connector to work with any serialization format." +
                " Examples of common formats include JSON and Avro." +
                " This setting controls the format used for internal bookkeeping data used by the framework, such as" +
                " configs and offsets, so users can typically use any functioning Converter implementation.";
 
connect-standalone.properties
 
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
 
connect-distributed.properties
 
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter

 

Proposed Changes

We will log a warning if either of the above configs is specified. Since the JsonConverter is the default, the configs will be removed immediately from the example worker configuration files above.
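The deprecation check can be sketched as follows: on startup, the worker inspects the user-supplied properties and collects any of the deprecated configs that were explicitly set, so a warning can be logged for each. This is a self-contained illustration with assumed names, not Connect's actual code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class InternalConverterDeprecation {
    static final String INTERNAL_KEY_CONVERTER_CLASS_CONFIG = "internal.key.converter";
    static final String INTERNAL_VALUE_CONVERTER_CLASS_CONFIG = "internal.value.converter";

    // Returns the deprecated internal converter configs the user explicitly
    // set; the worker would log one deprecation warning per entry.
    static List<String> deprecatedConfigsUsed(Map<String, String> originals) {
        List<String> used = new ArrayList<>();
        if (originals.containsKey(INTERNAL_KEY_CONVERTER_CLASS_CONFIG))
            used.add(INTERNAL_KEY_CONVERTER_CLASS_CONFIG);
        if (originals.containsKey(INTERNAL_VALUE_CONVERTER_CLASS_CONFIG))
            used.add(INTERNAL_VALUE_CONVERTER_CLASS_CONFIG);
        return used;
    }
}
```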

Compatibility, Deprecation, and Migration Plan

This is a backward-compatible change, and the configs will be supported until the next major release. However, we need to verify whether users are using any converters other than the default JsonConverter; KAFKA-3988 can help us validate this. If some users are using other converters, we need to consider how they would migrate to newer versions that no longer support them.

Rejected Alternatives

None
