
Status

Current state: Under discussion

Discussion thread: here [Change the link from the KIP proposal email archive to your own email thread]

JIRA: here [Change the link from KAFKA-1 to your own ticket]

Please keep the discussion on the mailing list rather than commenting on the wiki (wiki discussions get unwieldy fast).

Motivation

Kafka Connect allows integration with many types of external systems.  Some of these systems may require secrets to be configured in order to access them.  Many customers already have a Secret Management strategy and use centralized management systems such as Vault, Keywhiz, or AWS Secrets Manager.  Vault is very popular and has been described as "the current gold standard in secret management and provisioning".  These Secret Management systems may satisfy the following customer requirements:

  • No secret in cleartext at rest (such as on disk) or in transit (over the network)
  • Secrets protected by access control mechanisms
  • All access to secrets recorded in an audit log
  • Support for secret versioning and rolling
  • A common set of APIs for both applications and tools to access secrets
  • Redundancy in case of failover so that secrets are always available
  • Certification as conformant with required compliance standards

Other customers may be passing secrets into the host through various means (such as through Docker secrets), but do not want the secret to appear in cleartext in the Kafka Connect configuration. 

Secrets from all of these systems need to be injected into Kafka Connect configurations, and the customer should be able to specify the means of injection through a plugin.

Public Interfaces

public interface ConfigProvider extends Configurable, Closeable {

    // Initialize the provider
    void start(ConfigContext ctx);

    // Transform the configs by resolving all indirect references
    Map<String, String> transform(ConfigContext ctx, Map<String, String> configs);
}

public interface ConfigContext {

    // Get the initialization parameters
    Map<String, String> parameters();

    // The name of the connector
    String connectorName();

    // Schedule a reload, possibly for secrets rotation
    void scheduleConfigReload(long delayMs);
}
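As an illustration of what a transform implementation might look like, the sketch below resolves indirect references against an in-memory map. The `${secret:key}` reference syntax, the class name, and the omission of the Configurable/Closeable plumbing are all assumptions made to keep the example self-contained; they are not part of this proposal. A real provider would implement the ConfigProvider interface above and fetch secrets from a system such as Vault or Keywhiz.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical provider that resolves ${secret:key} references from an
// in-memory map. The reference syntax and class name are illustrative
// assumptions, not part of this KIP.
class InMemoryConfigProvider {
    private static final Pattern REF = Pattern.compile("\\$\\{secret:([^}]+)}");
    private final Map<String, String> secrets;

    InMemoryConfigProvider(Map<String, String> secrets) {
        this.secrets = new HashMap<>(secrets);
    }

    // Transform the configs by resolving all indirect references; values
    // without references (or with unknown keys) pass through unchanged.
    Map<String, String> transform(Map<String, String> configs) {
        Map<String, String> resolved = new HashMap<>();
        for (Map.Entry<String, String> entry : configs.entrySet()) {
            Matcher m = REF.matcher(entry.getValue());
            StringBuffer sb = new StringBuffer();
            while (m.find()) {
                String secret = secrets.get(m.group(1));
                m.appendReplacement(sb,
                    Matcher.quoteReplacement(secret != null ? secret : m.group()));
            }
            m.appendTail(sb);
            resolved.put(entry.getKey(), sb.toString());
        }
        return resolved;
    }
}
```

Because the secret is substituted only in the in-memory copy handed to the connector, the stored configuration never contains the cleartext value.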

public interface SinkTaskContext {
    ...

    Map<String, String> config();

    ...
}

public interface SourceTaskContext {
    ...

    Map<String, String> config();

    ...
}
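With config() available on the task contexts, a task can re-read resolved values whenever the framework restarts or reloads it, for example after scheduleConfigReload fires for a rotated secret. The sketch below stubs the context with a minimal interface exposing only the config() accessor; the stub name and the refresh() hook are assumptions for illustration, not part of this proposal.

```java
import java.util.Map;

// Minimal stand-in for the config() accessor on SinkTaskContext above;
// the interface and class names here are illustrative assumptions.
interface StubSinkTaskContext {
    Map<String, String> config();
}

class RefreshingSink {
    private final StubSinkTaskContext context;
    private String password;

    RefreshingSink(StubSinkTaskContext context) {
        this.context = context;
        refresh();
    }

    // Called on (re)start; picks up the latest resolved secret values
    // rather than holding on to a stale copy.
    void refresh() {
        this.password = context.config().get("connection.password");
    }

    String currentPassword() {
        return password;
    }
}
```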

This KIP introduces two new public interfaces, ConfigProvider and ConfigContext, and adds a config() method to the existing SinkTaskContext and SourceTaskContext interfaces so that tasks can obtain their current (resolved) configuration. No existing public interfaces are removed.

Proposed Changes

This proposal adds a plugin interface, ConfigProvider, that transforms a connector configuration by resolving indirect references to secrets before the configuration is used. A provider is initialized with a ConfigContext, which exposes the initialization parameters and the connector name, and which allows the provider to schedule a configuration reload (for example, when a secret is rotated). Because the transformation happens after the configuration is stored and before the Connectors and Tasks are started, secrets never need to appear in cleartext in the stored configuration. Tasks can obtain the latest resolved values through the new config() methods on SinkTaskContext and SourceTaskContext.
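To make the resolution step concrete, one way a worker could apply several providers' transform calls in sequence is sketched below. The ProviderChain name and the chaining order are assumptions, and transform is reduced to a plain function on the config map so the sketch compiles on its own.

```java
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the injection point: after the raw connector config
// is read from storage and before the Connector and Tasks start, the worker
// could pass the config through each registered provider in turn.
class ProviderChain {
    private final List<UnaryOperator<Map<String, String>>> providers;

    ProviderChain(List<UnaryOperator<Map<String, String>>> providers) {
        this.providers = providers;
    }

    // Apply each provider's transform in order, feeding the output of one
    // into the next, so later providers see earlier providers' substitutions.
    Map<String, String> resolve(Map<String, String> rawConfigs) {
        Map<String, String> current = rawConfigs;
        for (UnaryOperator<Map<String, String>> provider : providers) {
            current = provider.apply(current);
        }
        return current;
    }
}
```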

Compatibility, Deprecation, and Migration Plan

No changes are required for existing Connectors. 

Rejected Alternatives

The current scope of this proposal is Connectors only; it addresses neither brokers nor clients.  The injection happens at a very specific time in the lifecycle of Kafka Connect: after the configuration is stored and before the Connectors and Tasks are started.

A related feature for brokers is KIP-226, which allows for dynamic broker configuration.  It can also store passwords.  However,

  1. It currently does not work for Kafka Connect.
  2. One requirement is to not "leak" secrets to other systems, especially if the customer is already using a centralized Secret Management system.

A related feature for clients is KIP-76, which is for obtaining passwords through scripts.  However,

  1. It is not yet implemented.
  2. It only applies to certain password fields.
  3. It does not allow for custom plugins.