
...

Using the following format:

{
    "data": [
        {
            "id": "102",
            "name": "car battery"
        }
    ],
    "database": "inventory”,
    "table": "products",
    "es": 1589374013000,
    "ts": 1589374013680,
    "type": "DELETE"
}
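A record in this shape matches Canal's change-event encoding: `data` carries the row payload, `database`/`table` identify the origin, `es`/`ts` are event and processing timestamps, and `type` is the change kind. As a sketch (topic name and connector options are illustrative), such records could be consumed with Flink's `canal-json` format:

```sql
CREATE TABLE inventory_products (
  `id`   STRING,
  `name` STRING
) WITH (
  'connector' = 'kafka',
  'topic'     = 'inventory.products',
  'properties.bootstrap.servers' = 'localhost:9092',
  -- interprets "type": "DELETE" as a retraction in the changelog
  'format'    = 'canal-json'
);
```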

Kafka: Read metadata from Kafka's ConsumerRecord and use it for computation

...

PHYSICAL COLUMNS + FORMAT METADATA COLUMNS + SOURCE METADATA COLUMNS

Metadata for existing connectors and formats

For completeness, we list an initial set of metadata keys for existing Flink connectors and formats. The naming follows FLIP-122, which means:

  • connector metadata has no prefix
  • formats will have a prefix using the factory identifier
  • key/value formats are prefixed with `key.` and `value.`

Note: We will use relaxed data type validation (similar to `LogicalTypeCasts#supportsAvoidingCast`) to simplify timestamp handling, so both TIMESTAMP and TIMESTAMP WITH LOCAL TIME ZONE will work in DDL.

Kafka

| Key | Data type | Read/Write | Notes |
|---|---|---|---|
| topic | STRING | r | We don't allow writing to different topics for now. Maybe we will allow that in the future via a property. |
| partition | INT | r | We don't allow writing to different partitions for now. Maybe we will allow that in the future via a property. |
| headers | MAP&lt;STRING, BYTES&gt; | r/w | |
| leader-epoch | INT | r | |
| offset | BIGINT | r | |
| timestamp | TIMESTAMP(3) [WITH LOCAL TIME ZONE] | r/w | |
| timestamp-type | STRING | r | ['NoTimestampType', 'CreateTime', 'LogAppendTime'] |
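Under the syntax proposed in this FLIP, the Kafka metadata keys above could be declared as metadata columns in a DDL roughly like the following sketch (table name and connector options are illustrative):

```sql
CREATE TABLE kafka_products (
  `id`     STRING,
  `name`   STRING,
  -- read-only (r) metadata must be VIRTUAL: it is excluded when writing
  `offset` BIGINT METADATA VIRTUAL,
  -- r/w metadata can be both read and persisted
  `headers` MAP<STRING, BYTES> METADATA,
  -- FROM maps a differently named column to a metadata key;
  -- due to relaxed validation, TIMESTAMP(3) WITH LOCAL TIME ZONE works too
  `ts` TIMESTAMP(3) METADATA FROM 'timestamp'
) WITH (
  'connector' = 'kafka',
  'topic'     = 'products',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format'    = 'json'
);
```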

Debezium

| Key | Data type | Read/Write | Notes |
|---|---|---|---|
| debezium-json.schema | STRING | r | Pure JSON string, can be handled with Flink's built-in JSON SQL functions |
| debezium-json.timestamp | TIMESTAMP(3) [WITH LOCAL TIME ZONE] | r | |
| debezium-json.source | MAP&lt;STRING, STRING&gt; | r | It seems that this is always a property list. So we can make it available as a map for easier access. |
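Since format metadata is prefixed with the factory identifier, a table reading these Debezium keys could be sketched as follows (column names and connector options are illustrative; with an explicit key/value format the keys would additionally carry a `value.` prefix):

```sql
CREATE TABLE products (
  `id`   STRING,
  `name` STRING,
  -- all Debezium metadata is read-only, hence VIRTUAL
  `origin_ts`    TIMESTAMP(3) METADATA FROM 'debezium-json.timestamp' VIRTUAL,
  `source_props` MAP<STRING, STRING> METADATA FROM 'debezium-json.source' VIRTUAL
) WITH (
  'connector' = 'kafka',
  'topic'     = 'products',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format'    = 'debezium-json'
);
```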

Reading and writing from key and value

...