The Airbyte Kafka source allows you to sync data from Kafka. Each Kafka topic is output into a corresponding stream.
Currently, this connector only reads data in JSON format. More formats (e.g. Apache Avro) will be supported in the future.
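To illustrate what "JSON format" means here, the sketch below shows a hypothetical raw Kafka message value being deserialized. The payload fields (`user_id`, `event`, `ts`) are invented for the example; the only assumption is that each message value is a UTF-8 encoded JSON object.

```python
import json

# Hypothetical raw bytes as they might arrive as a Kafka message value.
raw_value = b'{"user_id": 42, "event": "signup", "ts": "2023-01-01T00:00:00Z"}'

# A JSON-format message value can be deserialized directly into a record.
record = json.loads(raw_value)
print(record["event"])
```

A message value that is not valid JSON (e.g. Avro-encoded bytes) would fail this deserialization step, which is why only JSON topics are currently supported.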
To use the Kafka source, you'll need:
- A Kafka cluster running version 1.0 or above.
Make sure your Kafka brokers are accessible from Airbyte. Airbyte must be allowed to read messages from the relevant topics, and those topics must exist before reading from Kafka.
You can determine the topics from which messages are read via the `topic_pattern` configuration parameter. Messages can be read from a hardcoded, pre-defined topic. To read all messages from a single hardcoded topic, enter its name in the `topic_pattern` field, e.g. setting `my-topic-name` will read all messages from that topic.
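Kafka consumers treat a subscription pattern as a regular expression that must match the entire topic name, so a hardcoded value like `my-topic-name` selects exactly that topic. A minimal sketch of this matching behavior, using Python's `re` module and an invented list of topics:

```python
import re

# Assumption: the pattern is applied as a full-name regular expression,
# as in Kafka's subscribe-by-pattern consumer API.
topic_pattern = re.compile(r"my-topic-name")

# Hypothetical topics present on the cluster.
topics = ["my-topic-name", "other-topic", "my-topic-name-v2"]

# Only topics whose full name matches the pattern are selected.
matched = [t for t in topics if topic_pattern.fullmatch(t)]
print(matched)
```

Note that a hardcoded name does not match prefixed variants such as `my-topic-name-v2`; a broader regular expression (e.g. `my-topic-name.*`) would be needed for that.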
You can determine the topic partitions from which messages are read via the topic partition configuration parameter.
You should now have all the requirements needed to configure Kafka as a source in the UI. You can configure the following parameters on the Kafka source (though many of these are optional or have default values):
- Bootstrap servers
- Topic pattern
- Topic partition
- Test topic
- Group ID
- Max poll records
- SASL JAAS config
- SASL mechanism
- Client ID
- Enable auto commit
- Auto commit interval ms
- Client DNS lookup
- Retry backoff ms
- Request timeout ms
- Receive buffer bytes
- Repeated calls
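As an illustration, a complete source configuration might look like the following JSON sketch. The field names are assumptions derived from snake_case versions of the UI labels above and may differ from the connector's actual specification; all values are placeholders.

```json
{
  "bootstrap_servers": "kafka-broker-1:9092,kafka-broker-2:9092",
  "topic_pattern": "my-topic-name",
  "test_topic": "test-topic",
  "group_id": "airbyte-consumer-group",
  "max_poll_records": 500,
  "client_id": "airbyte-client",
  "enable_auto_commit": true,
  "auto_commit_interval_ms": 5000,
  "retry_backoff_ms": 100,
  "request_timeout_ms": 30000,
  "receive_buffer_bytes": 32768
}
```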