kafka

Version 4.2.1 by WSO2

Category : Siddhi IO
Supported Product Version : SP 4.0.0, SP 4.1.0, SP 4.2.0, SP 4.3.0, SP 4.4.0

Summary

The siddhi-io-kafka extension is an extension to Siddhi that implements a Kafka source and sink, which can be used to receive events from a Kafka cluster and to publish events to one. The Kafka source receives records from a topic with a partition of a Kafka cluster, in formats such as text, XML, and JSON. If the topic has not already been created in the Kafka cluster, the Kafka source creates the default partition for the given topic. The Kafka sink publishes records to a topic with a partition of a Kafka cluster, in formats such as text, XML, and JSON. If the topic has not already been created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.
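As a minimal sketch of how the source and sink are wired together, the Siddhi app below consumes JSON events from one topic and republishes them to another. The topic names, consumer group, broker address, and stream attributes are illustrative assumptions, not values mandated by the extension:

```siddhi
@App:name('KafkaSample')

-- Receives JSON events from 'source_topic'; the topic, group, and
-- broker address below are illustrative values.
@source(type='kafka',
        topic.list='source_topic',
        group.id='sample_group',
        threading.option='single.thread',
        bootstrap.servers='localhost:9092',
        @map(type='json'))
define stream InputStream (symbol string, price float, volume long);

-- Publishes the same events to 'sink_topic' in JSON format.
@sink(type='kafka',
      topic='sink_topic',
      bootstrap.servers='localhost:9092',
      @map(type='json'))
define stream OutputStream (symbol string, price float, volume long);

from InputStream
select symbol, price, volume
insert into OutputStream;
```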


Associated Tags

kafka siddhi-io

Features of siddhi-io-kafka extension


  • Kafka Sink

    A Kafka sink publishes events processed by WSO2 SP to a topic with a partition of a Kafka cluster.
    The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic.
    The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink to use the Kafka transport, the type parameter must have kafka as its value.
  • Kafka MultiDC Sink

    A Kafka MultiDC sink publishes events processed by WSO2 SP to the same topic on two Kafka brokers.
    The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the sink creates the default partition for the given topic.
    The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink to publish events via the Kafka transport, using two Kafka brokers to publish events to the same topic, the type parameter must have kafkaMultiDC as its value.
  • Kafka Source

    A Kafka source receives events to be processed by WSO2 SP from a topic with a partition of a Kafka cluster.
    The events received can be in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka source creates the default partition for the given topic.
  • Kafka MultiDC Source

    The Kafka Multi Data Center (DC) source receives records from the same topic on brokers deployed in two different Kafka clusters.
    It filters out all duplicate messages and tries to ensure that the events are received in the correct order by using sequence numbers.
    Events can be received in the TEXT, XML, JSON, or Binary format.
    If the topic has not already been created in the Kafka cluster, the Kafka source creates the default partition '0' for the given topic.
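To illustrate the MultiDC sink described above, the fragment below publishes each event to the same topic on two brokers. Both broker addresses, the topic name, and the stream attributes are assumed values for the sketch:

```siddhi
-- Publishes every event to 'sink_topic' on both brokers; the broker
-- addresses are illustrative values.
@sink(type='kafkaMultiDC',
      topic='sink_topic',
      bootstrap.servers='host1:9092,host2:9092',
      @map(type='xml'))
define stream OutputStream (symbol string, price float, volume long);
```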
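A matching MultiDC source sketch, consuming the same topic from brokers in two clusters and deduplicating by sequence number as described above. The topic, partition, broker addresses, and stream attributes are assumptions for illustration:

```siddhi
-- Receives events for 'source_topic' from brokers in two clusters;
-- duplicates are filtered by the source. Values below are illustrative.
@source(type='kafkaMultiDC',
        topic='source_topic',
        partition.no='0',
        bootstrap.servers='host1:9092,host2:9092',
        @map(type='xml'))
define stream InputStream (symbol string, price float, volume long);
```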

Extension

Download the JAR files of released versions from Maven Central:
https://mvnrepository.com/artifact/org.wso2.extension.siddhi.io.kafka/siddhi-io-kafka

