clj-kafka-x

A Clojure library for Apache Kafka, the distributed stream-processing platform.

Uses the Kafka protocol and does not rely on ZooKeeper.

Note
Still includes ZooKeeper as a dependency.

Tries to be as lightweight as possible and thus depends only on

  • org.apache.kafka/kafka_2.12 "3.4.0"

  • org.apache.kafka/kafka-clients "3.4.0"

  • org.apache.zookeeper/zookeeper "3.8.1"

but excludes jms, jmx* and logging.

Note
Some builds (for instance of the v0.4.x branch) may be partially (or even fully) incompatible with certain versions of other libraries that also use NIO! If you experience build problems and/or your application unexpectedly crashes on start, check your project dependencies more closely; you may need to correct existing dependency versions or add a current version of the full [io.netty/netty-all].


Installation

Add the following to your Leiningen project.clj:

[net.tbt-post/clj-kafka-x "0.7.5"]
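
If you use the Clojure CLI / tools.deps instead, the equivalent coordinate should be (assuming the same artifact on Clojars):

{:deps {net.tbt-post/clj-kafka-x {:mvn/version "0.7.5"}}}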

Usage

Producer

(require '[clj-kafka-x.producer :as kp])

(with-open [p (kp/producer {"bootstrap.servers" "localhost:9092"}
                           (kp/string-serializer)
                           (kp/string-serializer))]
  @(kp/send p (kp/record "topic-a" "Hi there!")))
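
The @ in the example above blocks until the send completes. A minimal sketch of reusing one producer for several messages, using only the calls shown above:

(with-open [p (kp/producer {"bootstrap.servers" "localhost:9092"}
                           (kp/string-serializer)
                           (kp/string-serializer))]
  ;; each deref waits for the corresponding record to be sent
  (doseq [msg ["first" "second" "third"]]
    @(kp/send p (kp/record "topic-a" msg))))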

Consumer

(require '[clj-kafka-x.consumers.simple :as kc])

(with-open [c (kc/consumer {"bootstrap.servers" "localhost:9092"
                            "group.id" "consumer-id"}
                            (kc/string-deserializer)
                            (kc/string-deserializer))]
  (kc/subscribe c "topic-a")
  (kc/messages c))
Note
When you use multiple partitions per topic, you must specify them explicitly when subscribing, i.e. (kc/subscribe c [{:topic "topic-a" :partitions #{0 1}} {:topic "topic-b" :partitions #{0 1 2}}])
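
Putting that into the consumer example (a sketch using only the calls shown above; topic names and partition numbers are illustrative):

(with-open [c (kc/consumer {"bootstrap.servers" "localhost:9092"
                            "group.id" "consumer-id"}
                           (kc/string-deserializer)
                           (kc/string-deserializer))]
  ;; subscribe to explicit partitions of each topic
  (kc/subscribe c [{:topic "topic-a" :partitions #{0 1}}
                   {:topic "topic-b" :partitions #{0 1 2}}])
  (kc/messages c))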
Real-life (almost) example
(ns buzz.consumer.kafka
  (:require [clj-kafka-x.consumers.simple :as kc]
            [clojure.tools.logging :as log]))

(defn processor [msg schema] msg)
(def schema nil)
(def config {"bootstrap.servers" "localhost:9092"
             "group.id" "consumer-id"})

(defn process-message [msg]
  (let [{:keys [value topic partition offset]} msg
        processor processor ;; choose one by topic name
        schema schema]      ;; choose one by topic name
    (if (fn? processor) (processor value schema) value)))

(defn consume []
  (with-open [c (kc/consumer config
                             (kc/byte-array-deserializer)
                             (kc/byte-array-deserializer))]
    (kc/subscribe c kafka-topics)
    (let [pool (kc/messages c)]
      (doseq [message pool]
        (log/warn (process-message message))))))
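
The ;; choose one by topic name comments can be implemented, for example, as a plain map from topic name to handler. The processors map below is a hypothetical illustration, not part of the library:

;; hypothetical per-topic dispatch (illustration only);
;; values arrive as byte arrays because of byte-array-deserializer above
(def processors
  {"topic-a" (fn [value schema] (str "A: " (String. ^bytes value)))
   "topic-b" (fn [value schema] (str "B: " (String. ^bytes value)))})

(defn process-message [msg]
  (let [{:keys [value topic]} msg
        processor (get processors topic)]
    (if (fn? processor) (processor value nil) value)))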

You may also use the form with explicit timeouts:

(defn- consume [instance process-message]
  ;; with-open closes the consumer when processing is done
  (with-open [co (kc/consumer config
                              (kc/byte-array-deserializer)
                              (kc/byte-array-deserializer))]
    (when-let [messages (kc/messages
                          co
                          :timeout (:request-timeout-ms config))]
      (doall (map process-message messages)))))

The number of records returned per poll may be limited via the max.poll.records configuration key.
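
For example (a sketch; max.poll.records is a standard Kafka consumer property, the value here is illustrative):

(def config {"bootstrap.servers" "localhost:9092"
             "group.id"          "consumer-id"
             "max.poll.records"  "100"}) ;; at most 100 records per poll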

Manual Build

$ lein install

License

Copyright © 2016-2023

Distributed under the Apache License v2.0