1. The Producer

To start, we construct a ProducerRecord, which must contain the target topic and the data we want to send; we can also specify a key or a partition explicitly. When the ProducerRecord is sent, the producer first serializes the key and value into byte arrays so they can be transmitted over the network. The data then passes to the partitioner. If the ProducerRecord specifies a partition, the partitioner does nothing and simply returns that partition number. If no partition is specified, the partitioner usually chooses one based on the ProducerRecord's key. Once the partition is selected, the producer knows which topic and partition the record should go to. The record is then added to a batch of records bound for the same topic and partition, and a separate thread is responsible for sending these batches to the corresponding brokers.

When the broker receives the messages, it returns a response. If the messages were successfully written to Kafka, it returns a RecordMetadata object containing the topic and partition information and the offset of the record within the partition. If the write failed, it returns an error. When the producer receives an error, it may retry sending the data several times before giving up.

In this chapter we will learn how to use the Kafka producer, referring back to the flow in Figure 3-1 from time to time. We will cover how to create KafkaProducer and ProducerRecord objects, how to use the default partitioner and serializers, how to handle errors returned by KafkaProducer, how to write custom serializers and partitioners, and review many producer-related configurations.
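The partition-selection step described above can be sketched in plain Java. This is a hypothetical simplification for illustration only: Kafka's real DefaultPartitioner hashes the serialized key bytes with murmur2, not String.hashCode(), but the decision logic is the same shape.

```java
// Toy sketch of the partitioning logic described above: if the record
// names a partition explicitly, the partitioner returns it unchanged;
// otherwise a partition is derived from the key's hash.
// Simplified for illustration -- not Kafka's actual implementation.
public class ToyPartitioner {

    public static int choosePartition(Integer explicitPartition, String key, int numPartitions) {
        if (explicitPartition != null) {
            // Partition was set on the ProducerRecord: partitioner does nothing.
            return explicitPartition;
        }
        // No partition set: map the key's hash onto the available partitions.
        // (% before Math.abs keeps the result safely in range.)
        return Math.abs(key.hashCode() % numPartitions);
    }

    public static void main(String[] args) {
        // Explicit partition always wins, regardless of the key.
        System.out.println(choosePartition(2, "order-42", 6));   // prints 2
        // Same key always lands on the same partition.
        System.out.println(choosePartition(null, "order-42", 6)
                == choosePartition(null, "order-42", 6));        // prints true
    }
}
```

The key property this preserves is that records with the same key always go to the same partition, which is what gives Kafka its per-key ordering guarantee.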

2. The Consumer

3.

4. Problems encountered

4.1 org.apache.kafka.common.serialization.Serdes dependency

<!-- org.apache.kafka.common.serialization.Serdes -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.9.0.1</version>
</dependency>

4.2 java.lang.NoSuchMethodError: org.apache.kafka.common.utils.Utils.writeUnsignedInt(Ljava/nio/ByteBuffer;IJ)V

4.3 kafka-streams dependency

<!-- kafka-streams -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>0.10.0.1</version>
</dependency>

4.4 Kafka Connect dependency

<!-- connect-api -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>connect-api</artifactId>
    <version>0.10.2.1</version>
</dependency>