Setting up the Kafka 2.8.0 source code environment

Install the JDK

Installing the JDK is straightforward. We will install JDK 8 here (the latest JDK release is 16, but JDK 8 is still used in a large share of production environments). First, go to the JDK download page and download the installation package for your operating system. I am using macOS, so I download the DMG file.

Once the download is complete, double-click the DMG file and proceed to the next step to complete the installation.

Install Scala

Here we install Scala 2.13.1. After the download completes, extract the archive into the current directory.
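Assuming the Scala 2.13.1 archive has been downloaded into the current directory (the exact archive name depends on which package you downloaded), the extraction is a single command:

# assumes the scala-2.13.1.tgz archive is in the current directory
tar -xzf scala-2.13.1.tgz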

Download Gradle

Kafka's build and dependencies are managed with Gradle. Download Gradle 6.8 from the Gradle download page and unzip it into the current directory.
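Assuming the binary distribution gradle-6.8-bin.zip was downloaded into the current directory, unpacking it looks like this:

# assumes gradle-6.8-bin.zip (the binary distribution) is in the current directory
unzip gradle-6.8-bin.zip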

Configure environment variables

After installing the JDK, Scala, and Gradle, open a terminal, go to the current user's home directory, and open .bash_profile:

sudo vim .bash_profile

Configure the JAVA_HOME, SCALA_HOME, and GRADLE_HOME environment variables in the .bash_profile file and add them to the PATH variable, as shown below:
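A minimal example of these entries is sketched below; the install paths are placeholders and should be adjusted to wherever the JDK, Scala, and Gradle actually live on your machine:

# Example ~/.bash_profile entries -- the paths are placeholders, adjust them to your setup
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
export SCALA_HOME=~/scala-2.13.1
export GRADLE_HOME=~/gradle-6.8
export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:$GRADLE_HOME/bin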

Next, save the .bash_profile file and run the source command to reload it:

source .bash_profile

Finally, run the java -version, scala -version, and gradle -version commands to check whether the environment variables are configured correctly; if each command prints its version information, the configuration succeeded.
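For reference, the three verification commands are:

java -version
scala -version
gradle -version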

Install Zookeeper

Prior to release 2.8.0, Kafka relied on Zookeeper to store its metadata. Starting with 2.8.0, Kafka no longer has a hard dependency on Zookeeper and can instead use a Raft-based protocol (KRaft) to store metadata.

Here we still set up a single-node Zookeeper environment. First, go to the official Zookeeper website and download the 3.6.3 release. After downloading, unpack the apache-zookeeper-3.6.3-bin.tar.gz package into the current directory.
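Assuming the archive was downloaded into the current directory, extraction is a single command:

tar -xzf apache-zookeeper-3.6.3-bin.tar.gz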

Go to ${ZOOKEEPER_HOME}/conf, copy the zoo_sample.cfg file, and rename the copy zoo.cfg:

cp zoo_sample.cfg zoo.cfg

Finally, go to ${ZOOKEEPER_HOME}/bin and start the Zookeeper service:

./zkServer.sh start

If the Zookeeper service starts successfully, the script reports that Zookeeper has started.
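As an additional check (not part of the original steps), you can query the server state:

./zkServer.sh status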

Download the Kafka source code

With these basic dependencies installed, we can start setting up the Kafka source environment.

First, clone the Kafka repository with git clone:

git clone https://github.com/apache/kafka.git

Then check out the origin/2.8 branch with git checkout:

git checkout -b 2.8 origin/2.8

To import the Kafka source code into the IDEA editor, we need to run the gradle idea command. This command downloads Kafka's dependencies, which takes quite a long time.
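Run it from the root of the cloned source tree (note: depending on the Kafka version, the project README may ask you to run a bare gradle command once first to bootstrap the Gradle wrapper):

cd kafka
gradle idea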

Finally, import the Kafka source code into IDEA; the resulting project structure is shown below:

In Kafka, many Request and Response classes are generated during compilation. To generate these classes, run the ./gradlew jar command.

After executing the ./gradlew jar command, we need to locate the generated directory in each of the following modules in IDEA and add its java subdirectory to the classpath; a sketch of the likely paths follows the list.

  • The clients module:

  • The core module:

  • The metadata module:

  • The raft module:
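For reference, the generated sources usually end up under a src/generated/java directory inside each module; the exact paths below are my assumption and should be verified against your working tree:

clients/src/generated/java
core/src/generated/java
metadata/src/generated/java
raft/src/generated/java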

Verify the environment

Let's verify that the Kafka source environment is set up successfully.

First, we copy the log4j.properties configuration file from the config directory into the src/main/scala directory, as shown below:
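Assuming the broker will be run from the core module, the copy would look roughly like this (the target module is my assumption, not stated explicitly above):

# run from the Kafka source root; copying into the core module is an assumption
cp config/log4j.properties core/src/main/scala/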

Next, we modify the server.properties file in the config directory, changing the log.dirs configuration item to point to a kafka-logs directory under the Kafka source directory:

log.dirs=/Users/xxx/source/kafka/kafka-logs

The rest of the configuration in the server.properties file remains unchanged for now.

Finally, start the Kafka broker from IDEA by running the kafka.Kafka main class, passing the path of the edited server.properties file as its program argument:
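A sketch of the corresponding IDEA run configuration (the values are what I would expect for Kafka 2.8; verify them against your project):

Main class:         kafka.Kafka
Program arguments:  config/server.properties
Use classpath of:   core (main)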

If the startup is successful, the console output is normal and looks like the following:

P.S. Problems you may encounter:

Adjust the dependency scope of slf4j-log4j12 to Runtime, as shown below:

Send and consume messages

We now use Kafka's command-line tools to verify the source code environment built above.

First, we go to the ${KAFKA_HOME}/bin directory and create a topic named test_topic with the kafka-topics.sh script:

./kafka-topics.sh --zookeeper localhost:2181 --create --topic test_topic --partitions 1  --replication-factor 1

The result is shown below:

We then start a consumer on the command line to consume test_topic with kafka-console-consumer.sh as follows:

./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test_topic

The result is shown below:

This command blocks and waits; when a message arrives, it is printed to the console.

Next, we use the kafka-console-producer.sh command to start a console producer that sends data to test_topic:

./kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test_topic --property parse.key=true

The result is shown below:

The producer command line also blocks waiting for input; when we type a message and press Enter, the message is sent to the test_topic topic.

At this point Kafka's broker, producer, and consumer are all running. In the producer, type a message with a key, for example key "hello" and value "YangSizheng"; because parse.key=true is set, the key and the value on each input line are separated by a Tab character (\t).
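For example, an input line in the producer console might look like this (hypothetical key and value; press the Tab key between them, since \t is the default key.separator):

hello<TAB>YangSizheng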

After we type the message and press Enter, the consumer receives it, as shown in the picture below:

Conclusion

In this lesson we focused on setting up the source code environment for Kafka 2.8.0.

  • First, we downloaded and installed the JDK, Scala, Gradle and other basic software, and configured their environment variables.
  • We then installed Zookeeper and started the Zookeeper service.
  • Next, we downloaded the Kafka source code with Git, compiled it, and started the Kafka broker.
  • Finally, we sent and consumed messages with the console producer and consumer commands to validate the source environment.

Thank you for watching. Articles and videos for this course will also be posted at:

  • Wechat account: Yang Sizheng
  • The original address: xxxlxy2008.github.io/kafka/E3%2%…
  • Bilibili: Kafka 2.8.0 source code environment construction