1. If the job is submitted to Spark, go to the jars folder under the Spark installation and check whether the Kafka integration jar is present

In my case the jar was there, so it was not a missing-jar problem
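This check can be scripted; a minimal sketch, assuming the usual `spark-streaming-kafka-*.jar` artifact naming and that `$SPARK_HOME` points at the Spark installation:

```shell
# Succeeds if DIR contains a Spark-Kafka integration jar.
# The filename pattern is an assumption about typical artifact names.
kafka_jar_present() {
  ls "$1" 2>/dev/null | grep -qi 'spark-streaming-kafka.*\.jar'
}

# Usage on the cluster (path is an assumption):
# kafka_jar_present "$SPARK_HOME/jars" && echo "jar found" || echo "jar missing"
```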

2. Check the Kafka dependency versions in your Maven build (the ones your local tests use) to see whether there is a version mismatch with the jar Spark loads

My versions were consistent, so it was not a version problem either. Then what was causing consumer creation to fail?
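One way to compare the two sides of this check from the shell (the Maven coordinates shown are an example; substitute the artifact your project actually uses):

```shell
# Commands typically used to see each side of the comparison:
#   ls "$SPARK_HOME/jars" | grep -i kafka                        # what Spark loads
#   mvn dependency:tree -Dincludes='org.apache.spark:*kafka*'    # what Maven resolves

# Helper: pull the trailing version out of a jar filename, e.g.
# spark-streaming-kafka-0-10_2.11-2.4.0.jar -> 2.4.0
jar_version() {
  basename "$1" .jar | sed 's/.*-\([0-9][0-9.]*[0-9]\)$/\1/'
}
```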

3. Check the Kafka connection configuration

Kafka here runs as a cluster, and the three broker addresses are configured by hostname, so the next thing to check is the hosts file on each node the job runs on
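A quick resolution check that can be run on each node (the broker hostnames below are placeholders; replace them with the names from your `bootstrap.servers`):

```shell
# Succeeds if NAME resolves via /etc/hosts or DNS on this node.
host_resolves() { getent hosts "$1" > /dev/null; }

# Placeholder broker names, not the real cluster's:
for h in kafka01 kafka02 kafka03; do
  host_resolves "$h" && echo "$h: ok" || echo "$h: NOT resolvable on this node"
done
```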

It turned out that, through earlier carelessness, my other two nodes had never had the Kafka cluster entries added to their hosts files. Once every node had the Kafka broker addresses in its hosts file, consumption worked fine
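For reference, the fix amounts to making sure every node carries the same broker entries; a hosts-file fragment along these lines (the IPs and hostnames here are made-up examples):

```
# /etc/hosts on every node that runs the job
192.168.1.101  kafka01
192.168.1.102  kafka02
192.168.1.103  kafka03
```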