1. Configure the Java environment

Upload the compressed package and decompress it:

tar -zxvf jdk-8u201-linux-x64.tar.gz -C /usr/local/java
Configuring Environment Variables
vi /etc/profile

export JAVA_HOME=/usr/local/java
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:$CLASSPATH
export JAVA_PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin
export PATH=$PATH:${JAVA_PATH}
Making the configuration effective
source /etc/profile
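The effect of these exports can be sanity-checked in a throwaway subshell before touching /etc/profile; the JDK path below is the same one used as the tar -C target above.

```shell
# Dry-run the profile exports in a subshell and confirm the JDK
# bin directories end up on PATH.
sh -c '
  export JAVA_HOME=/usr/local/java
  export JRE_HOME=${JAVA_HOME}/jre
  export JAVA_PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin
  export PATH=$PATH:${JAVA_PATH}
  echo "$PATH"
' | grep -q "/usr/local/java/bin" && echo "PATH configured"
```

If `java -version` still fails after `source /etc/profile`, the archive was probably extracted into a versioned subdirectory such as `/usr/local/java/jdk1.8.0_201`, and JAVA_HOME must point there instead.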

2. Install ZooKeeper

Uploading a Compressed Package

Decompressing a Compressed Package

tar -zxvf zookeeper-3.4.14.tar.gz -C /opt/baixw/servers/

Create data and log directories

# Create the ZK data directory
mkdir -p /opt/baixw/servers/zookeeper-3.4.14/data
# Create the ZK log directory
mkdir -p /opt/baixw/servers/zookeeper-3.4.14/data/logs
# Enter the ZK configuration directory
cd /opt/baixw/servers/zookeeper-3.4.14/conf
# Rename the sample configuration file
mv zoo_sample.cfg zoo.cfg

Modifying a Configuration File

Edit the zoo.cfg file
vim zoo.cfg
# update dataDir
dataDir=/opt/baixw/servers/zookeeper-3.4.14/data
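For reference, a minimal standalone zoo.cfg consistent with the directory layout above might look like this; apart from dataDir, the values are the zoo_sample.cfg defaults, and dataLogDir is an optional assumption added here:

```properties
# Milliseconds per tick (heartbeat unit)
tickTime=2000
# Port clients connect to
clientPort=2181
# Snapshot directory created above
dataDir=/opt/baixw/servers/zookeeper-3.4.14/data
# Optional: separate transaction-log directory (assumed here)
dataLogDir=/opt/baixw/servers/zookeeper-3.4.14/data/logs
```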

Configuring Environment Variables

vi /etc/profile

#ZOOKEEPER
export ZOOKEEPER_HOME=/opt/baixw/servers/zookeeper-3.4.14
export PATH=$PATH:${ZOOKEEPER_HOME}/bin
export ZOO_LOG_DIR=/var/baixw/zookeeper/data

Making the configuration effective

source /etc/profile

Start and verify

zkServer.sh status
zkServer.sh start
zkServer.sh status

3. Install Kafka

Upload the compressed package and decompress it

tar -zxvf kafka_2.12-1.0.2.tgz -C /opt/baixw/servers/

Configuring Environment Variables

vim /etc/profile


#KAFKA
export KAFKA_HOME=/opt/baixw/servers/kafka_2.12-1.0.2
export PATH=$PATH:${KAFKA_HOME}/bin

Modifying the configuration file

cd /opt/baixw/servers/kafka_2.12-1.0.2/config
vi server.properties


log.dir=/opt/baixw/servers/kafka_2.12-1.0.2/kafka-logs
zookeeper.connect=localhost:2181/myKafka
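Pulling the edits together, the relevant server.properties lines are the two shown above; the broker.id line is the single-node default, included here only as context:

```properties
# Unique id of this broker (0 is the default for a single node)
broker.id=0
# Where Kafka stores partition data
log.dir=/opt/baixw/servers/kafka_2.12-1.0.2/kafka-logs
# ZooKeeper address, with a chroot so this Kafka's nodes live under /myKafka
zookeeper.connect=localhost:2181/myKafka
```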

Start Kafka (start ZooKeeper first)

kafka-server-start.sh -daemon /opt/baixw/servers/kafka_2.12-1.0.2/config/server.properties

4. Install Nginx and configure Nginx to integrate Kafka

Installing Nginx

Before installation, ensure that gcc, pcre-devel, zlib-devel, and openssl-devel are installed on the system:
yum -y install gcc pcre-devel zlib-devel openssl openssl-devel
Download Nginx
https://nginx.org/download/

Open the link above, select the version you want, right-click, and copy the download link.

Download Nginx:
## download Nginx
cd /opt/baixw/servers/softwore
wget https://nginx.org/download/nginx-1.9.9.tar.gz

Configure the Nginx–Kafka integration

Install the ngx_kafka_module plug-in so that Nginx can write data directly to Kafka.

1. Install git
	yum install -y git
2. Switch to the /opt/baixw/servers/ directory, then clone the Kafka C client (librdkafka) source code:
cd /opt/baixw/servers/
git clone https://github.com/edenhill/librdkafka
3. Enter the librdkafka directory and compile:
cd librdkafka 
yum install -y gcc gcc-c++ pcre-devel zlib-devel 
./configure 

make 
make install
4. Return to /opt/baixw/servers/ and clone the Nginx Kafka module source code:
cd /opt/baixw/servers/
git clone https://github.com/brg-liuwei/ngx_kafka_module
5. Configure Nginx
## decompression
tar -zxvf nginx-1.9.9.tar.gz -C /opt/baixw/servers/
# Go to the nginx directory
cd /opt/baixw/servers/nginx-1.9.9
## configuration
./configure --add-module=/opt/baixw/servers/ngx_kafka_module/

## compile and install
make
make install

Note: the compiled output does not stay in /opt/baixw/servers/nginx-1.9.9; the installed files are placed in /usr/local/nginx.

6. Modify the nginx configuration file. For details, see the nginx.conf file in the current directory
mv /usr/local/nginx/conf/nginx.conf /usr/local/nginx/conf/nginx-kafka.conf
vi /usr/local/nginx/conf/nginx-kafka.conf
#user nobody;
worker_processes  1;

#error_log logs/error.log;
#error_log logs/error.log notice;
#error_log logs/error.log info;

#pid logs/nginx.pid;


events {
    worker_connections  1024;
}


http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format main '$remote_addr - $remote_user [$time_local] "$request" '
    # '$status $body_bytes_sent "$http_referer" '
    # '"$http_user_agent" "$http_x_forwarded_for"';
    #access_log logs/access.log main;
    sendfile        on;
    #tcp_nopush on;
    #keepalive_timeout 0;
    keepalive_timeout  65;
    #gzip on;
   
    ## kafka configuration
    kafka;
    kafka_broker_list bigDate1:9092;
    ## kafka_broker_list specifies the Kafka cluster as ip|host:port
    ## each location below maps a URL to a topic

    server {
        listen       80;
        server_name  node-6.xiaoniu.com;
        #charset koi8-r;
        #access_log logs/host.access.log main;

        ## Listen for Kafka requests
        location = /kafka/access {
            kafka_topic access;
        }

        #error_page 404 /404.html;

        # redirect server error pages to the static page /50x.html
        #
        error_page   500 502 503 504  /50x.html;
        location = /50x.html {
            root html;
        }
    }
}
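Stripped of the standard Nginx boilerplate, the Kafka-specific part of this configuration comes down to three ngx_kafka_module directives (broker host bigDate1 as above):

```nginx
http {
    kafka;                              # enable the module
    kafka_broker_list bigDate1:9092;    # Kafka cluster ip|host:port list

    server {
        location = /kafka/access {
            kafka_topic access;         # request bodies POSTed here go to topic "access"
        }
    }
}
```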
7. Start Nginx
/usr/local/nginx/sbin/nginx -c /usr/local/nginx/conf/nginx-kafka.conf 

Error: the librdkafka.so.1 file cannot be found

error while loading shared libraries: librdkafka.so.1: cannot open shared object file: No such file or directory

The reason is that the newly compiled library has not yet been loaded by the dynamic linker.

8. Load the SO library
# Load /usr/local/lib on startup
echo "/usr/local/lib" >> /etc/ld.so.conf

#Manual loading
ldconfig
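The append-then-ldconfig pattern can be rehearsed against a scratch file first; the path below mirrors where `make install` put librdkafka, and the temporary file is purely illustrative:

```shell
# Rehearse the append step against a temporary file instead of /etc/ld.so.conf
conf=$(mktemp)
echo "/usr/local/lib" >> "$conf"
# Verify the exact line landed in the file
grep -qx "/usr/local/lib" "$conf" && echo "path registered"
rm -f "$conf"
```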
9. Test

Send test requests to the Nginx port and check whether a Kafka consumer can consume the data written by Nginx:

curl localhost/kafka/access -d "test nginx to kafka"
curl localhost/kafka/access -d "baixw111"