Big Data Hands-On, Part 1: A Beginner's Guide

Setting up Ubuntu

1. Downloading the files

1.1 Downloading an Ubuntu Image file

Note: Version 16.04 or 18.04 is recommended

You can download it at pub.mirrors.aliyun.com

The download page opens; select the version as shown in the following figure, choosing ubuntu-x.x.x-desktop-amd64.iso

1.2 Downloading and Installing VMware

A quick search will turn up the installer. It is best to install it on a disk with plenty of free space; the other options can be left at their defaults. License keys can be found online.

1.3 Installing Ubuntu on VMware

Select the downloaded ISO file

This is just a practice setup, so keep the password simple; I used root

The virtual machine can be named Hadoop. It is best not to put it on the C drive; choose another disk with more space,

then just keep clicking Next

Then wait for the installation to finish.

After the first boot, just keep clicking Continue through the setup screens, select your city, set a simple password, and you are done.

1.4 Installing VMware Tools

In the VMware menu bar, choose VM → Install VMware Tools, then reboot the virtual machine.

2. Installing required software

2.1 Switching package sources

We need to switch to a faster package mirror first. Open Software & Updates, and under "Download from" choose the mirror you want, for example the Aliyun mirror.
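The same switch can also be done from the command line instead of the GUI. A minimal sketch, assuming the default archive.ubuntu.com entries and the Aliyun mirror:

```shell
# Back up the current source list, then point it at the Aliyun mirror
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
sudo sed -i 's|archive.ubuntu.com|mirrors.aliyun.com|g' /etc/apt/sources.list
sudo apt-get update    # refresh the package index from the new mirror
```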

2.2 install vim

sudo apt-get update
sudo apt-get install vim

2.3 Installing net-tools
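This section needs only one command. The net-tools package provides ifconfig, which is used later to check the IP address:

```shell
# net-tools provides ifconfig and related commands
sudo apt-get install net-tools
ifconfig    # should list the virtual machine's network adapters
```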

2.4 Installing JDK 1.8

2.4.1 Creating a directory

Create a directory to hold the JDK package
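A sketch of this step, assuming the archive was downloaded to the current directory (the target directory and archive name are examples; adjust them to your download):

```shell
# Create a directory for the JDK and unpack the downloaded archive into it
sudo mkdir -p /usr/local/java
sudo tar -zxvf jdk-8u*-linux-x64.tar.gz -C /usr/local/java
```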

2.4.2 Configuring the Environment

vim ~/.bashrc

Configure the environment like this
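The original shows the variables in a screenshot. A typical addition to the end of ~/.bashrc looks like this (the JAVA_HOME path is an assumption; match it to the directory you unpacked the JDK into):

```shell
# Append to ~/.bashrc (JAVA_HOME path is an example)
export JAVA_HOME=/usr/local/java/jdk1.8.0_281
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
```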

Save and exit the .bashrc file, then run the source command to make it take effect. If you can view the Java version with java -version from any directory, the JDK is configured successfully

2.5 Installing openssh-server

sudo apt-get install openssh-server 

3 Configure a static IP address

3.1 Setting bridged mode

VMware menu "Virtual Machine" → "Settings" → "Network Adapter" → "Bridged mode"

3.2 Configuring the virtual network

In the Virtual Network Editor, set which external network the virtual network connects to: VMware menu bar "Edit" → "Virtual Network Editor"

3.3 Modifying the Configuration File

3.3.1 interfaces file

sudo vim /etc/network/interfaces
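The original shows the file contents in a screenshot. A minimal static-IP sketch matching the addresses used later in this guide (the adapter name ens33 and the gateway are assumptions; check yours with ifconfig and your router or hotspot settings):

```text
# /etc/network/interfaces
auto lo
iface lo inet loopback

auto ens33
iface ens33 inet static
address 192.168.43.200
netmask 255.255.255.0
gateway 192.168.43.1
dns-nameservers 114.114.114.114
```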

3.3.2 DNS service file

sudo vim /etc/systemd/resolved.conf
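Again the original shows a screenshot. Inside resolved.conf, uncomment the DNS line under the [Resolve] section and set your servers (the addresses below are common public DNS examples):

```ini
# /etc/systemd/resolved.conf
[Resolve]
DNS=114.114.114.114 8.8.8.8
```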

3.4 Restarting the network service

sudo /etc/init.d/networking restart

Done. Test by pinging www.baidu.com; it works. Also check with ifconfig that the IP address has changed.

4 Installing IDEA

Find it in the JetBrains download center and install it. If you can run a Hello World program, everything is fine.

5 Configure SSH login without password

5.1 Verifying that the SSH service is installed and started

Use

dpkg -l | grep ssh
ps -e | grep ssh

to verify that the SSH service is installed and running.

Then, in the user's home directory, enter the command

ssh-keygen -t rsa

No passphrase is needed here, so just press Enter at every prompt

Id_rsa is the private key, and id_rsa.pub is the public key

Create an authorized_keys file to store the public key of the remote no-login machine, and then add the local public key to the authorized_keys file to realize the local no-login. Finally, grant valid permission to the authorized_keys file.
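A sketch of those three steps, assuming the key pair from the previous step is already in ~/.ssh:

```shell
cd ~/.ssh
# Append the local public key to authorized_keys (creates the file if absent)
cat id_rsa.pub >> authorized_keys
# Grant valid permissions: authorized_keys must not be group/world writable
chmod 600 authorized_keys
```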

Finally, run

ssh localhost

Login without password is successful.

6 Set up a Hadoop cluster

6.1 Cloning

Follow the wizard to complete the cloning. It is recommended to name the clones Hadoop1 and Hadoop2.

Then change each clone's IP address as required, to prevent conflicts caused by duplicate addresses.

Mine are:

Hadoop (master) 192.168.43.200

Hadoop-Clone1 192.168.43.201

Hadoop-Clone2 192.168.43.202
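For the machines to reach each other by name, each machine's /etc/hosts should map all three addresses. A sketch using the addresses above (the hostnames are examples; use each machine's actual hostname, e.g. my master node is 031904102):

```text
192.168.43.200  031904102
192.168.43.201  Hadoop-Clone1
192.168.43.202  Hadoop-Clone2
```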

6.2 Using xftp6

Using xftp6, copy the public key file id_rsa.pub of each machine to the other two machines separately.

In addition, perform the SSH login without password in Step 5 for the two cloned machines.

6.3 Appending the copied public keys to authorized_keys on the other two machines
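If xftp6 is not available, ssh-copy-id does the copy-and-append in one step. A sketch, run from each clone toward the master (the user and IP follow the examples above; adjust to yours):

```shell
# Push this machine's public key into the master's authorized_keys
ssh-copy-id root@192.168.43.200
```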

6.4 Checking whether SSH login works

Note that ssh is followed by your hostname. Mine is 031904102, so the command is ssh 031904102 (031904102 is the hostname of my master node).

The Hadoop cluster is set up successfully!