Chapter 1 Basic concepts of TensorFlow


What are artificial intelligence, machine learning and deep learning

The term “artificial intelligence” is everywhere these days, but the concept was first proposed at the Dartmouth Conference in 1956. Because of the limited computing power of that era, however, the field saw no major breakthrough for decades. Only now, with the continuous development of computer hardware and the support of big data, has “artificial intelligence” experienced explosive growth.

At present, artificial intelligence is used in all aspects of life, from photo-beautification apps (such as Meitu) and speech-recognition software to self-driving cars. Artificial intelligence is not that far away; it has already begun to permeate our lives.

In the field of artificial intelligence, there are two further concepts:

1. Machine learning

2. Deep learning

This figure illustrates the difference between the two well: a machine learning algorithm relies on manually extracted features. Here’s an example:

When building a spam classifier, we first select and extract features by hand, such as the sender’s address or whether the title contains certain keywords, and the algorithm then learns a weight for each feature, so that it can predict whether a future mail is spam.
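The idea above can be sketched in a few lines of plain Python. This is a minimal illustration, not a real classifier: the feature names, weights, and threshold are all made up, and in practice the weights would be learned from labeled mails rather than hard-coded.

```python
def extract_features(mail):
    """Hand-crafted features: we humans decide what matters."""
    return [
        1.0 if mail["sender"].endswith(".promo.example") else 0.0,  # suspicious sender domain
        1.0 if "free" in mail["title"].lower() else 0.0,            # keyword in the title
        1.0 if "invoice" in mail["title"].lower() else 0.0,
    ]

def spam_score(mail, weights):
    """Weighted sum of the hand-crafted features; higher = more spam-like."""
    return sum(w * f for w, f in zip(weights, extract_features(mail)))

# These weights would normally be learned; hard-coded here to keep the sketch short.
weights = [2.0, 1.5, -0.5]
mail = {"sender": "deals@shop.promo.example", "title": "FREE gift inside"}
is_spam = spam_score(mail, weights) > 1.0
```

The key point is the division of labor: the human designs `extract_features`, and the learning algorithm only adjusts `weights`.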

For a deep learning algorithm, sticking with the spam example, we no longer extract the feature values by hand. We simply tell the computer “these mails are spam, these are not”, and the neural network extracts the features itself and dynamically adjusts their weights. The result is a model that predicts whether a future mail is spam.

This diagram illustrates the relationship between artificial intelligence, machine learning and deep learning.

What is a deep neural network

In fact, the “deep learning” we talk about today is essentially a synonym for deep neural networks. Deep neural networks began as a kind of bionic machine learning that tried to imitate the learning mechanism of the human brain, which is why they are often compared to the neurons in our bodies.

On the left is a human neuron, and on the right is a simple neural network structure.

We can see that the input values are weighted, summed, and passed through a threshold (activation) function to produce an output.

Just like our sense of touch: when we touch an apple, the tactile signal is the input, neurons process and transmit it, and we perceive (the output) that we are touching an apple.
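A single artificial neuron can be sketched directly from that description: a weighted sum of the inputs, pushed through a squashing function. The sketch below uses a sigmoid as the “threshold function” and made-up weights purely for illustration.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs, then a sigmoid
    squashing the result into the range (0, 1)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Two input signals, two made-up weights, one bias term.
out = neuron(inputs=[0.5, 0.8], weights=[1.0, -2.0], bias=0.3)
```

A whole neural network is just many of these units wired together, with the weights adjusted during training.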

Later, I will introduce the structure and meaning of the figure on the right in detail when introducing neural network algorithms. It doesn’t matter if you can’t understand it now.

To give you an intuitive feel for deep neural networks, I would like to recommend playing with the TensorFlow Playground provided by Google.

How can we play happily on the Playground

The Playground is an intuitive neural network training tool provided by TensorFlow. It helps us get a first feel for the flow of neural network training.

The diagram above gives an overview of the elements involved in the Playground:

Data types: We can choose different data types for training

Feature extraction: We can choose which features to feed in, and feel their impact on the training results

Current training step: We can see how many steps of training produced the current result

Learning rate: I will introduce the concept of the learning rate in more detail later, when we discuss how to design neural networks. Here we can adjust its value and get an intuitive feel for how the learning rate affects the result
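You can also feel the effect of the learning rate in a tiny sketch outside the Playground. Below, gradient descent minimizes the toy function f(x) = x² (whose gradient is 2x); the step counts and rates are arbitrary values chosen for illustration, not anything from the Playground itself.

```python
def descend(lr, steps=50, x=5.0):
    """Run `steps` iterations of gradient descent on f(x) = x**2."""
    for _ in range(steps):
        x = x - lr * 2 * x  # step against the gradient 2*x
    return x

slow = descend(lr=0.01)      # tiny learning rate: still far from the minimum at 0
fast = descend(lr=0.3)       # moderate learning rate: very close after the same steps
unstable = descend(lr=1.1)   # too large: each step overshoots and the value blows up
```

The same trade-off shows up in the Playground: too small and training crawls, too large and the loss curve oscillates or diverges.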

Activation function: Many practical problems cannot be solved with a linear function alone. An activation function “bends” the linear function into a nonlinear one, so that nonlinear problems can be solved. This concept will also be introduced later, when we study activation functions
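Why the “bending” matters can be seen in a small sketch: stacking linear layers without an activation is still just one linear function, while inserting a nonlinearity such as ReLU changes the shape of the function. The weights below are made up for illustration.

```python
def relu(z):
    """A common activation: zero for negative inputs, identity otherwise."""
    return max(0.0, z)

def two_linear_layers(x, w1=2.0, b1=-1.0, w2=3.0, b2=0.5):
    # w2*(w1*x + b1) + b2 collapses to the single line 6*x - 2.5:
    # no amount of stacking makes this nonlinear.
    return w2 * (w1 * x + b1) + b2

def with_activation(x, w1=2.0, b1=-1.0, w2=3.0, b2=0.5):
    # The ReLU bends the line: flat for small x, sloped afterwards.
    return w2 * relu(w1 * x + b1) + b2

linear_vals = [two_linear_layers(x) for x in (-1.0, 0.0, 1.0)]
bent_vals = [with_activation(x) for x in (-1.0, 0.0, 1.0)]
```

The purely linear stack produces a straight line through all three points, while the activated version is flat on the left and sloped on the right, which is exactly the extra expressiveness nonlinear problems need.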

Hidden layers: We can see that we currently have a very simple two-layer fully connected neural network (every node in one layer is connected to every node in the next). By clicking the buttons we can add or remove hidden layers and the nodes within them, and visually observe how the number of layers and nodes affects the result

Model results: This is where we can visually see the results of the model’s classification of data.

Test samples: What matters most in deep neural network learning is the ability to predict future data. So besides the training samples, we usually prepare test samples. These samples do not take part in the training process; they are “unknown” to the model and have never been learned. When we train a model, we therefore use the test samples to examine how well it judges unknown data, and to decide whether it is a “good model”.
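Several of the elements above, weighted connections, activation functions, and hidden layers, come together in the forward pass of a tiny fully connected network like the Playground’s. The sketch below is a hand-rolled illustration with made-up weights, not the Playground’s actual implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dense(inputs, weights, biases):
    """One fully connected layer: each row of `weights` is one node,
    connected to every input; each node applies the activation."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

x = [0.2, 0.9]                                # two input features
hidden = dense(x,
               weights=[[1.0, -1.0],          # hidden layer with 2 nodes;
                        [0.5, 0.5]],          # every node sees every input
               biases=[0.0, -0.1])
output = dense(hidden, weights=[[2.0, -2.0]], biases=[0.1])[0]
```

Training means nothing more than repeatedly nudging those weight and bias numbers so that `output` moves toward the correct label for each training sample.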

In the diagram above, we can see that we have a two-layer fully connected neural network, and after 206 training steps we have a very good model that can almost perfectly separate the blue dots from the yellow dots (in a practical problem we might regard the yellow dots as spam and the blue dots as normal mail). At this point, our model has real judgment ability on future samples.
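The “judge the model on dots it never saw” idea can be sketched with a toy stand-in for a trained model. Here a 1-nearest-neighbor rule plays the role of the model, and the blue/yellow points are invented coordinates; the point is only that accuracy is measured on the held-out test pairs, never on the training pairs.

```python
# Tiny made-up dataset: (x, y) coordinates with a blue/yellow label.
train = [((0.0, 0.0), "blue"), ((1.0, 1.0), "yellow"),
         ((0.1, 0.2), "blue"), ((0.9, 0.8), "yellow")]
test = [((0.2, 0.1), "blue"), ((0.8, 0.9), "yellow")]  # never seen in training

def predict(point):
    """Label of the closest training point (1-nearest-neighbor rule)."""
    def dist2(p):
        return (p[0] - point[0]) ** 2 + (p[1] - point[1]) ** 2
    return min(train, key=lambda item: dist2(item[0]))[1]

# Accuracy is computed only on the held-out test samples.
test_accuracy = sum(predict(p) == label for p, label in test) / len(test)
```

A model that separates the training dots but fails on the test dots has merely memorized, not learned, which is exactly what the test-sample panel in the Playground lets you spot.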

Summary

In this section, we gained a preliminary understanding of the relationship between artificial intelligence, machine learning, and deep learning, and saw that artificial intelligence has already been applied in many aspects of our lives, a major trend for the future.

Then, by playing with the TensorFlow Playground, we picked up some more intuitive concepts and understanding of deep learning, planting a small seed for our future study.

In the next section, I’ll show you how to install TensorFlow (but that’s not the point) and the three basic TensorFlow concepts:

1. Tensor

2. Session

3. Graph