Goal

This article introduces TensorFlow basics through a practical example. After working through it, you should be familiar with the basic operations of TensorFlow.

Simple recurrent neural network

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets('MNIST', one_hot=True)

batch_size = 64
n_batches = mnist.train.num_examples // batch_size
n_classes = 10
steps = 28           # 28 time steps: one image row per step
embedding_size = 28  # input dimension per step (image columns)
hidden_size = 100    # RNN hidden-state dimension (value garbled in the source; 100 is a reasonable choice)

x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])
weights = tf.Variable(tf.random_normal([hidden_size, n_classes]))
biases = tf.Variable(tf.zeros([n_classes]))

def RNN(x, w, b):
    # reshape each flat image into a sequence of rows, i.e. a timing sequence
    inputs = tf.reshape(x, shape=[-1, steps, embedding_size])
    cell = tf.contrib.rnn.BasicRNNCell(hidden_size)
    # final_state: [batch_size, hidden_size], the hidden state after the last step
    _, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
    result = tf.nn.softmax(tf.matmul(final_state, w) + b)
    return result

predict = RNN(x, weights, biases)
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=predict))  # loss
opt = tf.train.AdamOptimizer(0.001).minimize(loss)  # define optimizer
correct = tf.equal(tf.argmax(y, 1), tf.argmax(predict, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))  # accuracy

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    total_batch = 0
    last_batch = 0
    best = 0
    for epoch in range(100):
        for _ in range(n_batches):
            xx, yy = mnist.train.next_batch(batch_size)
            sess.run(opt, {x: xx, y: yy})
        acc, l = sess.run([accuracy, loss],
                          {x: mnist.test.images, y: mnist.test.labels})
        if acc > best:
            best = acc
            last_batch = total_batch
        print('epoch:%d, acc:%f, loss:%f' % (epoch, acc, l))
        if total_batch - last_batch > 5:
            print('early stop')
            break
        total_batch += 1
```
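One detail worth noting: `RNN` above returns softmax probabilities, yet they are fed to `tf.nn.softmax_cross_entropy_with_logits`, which applies softmax internally and expects raw, unnormalized logits, so softmax ends up applied twice. Training still works (the loss plateauing near 1.5 in the output below, rather than near 0, is consistent with this), but the numerically cleaner pattern is to keep the raw logits for the loss and apply softmax only for prediction. A minimal sketch of that variant (`RNN_logits` is an illustrative name, not from the original):

```python
def RNN_logits(x, w, b):
    inputs = tf.reshape(x, shape=[-1, steps, embedding_size])
    cell = tf.contrib.rnn.BasicRNNCell(hidden_size)
    _, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
    return tf.matmul(final_state, w) + b  # raw logits, no softmax here

logits = RNN_logits(x, weights, biases)
predict = tf.nn.softmax(logits)  # probabilities, used only for argmax/accuracy
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
```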

Output

```
epoch:0, acc:0.878200, loss:1.589309
epoch:1, acc:0.907200, loss:1.556449
epoch:2, acc:0.917600, loss:1.546643
epoch:4, acc:0.933500, loss:1.528619
epoch:5, acc:0.950100, loss:1.512312
epoch:6, acc:0.951100, loss:1.511004
early stop
```

Point 1

The basics of recurrent neural networks and their variants are well covered online, and there is plenty of learning material; you can also refer to my earlier article: juejin.cn/post/697234… As a quick refresher, the sketch below shows what the RNN cell actually computes.
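The computation inside `BasicRNNCell` is a single tanh recurrence applied once per time step: the new hidden state is tanh of a linear function of the current input and the previous hidden state. A minimal NumPy sketch of one forward pass (variable names are illustrative, not TensorFlow's):

```python
import numpy as np

def simple_rnn_forward(xs, W_xh, W_hh, b_h):
    """xs: input sequence of shape [steps, embedding_size].
    Returns the final hidden state, analogous to dynamic_rnn's final_state."""
    h = np.zeros(W_hh.shape[0])        # initial hidden state h_0 = 0
    for x_t in xs:                     # one step per image row
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    return h

# toy dimensions matching the MNIST setup above
steps, embedding_size, hidden_size = 28, 28, 100
rng = np.random.default_rng(0)
h_final = simple_rnn_forward(
    rng.normal(size=(steps, embedding_size)),
    0.1 * rng.normal(size=(hidden_size, embedding_size)),
    0.1 * rng.normal(size=(hidden_size, hidden_size)),
    np.zeros(hidden_size))
print(h_final.shape)  # (100,)
```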

Point 2

Compared with the 97.8% accuracy of the earlier multi-hidden-layer fully connected network with Dropout, this recurrent network reaches only about 95%, and it also falls short of a convolutional network. This suggests that CNNs have an inherent advantage in image processing, while recurrent networks are better suited to sequential data such as text.
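Before concluding that recurrence itself is the bottleneck, it is worth trying a gated variant: swapping the plain tanh cell for an LSTM is nearly a one-line change to the code above. A sketch under that assumption (whether it narrows the gap on MNIST is left to experiment):

```python
def RNN_lstm(x, w, b):
    inputs = tf.reshape(x, shape=[-1, steps, embedding_size])
    # gated cell in place of the plain tanh cell; tf.contrib.rnn.GRUCell works similarly
    cell = tf.contrib.rnn.BasicLSTMCell(hidden_size)
    _, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
    # an LSTM's final_state is a (c, h) tuple; classify from the hidden output h
    return tf.nn.softmax(tf.matmul(final_state.h, w) + b)
```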

References

Reference: blog.csdn.net/qq_19672707…