contrib.learn

preface

This article follows the official documentation to classify the Iris dataset, using tf.contrib.learn to quickly build a deep neural network classifier.

steps

  1. Import the CSV data
  2. Build a neural network classifier
  3. Train the network
  4. Evaluate accuracy on the test set
  5. Classify new samples

data

The Iris dataset contains 150 rows of data covering three different Iris species. Each row gives four feature values and one class label. The data has already been split into a training set and a test set.

Building the network

1. Import TensorFlow and NumPy first

    
    from __future__ import absolute_import
    from __future__ import division
    from __future__ import print_function
    import tensorflow as tf
    import numpy as np

2. Import the data

    
    # Define the data file paths
    IRIS_TRAINING = "iris_training.csv"
    IRIS_TEST = "iris_test.csv"
    # Load the data
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TRAINING,
        target_dtype=np.int,
        features_dtype=np.float32)
    test_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TEST,
        target_dtype=np.int,
        features_dtype=np.float32)

load_csv_with_header() takes three arguments:

  • filename, the path to the data file
  • target_dtype, the numpy dtype of the target values (the Iris targets are 0, 1, 2, so np.int)
  • features_dtype, the numpy dtype of the feature values
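For intuition, here is a rough sketch of what such a loader does, assuming the first CSV line is a metadata header and each data row ends with the target value (the helper name and toy data below are illustrative, not the actual tf.contrib implementation):

```python
import io
import numpy as np

def load_csv_with_header_sketch(f, target_dtype, features_dtype):
    """Rough sketch of a header-then-rows CSV loader (assumption:
    the first line is metadata, each data row ends with the target)."""
    header = f.readline()  # e.g. "120,4,setosa,versicolor,virginica"
    data, target = [], []
    for line in f:
        *features, label = line.strip().split(",")
        data.append([features_dtype(v) for v in features])
        target.append(target_dtype(label))
    return (np.array(data, dtype=features_dtype),
            np.array(target, dtype=target_dtype))

# Two fake rows in the same layout as the Iris CSV files
csv = io.StringIO("2,4,setosa,versicolor,virginica\n"
                  "6.4,2.8,5.6,2.2,2\n"
                  "5.0,2.3,3.3,1.0,1\n")
data, target = load_csv_with_header_sketch(csv, int, np.float32)
print(data.shape, target)  # (2, 4) [2 1]
```

The returned pair mirrors the `training_set.data` / `training_set.target` attributes used below.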

3. Set up the network structure

    
    # Each row of data has four features, all real-valued
    feature_columns = [tf.contrib.layers.real_valued_column("", dimension=4)]
    # 3-layer DNN for a 3-class problem
    classifier = tf.contrib.learn.DNNClassifier(feature_columns=feature_columns,
                                                hidden_units=[10, 20, 10],
                                                n_classes=3,
                                                model_dir="iris_model")

Parameter interpretation:

  • feature_columns, the feature columns defined above
  • hidden_units=[10, 20, 10], three hidden layers with 10, 20, and 10 neurons respectively
  • n_classes, the number of classes
  • model_dir, the directory where the model is saved
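These settings fully determine the weight shapes. As a sanity check, the parameter count of a 4-10-20-10-3 fully connected network can be computed directly (this assumes plain dense layers with biases and ignores any bookkeeping variables the estimator may add):

```python
# Layer widths: 4 input features, hidden_units=[10, 20, 10], 3 output classes
layer_sizes = [4, 10, 20, 10, 3]

# Each dense layer contributes in*out weights plus out biases
params = sum(i * o + o for i, o in zip(layer_sizes, layer_sizes[1:]))
print(params)  # 513
```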

4. Train the network

    
    classifier.fit(x=training_set.data, y=training_set.target, steps=2000)

steps is the number of training steps.
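fit runs one parameter update per step. As a rough illustration (not the DNNClassifier internals), here is a softmax-regression training loop on toy data where the steps count plays the same role:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for training_set: 30 samples, 4 features, 3 classes
x = rng.normal(size=(30, 4))
y = rng.integers(0, 3, size=30)

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def loss(w, b):
    # Mean cross-entropy of the true class
    p = softmax(x @ w + b)
    return -np.log(p[np.arange(len(y)), y]).mean()

w = np.zeros((4, 3))
b = np.zeros(3)
loss_before = loss(w, b)
for step in range(2000):                 # same role as steps=2000 in fit()
    grad = softmax(x @ w + b)
    grad[np.arange(len(y)), y] -= 1.0    # gradient of cross-entropy w.r.t. logits
    w -= 0.1 * x.T @ grad / len(y)
    b -= 0.1 * grad.mean(axis=0)
loss_after = loss(w, b)
print(loss_before, '->', loss_after)
```

More steps means more updates, at the cost of longer training and possible overfitting.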

5. Compute the accuracy

    
    accuracy_score = classifier.evaluate(x=test_set.data, y=test_set.target)["accuracy"]
    print('Accuracy: {0:f}'.format(accuracy_score))

The output is:

    
    Accuracy: 0.966667
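The accuracy metric is simply the fraction of test samples classified correctly; 0.966667 matches 29 correct out of 30. A quick illustration with hypothetical predictions and labels:

```python
import numpy as np

# Hypothetical labels for a 30-sample test set
labels = np.array([0] * 10 + [1] * 10 + [2] * 10)
predictions = labels.copy()
predictions[5] = 1                     # one wrong prediction

# "accuracy" is just the fraction of correct predictions
accuracy = (predictions == labels).mean()
print('Accuracy: {0:f}'.format(accuracy))  # Accuracy: 0.966667
```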

6. Classify new samples

    
    # Classify two new flower samples.
    new_samples = np.array(
        [[6.4, 3.2, 4.5, 1.5], [5.8, 3.1, 5.0, 1.7]], dtype=float)
    y = list(classifier.predict(new_samples, as_iterable=True))
    print('Predictions: {}'.format(str(y)))

The output is:

    
    Predictions: [1 2]

The complete code:

    
    from __future__ import absolute_import
    from __future__ import division
    from __future__ import print_function
    import tensorflow as tf
    import numpy as np

    IRIS_TRAINING = "iris_training.csv"
    IRIS_TEST = "iris_test.csv"

    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TRAINING,
        target_dtype=np.int,
        features_dtype=np.float32)
    test_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TEST,
        target_dtype=np.int,
        features_dtype=np.float32)

    feature_columns = [tf.contrib.layers.real_valued_column("", dimension=4)]
    classifier = tf.contrib.learn.DNNClassifier(feature_columns=feature_columns,
                                                hidden_units=[10, 20, 10],
                                                n_classes=3,
                                                model_dir="iris_model")

    classifier.fit(x=training_set.data,
                   y=training_set.target,
                   steps=2000)

    accuracy_score = classifier.evaluate(x=test_set.data,
                                         y=test_set.target)["accuracy"]
    print('Accuracy: {0:f}'.format(accuracy_score))

    new_samples = np.array(
        [[6.4, 3.2, 4.5, 1.5], [5.8, 3.1, 5.0, 1.7]], dtype=float)
    y = list(classifier.predict(new_samples, as_iterable=True))
    print('Predictions: {}'.format(str(y)))
