DNN linear fitting

PaddlePaddle introduction

  • PaddlePaddle is an open-source deep learning framework from Baidu that helps developers and enterprises turn their AI ideas into reality safely and quickly
  • The project team brings together top deep learning scientists from around the world and is committed to providing the best deep learning R&D experience for developers and enterprises
  • The framework is easy to learn, easy to use, secure, and efficient, making it a deep learning tool particularly well suited to Chinese developers and enterprises

Code

# Load the libraries
import paddle.fluid as fluid
import numpy
# define data
train_data=numpy.array([[1.0], [2.0], [3.0], [4.0]]).astype('float32')
y_true = numpy.array([[2.0], [4.0], [6.0], [8.0]]).astype('float32')
# Define the network
x = fluid.layers.data(name="x",shape=[1],dtype='float32')
y = fluid.layers.data(name="y",shape=[1],dtype='float32')

# Network: one hidden fully connected layer (2 units, ReLU) and a 1-unit linear output layer
l1 = fluid.layers.fc(input=x, size=2, act="relu")
y_predict = fluid.layers.fc(input=l1, size=1, act=None)
# Define the loss function: mean squared error, avg_cost = mean((y_predict - y)^2)
avg_cost = fluid.layers.mean(fluid.layers.square_error_cost(input=y_predict, label=y))
# Define the optimizer (Adam, learning rate 0.01)
optimizer = fluid.optimizer.Adam(learning_rate=0.01)
optimizer.minimize(avg_cost)
# Parameter initialization
cpu = fluid.core.CPUPlace()
exe = fluid.Executor(cpu)
exe.run(fluid.default_startup_program())
# Start training, 2000 iterations
for i in range(1, 2001):
    outs = exe.run(
        feed={'x':train_data,'y':y_true},
        fetch_list=[y_predict.name,avg_cost.name])
    if i % 100 == 0:  # print the loss every 100 steps
        print(i," steps Loss is",outs[1])

# Observe the final predictions on the training data
print("Final prediction:\n", outs[0])
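
The trained parameters stay in the executor's scope, so the same session can be reused for a quick prediction on unseen input. The snippet below is a minimal sketch, not part of the original script; the names new_x, dummy_y, and pred are illustrative, and a placeholder label is fed because the default program still contains the loss ops that read y.

# Minimal inference sketch (assumption: run right after the training loop above)
new_x = numpy.array([[5.0]]).astype('float32')    # unseen input
dummy_y = numpy.array([[0.0]]).astype('float32')  # placeholder label required by the loss ops
pred = exe.run(
    feed={'x': new_x, 'y': dummy_y},
    fetch_list=[y_predict.name])
print("Prediction for x=5.0:", pred[0])           # should be close to 10.0, since y = 2x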

Output

(paddle) C:\Files\DATAs\prjs\python\paddle\demo>C:/Files/APPs/RuanJian/Miniconda3/envs/paddle/python.exe C:/Files/DATAs/prjs/python/paddle/demo/liner.py
100  steps Loss is [19.995567]
200  steps Loss is [1.1098802]
300  steps Loss is [0.4495614]
400  steps Loss is [0.31467533]
500  steps Loss is [0.1992905]
600  steps Loss is [0.11252441]
700  steps Loss is [0.05591184]
800  steps Loss is [0.02425095]
900  steps Loss is [0.00916326]
1000  steps Loss is [0.00302502]
1100  steps Loss is [0.0008769]
1200  steps Loss is [0.00022424]
1300  steps Loss is [5.0713417e-05]
1400  steps Loss is [1.0143418e-05]
1500  steps Loss is [1.7896114e-06]
1600  steps Loss is [2.7729777e-07]
1700  steps Loss is [3.7570317e-08]
1800  steps Loss is [4.49603e-09]
1900  steps Loss is [4.896634e-10]
2000  steps Loss is [6.7430506e-11]
Final prediction:
 [[2.0000126]
 [4.0000057]
 [5.999998 ]
 [7.9999914]]
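
For completeness, the learned parameters can also be read back from the global scope once training has finished. This is a hedged sketch, not part of the original script: it assumes fluid's default auto-generated parameter names (fc_0.w_0 and fc_0.b_0 for the hidden layer); if explicit param_attr names were set, these names would differ.

# Hedged sketch: inspect the weights of the hidden fc layer after training
w0 = numpy.array(fluid.global_scope().find_var("fc_0.w_0").get_tensor())
b0 = numpy.array(fluid.global_scope().find_var("fc_0.b_0").get_tensor())
print("hidden layer weights:\n", w0, "\nhidden layer bias:", b0)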