Data Science: Asked by Ana Smile on August 13, 2020
I am trying to build a simple neural network with one independent and one dependent variable. Could you point me to a tutorial or help me with the implementation? So far I have the following code, but my predictions are poor even though the loss is minimized. Should I scale X and Y, or have I made a mistake somewhere?
Thank you in advance
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

# Toy data: x[i] = i^2 + 0.2 and y[i] = i, so x = y^2 + 0.2
x = [(i * i) + 0.2 for i in range(1000)]
y = [i for i in range(1000)]
x_train = np.reshape(x, (-1, 1))
y_train = np.reshape(y, (-1, 1))
x_test = x_train[:, -10:]
y_test = y_train[:, -10:]

plt.scatter(x_train, y_train)
plt.show()

X = tf.placeholder(tf.float32, [None, 1])
Y = tf.placeholder(tf.float32, [None, 1])

n_inputs = 1
n_hidden_1 = 20
n_hidden_2 = 20
n_outputs = 1

weights = {
    "h1": tf.Variable(tf.random_normal([n_inputs, n_hidden_1])),
    "h2": tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),
    "out": tf.Variable(tf.random_normal([n_hidden_2, n_outputs]))
}
biases = {
    "b1": tf.Variable(tf.random_normal([n_hidden_1])),
    "b2": tf.Variable(tf.random_normal([n_hidden_2])),
    "out": tf.Variable(tf.random_normal([n_outputs]))
}

def neural_net(x):
    # Two ReLU hidden layers followed by a linear output layer
    layer_1 = tf.add(tf.matmul(x, weights["h1"]), biases["b1"])
    layer_1 = tf.nn.relu(layer_1)
    layer_2 = tf.add(tf.matmul(layer_1, weights["h2"]), biases["b2"])
    layer_2 = tf.nn.relu(layer_2)
    layer_3 = tf.matmul(layer_2, weights["out"]) + biases["out"]
    return layer_3

Y_pred = neural_net(X)
loss = tf.losses.mean_squared_error(X, Y_pred)
optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss)

epochs = 1000
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for i in range(epochs):
        sess.run(train_op, feed_dict={X: x_train, Y: y_train})
        loss_op = sess.run(loss, feed_dict={X: x_train, Y: y_train})
        if i % 10 == 0:
            print("Epoch " + str(i) + " loss " + str(loss_op))
    pred = sess.run(Y_pred, feed_dict={X: x_test})
    plt.plot(pred, color="red")
    plt.plot(y_test, color="blue")
    plt.show()
    plt.scatter(pred, y_test)
    plt.show()
    for i in range(len(pred)):
        print(str(pred[i]) + " " + str(y_test[i]))
Your predictions are not actually that bad. Notice that your loss compares X with Y_pred (loss = tf.losses.mean_squared_error(X, Y_pred)), so the network is trained to reproduce its input rather than the target Y. At the very last line of your code, print the expected value too on each line (that is, x_test[i] = y_test[i]^2 + 0.2) and you will see that the predictions track it.
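For instance, a minimal sketch of that check, reusing pred, x_test, and y_test from the code above:

# Print prediction, target, and input side by side; because the loss was
# computed against X, pred should track x_test rather than y_test.
for i in range(len(pred)):
    print(str(pred[i]) + " " + str(y_test[i]) + " " + str(x_test[i]))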
Answered by serali on August 13, 2020
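Following up on that diagnosis, here is a minimal corrected sketch. The assumptions here go beyond the original answer: the loss compares Y with Y_pred, both variables are scaled to roughly [0, 1] before training, the test split takes the last 10 rows rather than columns, and tf.layers.dense replaces the hand-built weight dictionaries (same TF 1.x API as the question):

import tensorflow as tf
import numpy as np

# Same toy data as the question: x = y^2 + 0.2
x = np.reshape([(i * i) + 0.2 for i in range(1000)], (-1, 1)).astype(np.float32)
y = np.reshape([float(i) for i in range(1000)], (-1, 1)).astype(np.float32)

# Scale both variables to roughly [0, 1]
x_scale, y_scale = x.max(), y.max()
x_train, y_train = x / x_scale, y / y_scale
x_test, y_test = x_train[-10:], y_train[-10:]  # last 10 rows, not [:, -10:]

X = tf.placeholder(tf.float32, [None, 1])
Y = tf.placeholder(tf.float32, [None, 1])

hidden = tf.layers.dense(X, 20, activation=tf.nn.relu)
hidden = tf.layers.dense(hidden, 20, activation=tf.nn.relu)
Y_pred = tf.layers.dense(hidden, 1)

# Compare the prediction with the target Y, not with the input X
loss = tf.losses.mean_squared_error(Y, Y_pred)
train_op = tf.train.AdamOptimizer(learning_rate=0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(1000):
        _, l = sess.run([train_op, loss], feed_dict={X: x_train, Y: y_train})
        if i % 100 == 0:
            print("Epoch", i, "loss", l)
    # Undo the target scaling before comparing with the true y values
    pred = sess.run(Y_pred, feed_dict={X: x_test}) * y_scale
    for p, t in zip(pred, y_test * y_scale):
        print(p, t)

Scaling matters here because x spans roughly 0 to 10^6 while y only reaches 999; with random_normal initial weights, unscaled inputs produce enormous activations and the optimizer struggles to converge.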