Ariel University
Course number: 7061510-1
Lecturer: Dr. Amos Azaria
Edited by: Moshe Hanukoglu
Date: First Semester 2018-2019
Based on presentations by Dr. Amos Azaria
Installation, see: link
TensorFlow programs are built as a graph: you create nodes (constants, variables, and operations), connect them, and then run operations on them.
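The build-then-run model can be sketched in plain Python (an analogy only, not TensorFlow code): constructing a node just records the computation, and nothing is evaluated until a node is "run".

```python
# Plain-Python analogy of TensorFlow's build-then-run model:
# nodes are thunks, and nothing is computed until a node is called.
def constant(value):
    return lambda: value

def multiply(a, b):
    return lambda: a() * b()

a = constant(3)
b = constant(4)
c = multiply(a, b)   # builds the graph; nothing is computed yet
print(c())           # "running" the node evaluates it: 12
```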
import tensorflow as tf
Create two constant nodes.
a = tf.constant(3)
b = tf.constant(4)
When we print a node, the result is information about the node (its name, shape, and type), not its value.
a
c is a node representing the multiplication of nodes a and b.
c = a*b
c
When we want to display the value of the multiplication, we need to create a session and run the node in it.
sess = tf.Session()
sess.run(a)
sess.run(c)
A variable is a node like a constant, but, as its name suggests, a variable can change its value.
var1 = tf.Variable(3)
var2 = tf.Variable(4)
c2 = var1 * var2
print(var1)
print(c2)
Unlike constants, variables must be initialized before they are read; running a variable before initialization raises an error.
sess.run(tf.global_variables_initializer())
sess.run(var1)
sess.run(c2)
Next we create two nodes: one is a counter and the other is a constant holding the step size.
A third node holds the update operation we want to perform.
import tensorflow as tf
x = tf.Variable(1)
step = tf.constant(2)
update = tf.assign(x, x+step)
Each time we run sess.run(update), we increase the value of x by 2.
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(4):
    print(sess.run(update))
Display the value of x.
sess.run(x)
To reset x to its initial value, run the initializer again:
sess.run(tf.global_variables_initializer())
sess.run(x)
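The counter above can be mirrored in plain Python (an analogy, not TensorFlow): each pass through the loop plays the role of sess.run(update), and re-binding x to 1 plays the role of re-running the initializer.

```python
# Plain-Python analogue of the tf.assign counter above.
x = 1                      # like initializing the variable to 1
step = 2                   # like tf.constant(2)
values = []
for i in range(4):
    x = x + step           # like sess.run(update)
    values.append(x)
print(values)              # [3, 5, 7, 9]
x = 1                      # like re-running the initializer: x is back to 1
```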
import tensorflow as tf

x = tf.constant([7.01, 3.02, 4.99, 8.])
y_ = tf.constant([14.01, 6.01, 10., 16.04])
m = tf.Variable(0.) #note the dot
Define the model and the loss function:
y = m*x
loss = tf.reduce_mean(tf.pow(y - y_, 2))
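As a sanity check, the same loss can be computed by hand in plain Python: with the initial m = 0 every prediction is 0, so the loss is simply the mean of the squared targets.

```python
# Hand computation of the mean-squared-error loss (plain Python, not TF).
x  = [7.01, 3.02, 4.99, 8.0]
y_ = [14.01, 6.01, 10.0, 16.04]
m = 0.0
loss = sum((m * xi - yi) ** 2 for xi, yi in zip(x, y_)) / len(x)
print(loss)  # about 147.42
```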
Select the GradientDescentOptimizer optimization method; it is given the step size $(\alpha)$ and the loss function to minimize.
update = tf.train.GradientDescentOptimizer(0.0001).minimize(loss)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
Run 1000 epochs
for _ in range(0,1000):
    sess.run(update)
print(sess.run(m))
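What the optimizer does under the hood can be sketched by hand: the gradient of the loss with respect to m is mean(2*(m*x - y_)*x), and each update subtracts $\alpha$ times this gradient. The plain-Python loop below (an illustration, not TensorFlow) reproduces the same fit.

```python
# Hand-rolled gradient descent for y = m*x (plain Python, not TF).
x  = [7.01, 3.02, 4.99, 8.0]
y_ = [14.01, 6.01, 10.0, 16.04]
m, alpha = 0.0, 0.0001
for _ in range(1000):
    # d(loss)/dm = mean(2 * (m*x - y_) * x)
    grad = sum(2 * (m * xi - yi) * xi for xi, yi in zip(x, y_)) / len(x)
    m -= alpha * grad
print(m)  # close to 2.0, since y_ is roughly 2 * x
```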
If we do not want to fix the data when building the graph, but rather supply it at run time, we can use placeholder nodes.
A placeholder is given the type of the data and its dimensions (None means the dimension is determined by the data that is fed in).
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
...
for _ in range(0,1000):
    sess.run(update, feed_dict = {x:[[7.01], [3.02], [4.99], [8.]], y_:[[14.01], [6.01], [10.], [16.04]]})
We can use tf.train.Saver to save the variables (weights) obtained during the run.
This helps us resume training later.
A good practice is to save a checkpoint every X updates.
saver = tf.train.Saver()
saver.save(sess, filename)
saver.restore(sess, filename)
For more details, see: link
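As a rough analogy in plain Python (not TensorFlow's checkpoint format), the save-every-X-updates pattern looks like this, with pickle standing in for Saver:

```python
import pickle

# Plain-Python analogy of periodic checkpointing (not TF's format):
# save the trainable parameters every X updates so training can resume.
params = {"m": 0.0}
for i in range(1, 501):
    params["m"] += 0.001                  # stand-in for one training update
    if i % 100 == 0:                      # "every X updates"
        with open("checkpoint.pkl", "wb") as f:
            pickle.dump(params, f)        # like saver.save(sess, filename)

with open("checkpoint.pkl", "rb") as f:
    restored = pickle.load(f)             # like saver.restore(sess, filename)
print(restored["m"])
```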
We will demonstrate the use of these commands in one of the previous examples.
import tensorflow as tf
import numpy as np

features = 2
x = tf.placeholder(tf.float32, [None, features])
y_ = tf.placeholder(tf.float32, [None, 1])
W = tf.Variable(tf.zeros([features,1]))
b = tf.Variable(tf.zeros([1]))
data_x = np.array([[2,4],[3,9],[4,16],[6,36],[7,49]])
data_y = np.array([[70],[110],[165],[390],[550]])

y = tf.matmul(x,W) + b
loss = tf.reduce_mean(tf.pow(y - y_, 2))
update = tf.train.GradientDescentOptimizer(0.001).minimize(loss)
saver = tf.train.Saver()

with tf.Session() as sess:
    # initialize all of the variables in the session
    sess.run(tf.global_variables_initializer())
    for i in range(1000):
        sess.run(update, feed_dict={x: data_x, y_: data_y})
        if i % 100 == 0:
            print('Iteration:', i, ' W:', sess.run(W), ' b:', sess.run(b), ' loss:', loss.eval(session=sess, feed_dict={x: data_x, y_: data_y}))
    # Save the variables to a file
    saved_path = saver.save(sess, './saved_variable')
    print('model saved in {}'.format(saved_path))
with tf.Session() as sess:
    # Restore the saved variables
    saver.restore(sess, './saved_variable')
    # Print the loaded variables
    a_out, b_out = sess.run([W, b])
    print('W = ', a_out)
    print('b = ', b_out)
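The same two-feature regression can be re-implemented in plain Python (no TensorFlow) to check that gradient descent really drives the loss down on this data; the variable names mirror the example above.

```python
# Plain-Python re-implementation of the two-feature linear model above.
data_x = [[2, 4], [3, 9], [4, 16], [6, 36], [7, 49]]
data_y = [70, 110, 165, 390, 550]
W = [0.0, 0.0]
b = 0.0
lr = 0.001
n = len(data_x)

def loss(W, b):
    # mean squared error of y = W[0]*x1 + W[1]*x2 + b
    return sum((W[0]*x1 + W[1]*x2 + b - y) ** 2
               for (x1, x2), y in zip(data_x, data_y)) / n

loss0 = loss(W, b)
for _ in range(1000):
    # gradients of the mean squared error w.r.t. W[0], W[1], b
    gW0 = sum(2 * (W[0]*x1 + W[1]*x2 + b - y) * x1
              for (x1, x2), y in zip(data_x, data_y)) / n
    gW1 = sum(2 * (W[0]*x1 + W[1]*x2 + b - y) * x2
              for (x1, x2), y in zip(data_x, data_y)) / n
    gb  = sum(2 * (W[0]*x1 + W[1]*x2 + b - y)
              for (x1, x2), y in zip(data_x, data_y)) / n
    W = [W[0] - lr * gW0, W[1] - lr * gW1]
    b -= lr * gb
print(loss0, '->', loss(W, b))  # the loss drops by orders of magnitude
```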
TensorBoard can help visualize the built graph along with presenting different charts.
To create a graph file, call tf.summary.FileWriter.
The function gets a path in which to store the graph, and the session's graph:
tf.summary.FileWriter('./my_graph', sess.graph)
In order to view the graph in a browser, write at the terminal:
tensorboard --port=8008 --logdir ./my_graph/
In order to give a name to a node (shown in the graph), write:
m = tf.Variable(<data>, name = <graphName>)
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
m = tf.Variable(0.)
y = m*x
loss = tf.reduce_mean(tf.pow(y - y_, 2))
update = tf.train.GradientDescentOptimizer(0.0001).minimize(loss)
msum = tf.summary.scalar('msum', m)
losssum = tf.summary.scalar('losssum', loss)
merged = tf.summary.merge_all()
sess = tf.Session()
file_writer = tf.summary.FileWriter('./my_graph', sess.graph)
sess.run(tf.global_variables_initializer())
data_dict = {x:[[7.01], [3.02], [4.99], [8.]], y_:[[14.01], [6.01], [10.], [16.04]]}
for i in range(0,1000):
    [_, curr_summary] = sess.run([update, merged], feed_dict = data_dict)
    file_writer.add_summary(curr_summary, i)
file_writer.close()
print(sess.run(m))