
TensorFlow Tutorial

Benedict Diederich
What is TensorFlow?

- Python-based (C++, Haskell, Java and Go APIs are also available)
- Open-source machine learning library
- Developed by Google
- Used for research and production in Google products (Translation, Image Recognition, etc.)
- Licensed under the Apache 2.0 open-source license
- Runs on multiple CPUs and GPUs
- Huge community (https://github.com/tensorflow/tensorflow)
- Code available on GitHub
- "Model Zoo" of pretrained networks available
What is TensorFlow?

1. Provides primitives for defining functions on tensors
2. Automatically computes their derivatives

Computes math on tensors (like NumPy, but with GPU support!)

What is a "Tensor"?
- multilinear maps from vector spaces to the real numbers
- examples: scalar, vector, matrix
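
A minimal sketch of these ranks as TensorFlow constants, assuming a TensorFlow 1.x environment (in TF 2.x the same calls live under tf.compat.v1):

import tensorflow as tf

scalar = tf.constant(3.0)                        # rank 0: a single number
vector = tf.constant([1.0, 2.0, 3.0])            # rank 1: shape (3,)
matrix = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # rank 2: shape (2, 2)

with tf.Session() as sess:
    print(sess.run([scalar, vector, matrix]))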
TensorFlow vs. Numpy

(Side-by-side code comparison on the slide: NumPy vs. TensorFlow)

TensorFlow requires explicit evaluation!
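
A rough sketch of that contrast, assuming TF 1.x graph mode (in TF 2.x eager execution, TensorFlow evaluates immediately, much like NumPy):

import numpy as np
import tensorflow as tf

# NumPy: evaluated immediately
a_np = np.ones((2, 2))
print(np.sum(a_np))           # -> 4.0

# TensorFlow 1.x: only builds graph nodes, no values yet
a_tf = tf.ones((2, 2))
s_tf = tf.reduce_sum(a_tf)
print(s_tf)                   # -> Tensor("Sum:0", shape=(), dtype=float32)

# values appear only when the graph is explicitly evaluated
with tf.Session() as sess:
    print(sess.run(s_tf))     # -> 4.0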
Tensorflow Computation Graph

• Express a numeric computation as a graph.
• Graph nodes are operations, which have any number of inputs and outputs.
• Graph edges are tensors, which flow between nodes.
Tensorflow Computation Graph - Session
• “A Session object encapsulates the environment in which Tensor objects are evaluated”

• “TensorFlow programs are usually structured into a construction phase, that assembles a graph,
and an execution phase that uses a session to execute ops in the graph.” - TensorFlow docs
(Slide annotation: a tf.constant never changes its value(s).)

Graph creation (construction phase)
- builds the graph sequentially
- adds nodes

Graph computation (execution phase)
- graph evaluation gives the result
Tensorflow Computation Graph
import tensorflow as tf

input1 = tf.constant(3.0)
input2 = tf.constant(2.0)
input3 = tf.constant(5.0)
intermed = tf.add(input2, input3)
mul = tf.multiply(intermed, input1)

with tf.Session() as sess:
    result = mul.eval()   # (2.0 + 5.0) * 3.0 = 21.0

Mathematical operations:
computations that will act on tensors
• MatMul: Multiply two matrix values
• Add: Add elementwise (with broadcasting)
• ReLU: Activate with elementwise rectified linear function
• etc.
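
A small illustration of these operations on toy tensors (TF 1.x style; the values are arbitrary):

import tensorflow as tf

a = tf.constant([[1.0, -2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])

matmul = tf.matmul(a, b)    # matrix product
added  = tf.add(a, b)       # elementwise add (with broadcasting)
relu   = tf.nn.relu(a)      # elementwise max(x, 0)

with tf.Session() as sess:
    print(sess.run([matmul, added, relu]))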
TensorFlow Variables

“When you train a model you use variables to hold and update
parameters. Variables are in-memory buffers containing tensors” - TensorFlow docs

(In the slide's example, what was previously a constant is now a variable.)

TensorFlow variables
- must be initialized before they have values!
- nodes which output their current value
- state is retained across multiple executions of a graph
  - e.g. parameters, gradient stores, eligibility traces, …
- value(s) can be updated
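
A minimal sketch of that lifecycle (declare, initialize, update), assuming the TF 1.x API; the counter variable is purely illustrative:

import tensorflow as tf

counter = tf.Variable(0, name="counter")      # in-memory buffer holding a tensor
increment = tf.assign(counter, counter + 1)   # op that updates the variable's value

init = tf.global_variables_initializer()      # variables must be initialized

with tf.Session() as sess:
    sess.run(init)
    for _ in range(3):
        print(sess.run(increment))            # 1, 2, 3: state persists across runs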
TensorFlow Placeholders and Feed Dictionaries

tf.placeholder variables
- dummy nodes that provide entry points for data into the computational graph
- value is fed in at execution time
- e.g. inputs, variable learning rates, …

feed_dict
- a Python dictionary mapping from tf.placeholder variables to data (NumPy arrays, lists, etc.)
TensorFlow Placeholders and Feed Dictionaries
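
The slide's code screenshot is not reproduced here; a comparable sketch, with an illustrative placeholder shape, might look like this (TF 1.x assumed):

import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 3))   # entry point for data
y = tf.reduce_sum(x, axis=1)                      # some computation on the fed data

with tf.Session() as sess:
    batch = np.ones((2, 3), dtype=np.float32)
    print(sess.run(y, feed_dict={x: batch}))      # -> [3. 3.]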
Google Example: Graph Creation
Google Example: Graph Evaluation
Example: Linear Regression in TensorFlow (code walkthrough across several slides)
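
The original code from these slides is not reproduced here; the following is a rough sketch of such a linear regression, fitting y = W*x + b on synthetic data by minimizing an L2 loss (TF 1.x API; data, names and hyperparameters are illustrative):

import numpy as np
import tensorflow as tf

# synthetic data: y = 2x + 1 plus a little noise
x_data = np.random.rand(100).astype(np.float32)
y_data = 2.0 * x_data + 1.0 + 0.05 * np.random.randn(100).astype(np.float32)

x = tf.placeholder(tf.float32, shape=(None,))
y = tf.placeholder(tf.float32, shape=(None,))

W = tf.Variable(0.0)
b = tf.Variable(0.0)
y_pred = W * x + b

loss = tf.reduce_mean(tf.square(y_pred - y))                     # L2 loss
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(200):
        sess.run(train_op, feed_dict={x: x_data, y: y_data})
    print(sess.run([W, b]))   # should approach [2.0, 1.0]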
Concept: Auto-Differentiation

• The linear regression example computed an L2 loss for a linear regression system.
• tf.train.Optimizer creates an optimizer:
  • tf.train.AdagradOptimizer, tf.train.AdadeltaOptimizer, tf.train.MomentumOptimizer,
    tf.train.FtrlOptimizer, etc.
• tf.train.Optimizer.minimize(loss, var_list) adds the optimization operation to the computation graph.
• Automatic differentiation computes gradients without user input.
• TensorFlow nodes in the computation graph have attached gradient operations.
• Backpropagation (using the node-specific gradient ops) computes the required gradients for all variables in the graph.
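
A small sketch of this idea using tf.gradients, which backpropagates through the graph's attached gradient ops without any user-supplied derivative (TF 1.x assumed; the function y = x^2 + 2x is only an example):

import tensorflow as tf

x = tf.Variable(3.0)
y = x * x + 2.0 * x            # y = x^2 + 2x

grad = tf.gradients(y, [x])    # dy/dx = 2x + 2, built automatically

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))      # -> [8.0] at x = 3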
Workflow for TensorFlow

1. Build a graph
   - the graph contains:
     - parameter specifications
     - model architecture
     - optimization process, …
2. Initialize a session
3. Fetch and feed data with Session.run
   - compilation, optimization, auto-differentiation, etc. happen here
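
A minimal end-to-end sketch of these three steps, fetching two outputs and feeding two placeholders in a single Session.run call (TF 1.x assumed; the ops are illustrative):

import tensorflow as tf

# 1. build a graph
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
summed = a + b
product = a * b

# 2. initialize a session
with tf.Session() as sess:
    # 3. fetch and feed with Session.run
    out_sum, out_prod = sess.run([summed, product], feed_dict={a: 3.0, b: 4.0})
    print(out_sum, out_prod)   # -> 7.0 12.0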
Sources

• Lecture slides CS224d: TensorFlow Tutorial, Bharath Ramsundar
• https://en.wikipedia.org/wiki/TensorFlow
• Talk: Introduction to TensorFlow -
  https://docs.google.com/presentation/d/1oB_U_JagxWQdQJlLD80XlNk6fuv42bHV0hfDAPGAbrc/edit#slide=id.gd2d423435_0_112
• tensorflow.org
• TensorFlow short intro presentation: bit.ly/stanford-tf-tutorial
• Great course: http://cs231n.github.io/
• Great course: https://de.udacity.com/course/deep-learning--ud730/
