Computational Graphs in Deep Learning (Unit 4: Deep Learning)
A computational graph involves two kinds of computation:
1. Forward computation
2. Backward computation
A variable is represented by a node in a graph. It could be a scalar, vector, matrix, tensor, or even
another type of variable.
A function argument and data dependency are both represented by an edge. These are similar to
node pointers.
An operation is a simple function of one or more variables. Only a fixed set of operations is permitted; functions more complex than the operations in this set are represented by combining multiple operations, as in the example below.
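For instance (an illustrative example, not taken from the original figure), the function f(a, b, c) = (a + b) * (b - c) is more complex than any single permitted operation, but it decomposes into three primitive operations: d = a + b (addition), e = b - c (subtraction), and f = d * e (multiplication).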
Here, we have three operations: addition, subtraction, and multiplication. To create a computational graph, we create nodes, each holding an operation along with its input variables. The direction of each arrow shows which node's output feeds into which other node.
We can find the final output value by initializing the input variables and then computing the nodes of the graph in order, as the sketch below demonstrates.
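Below is a minimal sketch of such a graph in Python. It assumes the example function f = (a + b) * (b - c) introduced above; the Node/Variable design is an illustrative choice, not a fixed API.

```python
# A minimal computational graph with forward evaluation.
# The function f = (a + b) * (b - c) is an assumed example.

class Node:
    """A graph node: an operation applied to the outputs of its inputs."""
    def __init__(self, op, *inputs):
        self.op = op          # function implementing the operation
        self.inputs = inputs  # edges: the nodes this node depends on
        self.value = None     # filled in during the forward pass

    def forward(self):
        # Recursively evaluate dependencies, then apply this node's op.
        args = [node.forward() for node in self.inputs]
        self.value = self.op(*args)
        return self.value

class Variable(Node):
    """A leaf node holding an input value (a scalar, for simplicity)."""
    def __init__(self, value):
        super().__init__(op=None)
        self.value = value

    def forward(self):
        return self.value

# Build the graph for f = (a + b) * (b - c)
a, b, c = Variable(2.0), Variable(3.0), Variable(1.0)
d = Node(lambda x, y: x + y, a, b)   # addition
e = Node(lambda x, y: x - y, b, c)   # subtraction
f = Node(lambda x, y: x * y, d, e)   # multiplication

print(f.forward())  # (2 + 3) * (3 - 1) = 10.0
```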
The computations of a neural network are organized as a forward pass (forward propagation), in which we compute the output of the network, followed by a backward pass (backward propagation), which we use to compute gradients and derivatives. Computational graphs explain why the computation is organized this way.
To understand derivatives in a computational graph, the key is to understand how a change in one variable affects the variables that depend on it. If a directly affects c, then we want to know how sensitive c is to a: if we make a slight change in the value of a, how does c change? This quantity is the partial derivative of c with respect to a.
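A quick way to make this concrete is a finite-difference check: nudge one input by a small amount and watch how the output moves. The sketch below reuses the assumed example f = (a + b) * (b - c); the inputs and step size are illustrative choices.

```python
def f(a, b, c):
    return (a + b) * (b - c)  # assumed example function

# Approximate the partial derivative of f with respect to a at (2, 3, 1):
# perturb a by a small h and measure how much f changes.
h = 1e-6
df_da = (f(2 + h, 3, 1) - f(2, 3, 1)) / h
print(df_da)  # ~2.0, matching the analytic result df/da = b - c = 2
```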
For backpropagation, we traverse the same graph in reverse, propagating derivatives from the final output back toward the inputs.
We have to follow the chain rule to evaluate the partial derivatives of the final output variable with respect to the input variables a, b, and c.
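Worked out for the assumed example f = (a + b) * (b - c), with intermediates d = a + b and e = b - c, the chain rule gives:

\[
\frac{\partial f}{\partial d} = e, \qquad \frac{\partial f}{\partial e} = d
\]
\[
\frac{\partial f}{\partial a} = \frac{\partial f}{\partial d}\,\frac{\partial d}{\partial a} = e = b - c
\]
\[
\frac{\partial f}{\partial b} = \frac{\partial f}{\partial d}\,\frac{\partial d}{\partial b} + \frac{\partial f}{\partial e}\,\frac{\partial e}{\partial b} = e + d
\]
\[
\frac{\partial f}{\partial c} = \frac{\partial f}{\partial e}\,\frac{\partial e}{\partial c} = -d = -(a + b)
\]

Note that b appears on two paths through the graph (through both d and e), so its derivative sums the contributions of both paths.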
This gives us an idea of how computational graphs make it easier to compute derivatives using backpropagation.
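To close the loop, here is a minimal backward pass for the same assumed example, applying the chain-rule results above by hand; a deep learning framework automates exactly this bookkeeping.

```python
# Forward pass: compute each node of f = (a + b) * (b - c) in order.
a, b, c = 2.0, 3.0, 1.0
d = a + b          # addition node
e = b - c          # subtraction node
f = d * e          # multiplication node (final output)

# Backward pass: walk the graph in reverse, applying the chain rule.
df_df = 1.0                          # derivative of the output w.r.t. itself
df_dd = e * df_df                    # f = d * e  =>  df/dd = e
df_de = d * df_df                    # f = d * e  =>  df/de = d
df_da = df_dd * 1.0                  # d = a + b  =>  dd/da = 1
df_db = df_dd * 1.0 + df_de * 1.0    # b feeds both d and e, so sum the paths
df_dc = df_de * -1.0                 # e = b - c  =>  de/dc = -1

print(f, df_da, df_db, df_dc)  # 10.0, 2.0, 7.0, -5.0
```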