Neural Network Tutorial

C++ Neural Network Backpropagation tutorial from scratch.

In this playlist, I teach the neural network architecture and the learning process that enables the ANN to learn from a dataset.

The tutorials are divided into the parts of the neural network, and we code each part in C++ in Visual Studio 2017. Once you have completed the tutorial, you will be able to design and optimize your own neural network.

  1. Introduction
  2. Neuron & Layer
  3. Input & Output Layer
  4. Hidden Layer
  5. Neural network
  6. Training
  7. FeedForward & Backpropagation


Download theory paper: Backpropagation

C++ source code uploaded to GitHub: Neural Network Tutorial

Theory

DEFINITION

w_{i j}^{k}: weight from incoming node i to node j in layer l_k.
b_{i}^{k}: bias for node i in layer l_k.
a_{i}^{k}: weighted input sum plus bias (pre-activation) for node i in layer l_k.
o_i^k: output for node i in layer l_k.
n_k: number of nodes in layer l_k.

g: activation function for the hidden layer nodes.
g_o: activation function for the output layer nodes.

FEED-FORWARD

    \[ a_i^k=b_i^k+\sum_{j=1}^{n_{k-1}}{w^k_{ji}o_j^{k-1}} \]

    \[ o_i^k=g(a_{i}^{k}) \]
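The feed-forward step above can be sketched in C++ as follows. This assumes a sigmoid activation for g (the tutorial may use a different one), and the function names and the `w[j][i]` weight layout are illustrative, not the repository's actual API:

```cpp
#include <cmath>
#include <vector>

// Sigmoid as the activation g -- an assumption; swap in the tutorial's choice.
double g(double a) { return 1.0 / (1.0 + std::exp(-a)); }

// One feed-forward step for layer k:
//   a_i^k = b_i^k + sum_j w_{ji}^k * o_j^{k-1},   o_i^k = g(a_i^k).
// w[j][i] stores w_{ji}^k (row j = incoming node, column i = node in layer k).
std::vector<double> feedForward(const std::vector<double>& prevOut,
                                const std::vector<std::vector<double>>& w,
                                const std::vector<double>& bias) {
    std::vector<double> out(bias.size());
    for (std::size_t i = 0; i < bias.size(); ++i) {
        double a = bias[i];                    // start from the bias b_i^k
        for (std::size_t j = 0; j < prevOut.size(); ++j)
            a += w[j][i] * prevOut[j];         // accumulate w_{ji}^k * o_j^{k-1}
        out[i] = g(a);                         // apply the activation
    }
    return out;
}
```

Calling this once per layer, starting from the input vector, produces the network's output.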

COST FUNCTION

For a single output node with target value y, the squared-error cost is

    \[ E=\frac{1}{2}(y-o_i^k)^2 \]
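As a quick sanity check of the cost formula, a minimal sketch (the function name is illustrative):

```cpp
// Squared-error cost for a single output node: E = 1/2 * (y - o)^2.
double cost(double y, double o) {
    double diff = y - o;
    return 0.5 * diff * diff;
}
```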

BACKPROPAGATION

    \[ \frac{\partial E}{\partial w_{i j}^{k}} = \frac{\partial E}{\partial o_{j}^{k}} \frac{\partial o_{j}^{k}}{\partial a_{j}^{k}} \frac{\partial a_{j}^{k}}{\partial w_{i j}^{k}} \]

    \[ \delta_j^k \equiv \frac{\partial E}{\partial a_{j}^{k}} = \frac{\partial E}{\partial o_{j}^{k}} \frac{\partial o_{j}^{k}}{\partial a_{j}^{k}} \]

    \[ \frac{\partial a_{j}^{k}}{\partial w_{i j}^{k}} = \frac{\partial}{\partial w_{i j}^{k}}\left(b_j^k+\sum_{m=1}^{n_{k-1}}{w^k_{m j}o_m^{k-1}}\right) = o_i^{k-1} \]

    \[ \frac{\partial E}{\partial w_{i j}^{k}} = \delta^k_j o_i^{k-1} \]

OUTPUT LAYER

    \[ \delta_j^k = \frac{\partial E}{\partial a_{j}^{k}} = (g_o(a_j^k)-y)g'_o(a_j^k) \]

    \[ \frac{\partial E}{\partial w_{i j}^{k}} = \delta_j^k o_i^{k-1} = (g_o(a_j^k)-y) g'_o(a_j^k) o_i^{k-1} \]
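A sketch of the output-layer delta in C++, again assuming a sigmoid for g_o so that g_o'(a) = g_o(a)(1 - g_o(a)); the names are illustrative:

```cpp
#include <cmath>

// delta_j^k = (g_o(a_j^k) - y) * g_o'(a_j^k), assuming a sigmoid g_o,
// whose derivative in terms of the output o is o * (1 - o).
double outputDelta(double a, double y) {
    double o = 1.0 / (1.0 + std::exp(-a));   // g_o(a)
    return (o - y) * o * (1.0 - o);          // (o - y) * g_o'(a)
}

// The weight gradient is then dE/dw_{ij}^k = outputDelta(a_j, y) * o_i_prev.
```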

HIDDEN LAYER

Below, l ranges from 1 to n_{k+1}, the number of nodes in the next layer.

    \[ \delta_j^k = \frac{\partial E}{\partial a_{j}^{k}} = \sum_{l=1}^{n_{k+1}}{\delta_l^{k+1}\frac{\partial a_l^{k+1}}{\partial a_j^k}} \]

    \[ a_l^{k+1} = b_l^{k+1} + \sum_{j=1}^{n_k}{ w^{k+1}_{j l} g(a_j^k)} \]

    \[ \frac{\partial a_l^{k+1}}{\partial a_j^k} = w_{j l}^{k+1}g'(a_j^k) \]

    \[ \delta_j^k = \sum_{l=1}^{n_{k+1}}{\delta_l^{k+1}w_{j l}^{k+1}g'(a_j^k)} = g'(a_j^k) \sum_{l=1}^{n_{k+1}}{\delta_l^{k+1}w_{j l}^{k+1}} \]

    \[ \frac{\partial E}{\partial w_{i j}^{k}} = \delta_j^k o_i^{k-1} = g'(a_j^k)o_i^{k-1}\sum_{l=1}^{n_{k+1}}w_{j l}^{k+1}\delta_l^{k+1} \]
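The hidden-layer rule can be sketched the same way, with a sigmoid assumed for g and illustrative names; here `wNextRow` holds the row w_{jl}^{k+1} of weights leaving node j, and `deltaNext` the deltas of layer k+1:

```cpp
#include <cmath>
#include <vector>

// delta_j^k = g'(a_j^k) * sum_l w_{jl}^{k+1} * delta_l^{k+1},
// assuming a sigmoid g, so g'(a) = g(a) * (1 - g(a)).
double hiddenDelta(double a_j,
                   const std::vector<double>& wNextRow,   // w_{jl}^{k+1}, l = 0..n_{k+1}-1
                   const std::vector<double>& deltaNext)  // delta_l^{k+1}
{
    double sum = 0.0;
    for (std::size_t l = 0; l < deltaNext.size(); ++l)
        sum += wNextRow[l] * deltaNext[l];   // error backpropagated from layer k+1
    double o = 1.0 / (1.0 + std::exp(-a_j)); // g(a_j^k)
    return o * (1.0 - o) * sum;              // g'(a_j^k) * sum
}

// As before, the gradient is dE/dw_{ij}^k = hiddenDelta(...) * o_i_prev,
// and gradient descent subtracts a learning rate times this from each weight.
```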

 
