Chapter 1: Introduction
Chapter Goal: Describe the book and the TensorFlow infrastructure, and give instructions on how to set up a system for deep learning projects
No of pages: 30-50
Sub-Topics
1. Goal of the book
2. Prerequisites
3. TensorFlow Jupyter Notebooks introduction
4. How to set up a computer to follow the book (Docker image?)
5. Tips for TensorFlow development and the libraries needed (NumPy, matplotlib, etc.)
6. The problem of vectorizing code and calculations (illustrated in the sketch after this list)
7. Additional resources
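A minimal sketch of the vectorization idea from sub-topic 6, assuming NumPy as the numerical library (which the book uses throughout); the exact example in the chapter may differ. It contrasts an explicit Python loop with a single vectorized NumPy call.

import numpy as np

# Toy data: one million random numbers
x = np.random.rand(1_000_000)
w = np.random.rand(1_000_000)

# Non-vectorized version: explicit Python loop over every element
total = 0.0
for i in range(len(x)):
    total += x[i] * w[i]

# Vectorized version: a single NumPy call, typically orders of magnitude faster
total_vec = np.dot(x, w)

# Both give (numerically) the same result
print(total, total_vec)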
Chapter 2: Single Neurons
Chapter Goal: Describe what you can achieve with neural networks with just one neuron.
No of pages: 50-70
Sub-Topics
1. Overview of the different parts of a neuron
2. Activation functions (ReLU, sigmoid, modified ReLU, etc.) and their differences (which one is better suited for which task), illustrated in the sketch after this list
3. The new Google activation function Swish (https://arxiv.org/abs/1710.05941)
4. Optimization algorithm discussion (gradient descent)
5. Linear regression
6. Basic TensorFlow introduction
7. Logistic regression
8. Regression (linear and logistic) with TensorFlow
9. Practical case discussed in detail
10. The difference between regression and classification for one neuron
11. Tips for TensorFlow implementation
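A minimal sketch of the activation functions named in sub-topic 2, written in plain NumPy, including Swish as defined in the paper linked in sub-topic 3 (f(z) = z * sigmoid(z)); the implementations used in the chapter may differ.

import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes z into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, z)

def swish(z):
    # Swish (Ramachandran et al., 2017): f(z) = z * sigmoid(z)
    return z * sigmoid(z)

z = np.linspace(-5, 5, 11)
print(sigmoid(z))
print(relu(z))
print(swish(z))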
Chapter 3: Fully Connected Neural Networks with More Neurons
Chapter Goal: Describe what a fully connected neural network is and how to implement one (with one or more layers, etc.), and how to perform classification (binary and multi-class) and regression
No of pages: 30-50
Sub-Topics
1. What is a tensor
2. Dimensions of involved tensors (weights, input, etc.) (with tips on TensorFlow implementation)
3. Distinctions between features and labels
4. Problem of initialization of weights (random, constant, zeros, etc.)
5. Second tutorial on TensorFlow
6. Practical case discussed in detail
7. Tips for TensorFlow implementation
8. Classification and regression with such networks and how the output layer differs
9. Softmax for multi-class classification (illustrated in the sketch after this list)
10. Binary classification
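A minimal sketch, in plain NumPy, of a forward pass through one fully connected layer with a softmax output for multi-class classification (sub-topic 9); the shapes are only an illustration of the dimension bookkeeping from sub-topic 2, not the book's exact convention.

import numpy as np

def softmax(z):
    # Numerically stable softmax along the class axis
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

n_samples, n_features, n_classes = 4, 3, 5

# Input X: one observation per row -> shape (n_samples, n_features)
X = np.random.rand(n_samples, n_features)

# Weights W and biases b of the output layer
W = np.random.randn(n_features, n_classes) * 0.01   # small random initialization
b = np.zeros((1, n_classes))                         # biases can start at zero

# Forward pass: logits -> class probabilities, shape (n_samples, n_classes)
probabilities = softmax(X @ W + b)
print(probabilities.sum(axis=1))   # each row sums to 1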
Chapter 4: Neural Network Error Analysis
Chapter Goal: Describe the problem of identifying the sources of error (variance, bias, skewed data, not enough data, overfitting, etc.)
No of pages: 50-70
Sub-Topics
1. Train, dev, and test datasets: why do we need three? Do we need four? What can we detect with the different datasets, and how should we use and size them?
2. Sources of errors (overfitting, bias, variance, etc.)
3. What is overfitting, a discussion
4. Why is overfitting important with neural networks?
5. Practical case discussion
6. A guide on how to perform error analysis
7. A practical example with a complete error analysis
8. The problem of different datasets (train, dev, test, etc.) coming from different distributions
9. Data augmentation techniques and examples
10. How to deal with too little data
11. How to split the datasets (train, dev, test): not 60/20/20 but closer to 98/1/1 when we have a lot of data (illustrated in the sketch after this list)
12. Tips for TensorFlow implementation
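A minimal sketch of the 98/1/1 split from sub-topic 11, assuming plain NumPy and a shuffle before splitting; the dataset, proportions, and helper code are only illustrative and may differ from the book's examples.

import numpy as np

# Hypothetical dataset: 100,000 observations with 20 features each
X = np.random.rand(100_000, 20)
y = np.random.randint(0, 2, size=100_000)

# Shuffle once so all three sets come from the same distribution
rng = np.random.default_rng(42)
idx = rng.permutation(len(X))
X, y = X[idx], y[idx]

# 98/1/1 split: with this much data, 1% is still 1,000 examples
n = len(X)
n_train = int(0.98 * n)
n_dev = int(0.01 * n)

X_train, y_train = X[:n_train], y[:n_train]
X_dev, y_dev = X[n_train:n_train + n_dev], y[n_train:n_train + n_dev]
X_test, y_test = X[n_train + n_dev:], y[n_train + n_dev:]

print(len(X_train), len(X_dev), len(X_test))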
Chapter 5: The Dropout Technique
Chapter Goal: Describe what dropout is and when to employ it
No of pages: 30-50
Sub-Topics
1. What is dropout? (illustrated in the sketch after this list)
2. When do we need to employ it?
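A minimal sketch of (inverted) dropout applied to the activations of a hidden layer, in plain NumPy; the keep probability of 0.8 is only an example value, and the TensorFlow implementation discussed in the chapter will look different.

import numpy as np

def dropout(activations, keep_prob=0.8, training=True):
    # At test time dropout is switched off: return the activations unchanged
    if not training:
        return activations
    # Randomly keep each unit with probability keep_prob...
    mask = np.random.rand(*activations.shape) < keep_prob
    # ...and scale the survivors by 1/keep_prob ("inverted" dropout),
    # so the expected value of the activations stays the same
    return activations * mask / keep_prob

a = np.random.rand(4, 5)            # activations of a hidden layer
print(dropout(a, keep_prob=0.8))    # roughly 20% of the entries are zeroed out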
About the Author
Umberto is currently Head of Innovation in BI & Analytics at a leading health insurance company in Switzerland, where he leads several strategic initiatives dealing with AI, new technologies, and machine learning. He has worked as a data scientist and lead modeller on several large healthcare projects and has extensive hands-on experience in programming and designing algorithms. Before that, he managed BI and DWH projects, enabling data-driven solutions to be implemented in complex production environments. Over the last two years he has worked extensively with neural networks and has applied deep learning to several problems linked to insurance and client behaviour (such as customer churn). He has presented his results on deep learning at international conferences and has gained an internal reputation for his extensive experience with Python and deep learning.