Coursera - Deep Learning Specialization

Overview

Deep learning underpins much of modern artificial intelligence. At the heart of the field are neural networks: mathematical nodes connected in specific ways to produce a desired outcome. There is no closed-form recipe yet for how to connect a network, and each one is tailored to a specific task. The most famous example is AlphaGo.

Andrew Ng is a professor at Stanford and the CEO of DeepLearning.AI, which is on a mission to make deep learning education more accessible to the world. Ng teamed up with Coursera to share his tips, tricks, and techniques. I audited the courses in this specialization to familiarize myself with the lessons Ng and his team provide. The specialization consists of five courses: Neural Networks and Deep Learning; Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization; Structuring Machine Learning Projects; Convolutional Neural Networks; and Sequence Models.

To apply and test my skills I completed the course assignments. However, the course does not allow sharing assignment code publicly, so I have created similar projects that I can share instead.

Neural Networks and Deep Learning

Week 1 - Understand Neural Nets

Imagine trying to predict the price of a home. We already know that a bigger house is a more expensive house, but we also know to consider other features: location (e.g., Mount Vernon, IL vs. Orange Beach, CA), distance from schools, number of bedrooms, and size of the yard (lot area minus house footprint). There are certainly more variables and more complex relationships to explore. We can feed these "features" to a neural network.

The neural net finds increasingly complex relationships between these features. Each feature, and each relationship between features, carries its own weight. After considering everything, we compare the network's prediction against the truth. Our goal is to minimize the error between the price we predict and what the house actually costs, and we do this with backpropagation.

Backpropagation is like running the calculation in reverse: the error flows backward through the network and tells us how to change the values that produced it. The goal of backpropagation is to tune those values so we can accurately predict the house price, and to do so for any house, even at a size or in an area we have never encountered. The key quantities in this exercise are the weights.
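To make this concrete, here is a minimal PyTorch sketch of one gradient-descent loop for a single made-up house (the feature values and scaling are invented for illustration, not course material):

```python
import torch

# Toy, hand-normalized features for one house:
# [size, bedrooms, distance-to-school, lot size], all scaled to roughly [0, 1]
x = torch.tensor([0.21, 0.30, 0.08, 0.50])
y_true = torch.tensor(0.315)            # sale price, scaled (e.g. $315k / $1M)

w = torch.randn(4, requires_grad=True)  # one weight per feature
b = torch.zeros(1, requires_grad=True)

for step in range(200):
    y_pred = x @ w + b                  # forward pass: weighted sum of features
    loss = (y_pred - y_true) ** 2       # squared error vs. the true price
    loss.backward()                     # backprop: d(loss)/dw, d(loss)/db via the chain rule
    with torch.no_grad():
        w -= 0.1 * w.grad               # nudge the weights against the gradient
        b -= 0.1 * b.grad
        w.grad.zero_()                  # clear gradients for the next step
        b.grad.zero_()
```

Each pass through the loop shrinks the error a little, which is exactly the weight-tuning described above.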

Week 2 - Build NN for Logistic Regression

The original assignment is to build a logistic-regression neural network in a Jupyter notebook to classify cat images. Instead I built a LogRegNN in PyTorch to classify MNIST images; the code can also classify the CIFAR10 dataset. However, the network is very simple, so it does not perform well on CIFAR10. Moreover, the batch size and number of epochs were kept low because of limited compute, so the CIFAR10 results might improve significantly with larger batches and more epochs.
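Since I can't share the assignment code itself, here is a minimal sketch of the PyTorch approach I used for MNIST (the hyperparameters here are illustrative, not my exact settings):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Logistic regression as a one-layer network: 784 pixels -> 10 class logits
model = nn.Linear(28 * 28, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()        # softmax + negative log-likelihood

train_set = datasets.MNIST("data", train=True, download=True,
                           transform=transforms.ToTensor())
loader = DataLoader(train_set, batch_size=64, shuffle=True)

for epoch in range(3):                 # kept small for limited compute
    for images, labels in loader:
        logits = model(images.view(images.size(0), -1))  # flatten 28x28 images
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()                # backprop through the single layer
        optimizer.step()
```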

It turns out that neither change helps much. With a Google Colab GPU I was able to increase the batch size to 32, reduce the problem to Cat vs. Non-Cat, and train for 7 epochs; this produced a final accuracy of 51%. I also experimented with a batch size of 200 over 7 epochs and saw similar results.

In conclusion, a simple logistic regression model cannot accurately classify complex images, even though it can achieve \(>95\%\) accuracy on MNIST, whose images are relatively simple. A better approach would be a convolutional neural network, whose neurons can build up larger shapes, such as arcs that might depict eyes or ears.
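For comparison, here is a sketch of what such a small convolutional network might look like in the same PyTorch style (the layer sizes are illustrative and untuned, and assume 32x32 CIFAR10-style inputs):

```python
import torch.nn as nn

# Stacked conv layers: early filters pick up edges and arcs, later
# layers can combine them into larger parts like ears or eyes.
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3-channel image -> 16 feature maps
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 32x32 -> 16x16
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 16x16 -> 8x8
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 2),                    # Cat vs. Non-Cat logits
)
```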

Week 3 - Planar Data Classification with Shallow NN

Week 4 - Image Classification with Deep NN

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

Structuring Machine Learning Projects

Convolutional Neural Networks

Sequence Models

