# What progress did I make?

Hell yeeeeaaah! I’ve made strong positive progress with my Machine Learning Scholar Adventure (MLSA) and I am super happy! I can finally feel the strength of the momentum I’ve been aspiring to. I’ve gotten better at prioritising my learning, maintaining consistency and keeping myself enthused. Without further ado, here’s what I’ve been up to:

## Machine learning

### Kaggle learn: Pandas

This was a great introduction to the capabilities of the pandas library. Whilst I knew very basic pandas, e.g. selecting rows and columns, I was less knowledgeable about intermediate concepts. Now, I’m more comfortable with maps, understand grouping and sorting a lot better and am more familiar with how to combine dataframes together. There’s a lot to the pandas library, but I liked this brief tour for getting started!
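To give a flavour of the sort of operations the course covers, here’s a tiny made-up example (the data and column names are my own invention, not from the course) showing `map`, grouping, sorting and merging:

```python
import pandas as pd

# Two tiny illustrative dataframes (made-up data, just for demonstration).
sales = pd.DataFrame({
    "shop": ["A", "A", "B", "B"],
    "item": ["tea", "coffee", "tea", "coffee"],
    "units": [3, 5, 2, 7],
})
prices = pd.DataFrame({"item": ["tea", "coffee"], "price": [2.0, 3.0]})

# map: transform a column element-wise.
sales["item_upper"] = sales["item"].map(str.upper)

# grouping and sorting: total units per shop, largest first.
totals = sales.groupby("shop")["units"].sum().sort_values(ascending=False)

# combining: merge the price onto each sale, then compute revenue.
combined = sales.merge(prices, on="item")
combined["revenue"] = combined["units"] * combined["price"]

print(totals)
print(combined[["shop", "item", "revenue"]])
```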

### Kaggle learn: Intermediate ML

I enjoyed this fascinating exploration of more advanced machine learning concepts. Dropping data and imputation were compared as approaches for dealing with missing values. I also learned how categorical data can be transformed into numeric form via encoding, then explored the basics of scikit-learn pipelines. I especially liked how pipelines made experimenting so much smoother. Finally, I explored the intuition behind cross validation, the mechanics of XGBoost and got an overview of data leakage!
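A minimal sketch of how those pieces fit together (toy data and parameter choices are mine, not from the course): imputation for the numeric column, encoding for the categorical one, all bundled into a single pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Toy data with a missing numeric value and a categorical column.
X = pd.DataFrame({
    "size": [1.0, 2.0, np.nan, 4.0, 5.0, 6.0],
    "colour": ["red", "blue", "red", "blue", "red", "blue"],
})
y = [10, 20, 15, 40, 50, 60]

# Imputation fills the missing "size"; one-hot encoding turns "colour" numeric.
preprocess = ColumnTransformer([
    ("num", SimpleImputer(strategy="median"), ["size"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["colour"]),
])

# The pipeline chains preprocessing and the model into one estimator,
# so fit/predict apply every step consistently -- this is what makes
# swapping experiments so smooth.
model = Pipeline([
    ("preprocess", preprocess),
    ("forest", RandomForestRegressor(n_estimators=50, random_state=0)),
])

model.fit(X, y)
print(model.predict(X))
```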

### Neural networks

The neural networks playlist by 3Blue1Brown gave me an excellent high-level understanding of how neural networks work. Although there are only 4 videos, each one is packed with intuitive information covering the fundamentals of neural networks. I learned about the architecture of a feedforward neural network and how the backpropagation algorithm updates the weights in a neural network via gradient descent.
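That forward pass / backpropagation / gradient descent cycle can be sketched in a few lines of Pytorch (a tiny made-up network and toy data, purely to show the loop):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A tiny feedforward network: 2 inputs -> 4 hidden units -> 1 output.
net = nn.Sequential(nn.Linear(2, 4), nn.ReLU(), nn.Linear(4, 1))

# Toy data: learn the function y = x1 + x2.
X = torch.tensor([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = torch.tensor([[0.0], [1.0], [1.0], [2.0]])

loss_fn = nn.MSELoss()
optimiser = torch.optim.SGD(net.parameters(), lr=0.1)

for step in range(500):
    optimiser.zero_grad()       # clear the old gradients
    loss = loss_fn(net(X), y)   # forward pass + measure the error
    loss.backward()             # backpropagation computes the gradients
    optimiser.step()            # gradient descent nudges the weights

print(loss.item())  # the error shrinks as the weights are updated
```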

### Make your first GAN with Pytorch

I’d started this wonderful book before my hiatus and managed to finish it toward the end of 2020. It was fantastic! Going through this book made me realise that I really like project-based resources! By the end of my explorations, I had built various simple generative adversarial networks in Pytorch and created my first human faces. Here’s a more in-depth review of the book.
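Every GAN I built follows the same adversarial loop, sketched here as a toy 1D example (tiny made-up networks and a made-up target distribution, not the book’s actual face-generating models):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to a fake "data point".
G = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))
# Discriminator: scores how real a data point looks (0 = fake, 1 = real).
D = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
opt_D = torch.optim.Adam(D.parameters(), lr=0.01)
opt_G = torch.optim.Adam(G.parameters(), lr=0.01)

real_label = torch.ones(16, 1)
fake_label = torch.zeros(16, 1)

for step in range(200):
    # "Real" data drawn from N(4, 1) -- the distribution G should learn.
    real = torch.randn(16, 1) + 4.0
    noise = torch.randn(16, 1)

    # 1) Train D to tell real samples from G's fakes.
    opt_D.zero_grad()
    loss_D = (loss_fn(D(real), real_label)
              + loss_fn(D(G(noise).detach()), fake_label))
    loss_D.backward()
    opt_D.step()

    # 2) Train G to fool D into scoring its fakes as real.
    opt_G.zero_grad()
    loss_G = loss_fn(D(G(noise)), real_label)
    loss_G.backward()
    opt_G.step()

print(G(torch.randn(64, 1)).mean().item())
```

The `.detach()` on the generator’s output is the subtle bit: it stops the discriminator’s training step from also pushing gradients back into the generator.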

### Digit recogniser

With the knowledge I gained from *Make your first GAN with Pytorch*, I was able to create a convolutional neural network model for the digit recogniser competition. I managed to get 98% category accuracy, which I’m pleased with. It was really cool seeing how the predictive power of my model improved as I transitioned from using a basic feedforward neural network to a more optimised version (using a binary cross entropy loss function, the Adam optimiser and LeakyReLU) and finally to incorporating a CNN architecture. Here’s my notebook for the competition.
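The shape of such a model looks roughly like this (layer sizes here are illustrative, not my exact competition architecture): convolution and pooling layers extract features, then a linear layer scores the 10 digit classes.

```python
import torch
import torch.nn as nn

# A small CNN sketch in the spirit of an MNIST digit recogniser.
class DigitCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1x28x28 -> 8x28x28
            nn.LeakyReLU(),
            nn.MaxPool2d(2),                             # -> 8x14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),  # -> 16x14x14
            nn.LeakyReLU(),
            nn.MaxPool2d(2),                             # -> 16x7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, 10)      # 10 digit classes

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = DigitCNN()
batch = torch.randn(4, 1, 28, 28)  # fake batch of four 28x28 "images"
logits = model(batch)
print(logits.shape)                # one score per digit class
```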

### Deep learning with Pytorch: A 60 minute blitz

I explored this blitz to get a feel for the capabilities of Pytorch, since my only exposure to the framework was from *Make your first GAN with Pytorch*. Whilst I did manage to get a rough overview of the different parts of the library, there were quite a few things that I didn’t grok, but this doesn’t faze me since my interest was fired up by what I saw. I understood the sections on tensors and autograd relatively well, but there were components in the neural network section, e.g. `max_pool2d`, that I didn’t get. More learning needed!
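For anyone else puzzled by it, here’s what `max_pool2d` actually does on a tiny made-up input: it slides a window over the image and keeps only the largest value in each window, halving the spatial size.

```python
import torch
import torch.nn.functional as F

# A single-channel 4x4 "image" (batch of 1).
x = torch.tensor([[[[1., 2., 5., 6.],
                    [3., 4., 7., 8.],
                    [9., 1., 2., 3.],
                    [2., 0., 4., 1.]]]])

# 2x2 max pooling: each non-overlapping 2x2 window is replaced
# by its maximum, so the 4x4 image becomes 2x2.
pooled = F.max_pool2d(x, kernel_size=2)
print(pooled)
# tensor([[[[4., 8.],
#           [9., 4.]]]])
```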

## Mathematics

### Essence of calculus

Another epic playlist of videos by 3Blue1Brown helped to refresh my calculus intuitions. Quite a few of the ideas which made sense to me in my school days had become rusty and forgotten, but this helped rekindle my awareness of how calculus helps us understand the dynamics of change. As I want to be able to read machine learning papers, it’s important that I re-establish my mathematical foundations, and this was a solid first brick.

### Essence of linear algebra

I really rate this linear algebra overview by 3Blue1Brown, especially because geometric interpretations are used a lot to help understanding. I liked the repeated emphasis that *maths is not a spectator sport* and the encouragement to play with these ideas via pen and paper. I will be revisiting this playlist, and the calculus one as well, to help solidify intuitions after I explore some more hands-on maths courses!

## Software engineering

### Writing READMEs

This was a relatively short but helpful course about how to write READMEs effectively. I was able to extend my markdown knowledge, which will help me create better Jupyter notebooks and document future projects so that others can understand and help me improve my work. I also got introduced to this nifty markdown preview tool.

# What have I paused?

### Grokking deep learning

I’ve put this on the back burner as I’m shifting to the fastai book instead. Why? I’ve realised that to make the fastest progress, it’s best for me to prioritise resources that take a top-down approach to learning. The fastai book gets you using deep learning from the outset, then gradually delves into the details. However, Grokking Deep Learning is bottom-up, teaching you the details/pieces first and then showing you how to put them together towards the end. Whilst both approaches have their merit, I now gravitate towards what I feel is the most active and impactful for me: top-down! I intend to revisit Grokking Deep Learning at a later time.

### Deep reinforcement learning nanodegree

Although I had made some progress with this nanodegree, I realised my foundations in probability, machine learning and deep learning were not strong enough, and as a result I was not making much headway! Instead of stalling, I’m pausing this and will continue when I’m more adept with the prerequisites. Onwards and upwards!

# What am I exploring now?

I’ve recently started playing through the fastai book, which also comes with video lectures and an active forum where you can connect with and learn from other students and machine learning practitioners. For the fastai course, you only need some programming experience in Python. Interested and want to know more? Check out the course page. The cool thing is you don’t have to buy the book (though it does help support the course creators); if you’re short on cash, you can work through the free online Jupyter notebooks instead! I love this democratisation of AI. So far I’ve set up my cloud environment on Paperspace, started training a `cnn_learner` and have watched some of the first lecture. If you’d like to try Paperspace out and help me in my machine learning journey, please use my referral link or code **A1WLX0T** to get $10 free credit. Thanks!

# What did I learn from the challenges I’ve conquered?

- Tenacity is the most important quality to make anything happen in life. It’s critical to keep pursuing your vision even if sometimes it feels like zero progress is being made
- Focus is critical to learning effectively! My current strategy is to have one priority resource that I’m playing through with any other resources being on cruise control
- At this point in my journey, top down approach is best for me to get stuck in, creating and exploring with a view to use bottom-up resources later on
- Make it a habit! Everything that humans do revolves around habits so it’s best to figure out how to make your learning a process. I’ve been using the #100DaysOfMLCode challenge which has really helped me to build and sustain my commitment

# What are my next steps?

- Study more of fastai
- Update curriculum page to share helpful resources with others
- Build a weekly update blog post habit
