
A blog for learning all things ML – concepts, interview tips, online ML courses: https://www.trybackprop.com


I made a list of all the free resources I used to study ML and deep learning to become an ML engineer at FAANG, so I think it'll be helpful to follow these resources: https://www.trybackprop.com/blog/top_ml_learning_resources (links in the blog post)

Fundamentals

Linear Algebra – 3Blue1Brown's Essence of Linear Algebra series; binged all these videos on a one-hour train ride visiting my parents

Multivariable Calculus – Khan Academy's Multivariable Calculus lessons were a great refresher of what I had learned in college. Looking back, I just needed to have reviewed Unit 1 – intro and Unit 2 – derivatives.

Calculus for ML – this amazing animated video explains calculus and backpropagation
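The chain-rule mechanics behind backpropagation can be sketched in a few lines of plain Python. This is my own toy single-neuron example, not code from the video:

```python
# Minimal sketch of backpropagation on a single neuron.
# Forward pass:  y = w * x + b,  loss = (y - target)^2
# Backward pass: apply the chain rule to get dloss/dw and dloss/db.

def forward_and_backward(w, x, b, target):
    # forward
    y = w * x + b
    loss = (y - target) ** 2
    # backward (chain rule)
    dloss_dy = 2 * (y - target)   # d/dy of (y - t)^2
    dloss_dw = dloss_dy * x       # dy/dw = x
    dloss_db = dloss_dy * 1       # dy/db = 1
    return loss, dloss_dw, dloss_db

loss, dw, db = forward_and_backward(w=2.0, x=3.0, b=1.0, target=10.0)
# y = 7.0, loss = 9.0, dw = 2*(7-10)*3 = -18.0, db = -6.0
```

A gradient-descent step would then nudge w and b in the direction opposite the gradients, which is exactly what the training loops in these lectures do at scale.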

Information Theory – easy-to-understand book on information theory called Information Theory: A Tutorial Introduction.

Statistics and Probability – the StatQuest YouTube channel

Machine Learning

Stanford Intro to Machine Learning by Andrew Ng – Stanford's CS229, the intro to machine learning course, published their lectures on YouTube for free. I watched lectures 1, 2, 3, 4, 8, 9, 11, 12, and 13, and I skipped the rest since I was eager to move on to deep learning. The course also offers a free set of course notes, which are very well written.

Caltech Machine Learning – Caltech's machine learning lectures on YouTube, less mathematical and more intuition based

Deep Learning

Andrej Karpathy's Zero to Hero Series – Andrej Karpathy, an AI researcher who graduated with a Stanford PhD and led Tesla AI for several years, released an amazing series of hands-on lectures on YouTube. Highly, highly recommend.

Neural networks – Stanford's CS231n course notes and lecture videos were my gateway drug, so to speak, into the world of deep learning.

Transformers and LLMs

Transformers – watched these two lectures: one from the University of Waterloo and one from the University of Michigan. I've also heard good things about Jay Alammar's The Illustrated Transformer guide.

ChatGPT Explainer – Wolfram's YouTube explainer video on ChatGPT

Interactive LLM Visualization – This LLM visualization that you can play with in your browser is hands down the best interactive experience with an LLM.

Financial Times' Transformer Explainer – The Financial Times released a lovely interactive article that explains the transformer very well.

Residual Learning – 2023 Future Science Prize Laureates Lecture on residual learning.

Efficient ML and GPUs

How are Microchips Made? – This YouTube video by Branch Education is one of the best free educational videos on the internet, regardless of subject, and it's also the best video for understanding microchips.

CUDA – My FAANG coworkers acquired their CUDA knowledge from this series of lectures.

TinyML and Efficient Deep Learning Computing – 2023 lectures on efficient ML techniques online.

Chip War – Chip War is a bestselling book, published in 2022, about microchip technology. Its beginning chapters on the invention of the microchip actually explain CPUs very well.


Wow, thanks for the links to all the resources. Lot of interesting stuff for me to learn!


I wrote a post about how to build image search using OpenAI's CLIP, and I included a Google Colab notebook that walks you through step by step the coding process. My article also gives the reader all the fundamental linear algebra and ML they need to know. Enjoy!
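To give a flavor of the core idea, here's a toy sketch of embedding-based search that ranks items by cosine similarity. The embeddings below are made up for illustration; in the actual post, the vectors come from CLIP's text and image encoders:

```python
# Toy sketch of embedding-based image search: given a query embedding and a
# set of image embeddings (made-up 3-d vectors here), rank images by cosine
# similarity to the query.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for illustration only.
image_embeddings = {
    "dog.jpg": [0.9, 0.1, 0.0],
    "cat.jpg": [0.1, 0.9, 0.0],
    "car.jpg": [0.0, 0.1, 0.9],
}
query = [0.8, 0.2, 0.1]  # e.g. the text "a photo of a dog", encoded

ranked = sorted(image_embeddings,
                key=lambda name: cosine_similarity(query, image_embeddings[name]),
                reverse=True)
# "dog.jpg" ranks first
```

The Colab notebook in the post walks through the real version of this with CLIP embeddings instead of toy vectors.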


Thanks for the feedback!


It's nitpicking until someone learns about the Hadamard product and confusion sets in.

(a∘b)_ij = a_ij × b_ij

a·b = aᵀb = Σ_i a_i × b_i
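The distinction shows up clearly in a few lines of plain Python:

```python
# Hadamard (element-wise) product vs. dot product on vectors.
def hadamard(a, b):
    # (a ∘ b)_i = a_i * b_i — result has the same shape as the inputs
    return [x * y for x, y in zip(a, b)]

def dot(a, b):
    # a · b = aᵀb = Σ_i a_i * b_i — result is a single scalar
    return sum(x * y for x, y in zip(a, b))

a = [1, 2, 3]
b = [4, 5, 6]
hadamard(a, b)  # [4, 10, 18]
dot(a, b)       # 32
```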


From my experience with 5 years in “software engineering” and then 7 years in “machine learning”, what matters the most is that you like what you do so that you can bring your A game to work. That'll separate you from the average engineer, and management and peers will take notice. Yes, it's very tough to find a job right now, and there will always be down cycles. But I've noticed the best engineers are able to stay afloat even during downturns because they've built up a reputation for being good engineers. Plus, you can always transition into ML if you work hard enough; even with an ML degree, you're not guaranteed to find an ML job these days. I actually wrote a blog post about how folks transitioned into ML that you might find useful: https://www.trybackprop.com/blog/2024_06_09_you_dont_need_a_...


Yeah, there's a huge difference in quality between engineers who like engineering and those who just got into it for the paycheck.


It's tricky to notice the difference sometimes. I like programming and read all the tech books out there, plus I spend time on pet projects. At work I just do my job as a professional, but I always keep a distance and don't get too involved or passionate about it.


I just posted part 2: https://news.ycombinator.com/item?id=40846513. It covers the dot product and embeddings and features more visuals and two interactive playgrounds to reinforce the concepts learned. Hope you find it useful.


Thanks! I built the interactive playgrounds with React in TypeScript, with a lot of SVG manipulation. Everything else was built with either Python (for the static visuals) or React.


Any plans for future articles?


Learn the linear algebra you need to know for AI/ML. This covers the dot product both algorithmically and visually and applies it to machine learning embeddings. This article also contains visualizations and two interactive playgrounds, the Interactive Dot Product Playground [0] and the Interactive Embedding Explorer [1] (best viewed with laptop or desktop!) to reinforce the concepts that are taught.

[0] https://www.trybackprop.com/blog/linalg101/part_2_dot_produc...

[1] https://www.trybackprop.com/blog/linalg101/part_2_dot_produc...
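The core idea the article builds toward can be sketched in a few lines. The 2-d "embeddings" below are made up for illustration; real embeddings are learned and have hundreds of dimensions:

```python
# Toy illustration: with vector embeddings, words with similar meanings
# tend to have a larger dot product than unrelated words.
embeddings = {
    "king":  [0.9, 0.8],
    "queen": [0.85, 0.82],
    "apple": [0.1, 0.05],
}

def dot(a, b):
    # a · b = Σ_i a_i * b_i
    return sum(x * y for x, y in zip(a, b))

dot(embeddings["king"], embeddings["queen"])  # large: related words
dot(embeddings["king"], embeddings["apple"])  # small: unrelated words
```

The Interactive Dot Product Playground lets you drag the vectors around and watch this number change in real time.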


I actually wrote a blog post about this for experienced software engineers like you who are thinking of transitioning to ML, so I wanted to share it here: https://www.trybackprop.com/blog/2024_06_09_you_dont_need_a_...

I write about various engineers who now work at Meta, Google, Amazon, and OpenAI who made the switch. You can see what strategies and tactics they used to do it.

1) It's "wise" if you find that during your personal hours you enjoy hacking on it. Before I made the switch, I spent a year studying the material on nights and weekends, so that was my first data point that perhaps this was something I wanted to do full time.

2) Yes, I have! And I've been an ML engineer for 7 years now after I made the switch. For context, I'm an ML tech lead at FAANG. Prior to that, I worked in infrastructure and product.

3) One piece of advice I got on this years ago is to join a team adjacent to ML work so that you can get familiar with what production ML looks like. You can also start practicing ML thinking on Kaggle.com.

P.S. You can check out other posts in my blog for resources to learn AI/ML and the math needed for this career, such as my Linear Algebra 101 for AI/ML series: https://www.trybackprop.com/blog/linalg101/part_1_vectors_ma... (includes interactive quizzes, fundamentals of vectors/matrices, and a quick intro to PyTorch, an open source ML framework widely used in industry)


How much math is truly necessary to work as an MLE in a company where you do not need to write papers but need to deliver working ML systems?


Just basic stats, basic calculus, etc. — enough to work with data and use ML algorithms and techniques.


Thank you for the kind words! Please feel free to leave any feedback if you have any. Thanks!

