Tuesday Tip #1: Speed up your grid search 🔎


Hi Reader!

Welcome to the first issue of “Tuesday Tips,” a new series in which I’ll share a data science tip with you every Tuesday!

These tips will come from all over the data science spectrum: Machine Learning, Python, data analysis, NLP, Jupyter, and much more!

I hope they will help you to learn something new, work more efficiently, or just motivate and inspire you ✨


👉 Tip #1: Speed up your hyperparameter search

In supervised Machine Learning, “hyperparameter tuning” is the process of searching for the hyperparameter values that make your model most effective. For example, if you’re trying to improve your model’s accuracy, you want to find the hyperparameter values that maximize its accuracy score.

One common way to tune your model is through a “grid search”: you define a set of hyperparameter values you want to try out, and your model evaluation procedure (such as cross-validation) checks every combination of those values to see which one works best.

Sounds great, right?

Well, one big problem with grid search is that if your model is slow to train or you have a lot of parameters you want to try, this process can take a LONG TIME.

So what’s the solution? Actually, I’ve got two for you:

1. If you’re using GridSearchCV in scikit-learn, use the “n_jobs” parameter to turn on parallel processing. Set it to -1 to use all processors, though be careful about using that setting in a shared computing environment!

🔗 2-minute demo of parallel processing
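Here’s a minimal sketch of tip #1. The dataset, model, and parameter grid are just illustrative choices; the key detail is passing n_jobs=-1 to GridSearchCV:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# an example grid: 4 values of C = 4 combinations to evaluate
param_grid = {"C": [0.01, 0.1, 1, 10]}

# n_jobs=-1 evaluates the combinations in parallel using all processors
grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid,
    cv=5,
    n_jobs=-1,
)
grid.fit(X, y)
print(grid.best_params_)
```

With a grid this tiny you won’t notice a speedup, but with a slow model and many combinations, parallelizing across processors can make a big difference.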

2. Also in scikit-learn, swap out GridSearchCV for RandomizedSearchCV. Whereas grid search checks every combination of parameters, “randomized search” checks random combinations of parameters. You specify how many combinations you want to try (based on how much time you have available), and it often finds the “almost best” set of parameters in far less time than grid search!

🔗 5-minute demo of randomized search
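And a minimal sketch of tip #2. Again, the dataset, model, and distribution are illustrative; the key details are passing a distribution (or list) of values and capping the search with n_iter:

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# instead of a fixed grid, provide a distribution to sample from
param_distributions = {"C": loguniform(1e-3, 1e2)}

# n_iter controls how many random combinations are tried,
# so YOU decide how long the search takes
rand = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions,
    n_iter=10,
    cv=5,
    random_state=1,
)
rand.fit(X, y)
print(rand.best_params_)
```

Note that you can also pass lists of values (like with grid search); distributions just let randomized search explore the whole range rather than a handful of points.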

How helpful was today’s tip?

🤩🙂😐


If you enjoyed this issue, please forward it to a friend! 📬

See you next Tuesday!

- Kevin

P.S. Shout-out to my long-time pal, Ben Collins, who inspired and encouraged me to start this series. He has been sharing weekly Google Sheets tips for almost 5 years! Check out his site if you want to improve your Sheets skills!
