Hi Reader!
Welcome to the first issue of “Tuesday Tips,” a new series in which I’ll share a data science tip with you every Tuesday!
These tips will come from all over the data science spectrum: Machine Learning, Python, data analysis, NLP, Jupyter, and much more!
I hope they will help you to learn something new, work more efficiently, or just motivate and inspire you ✨
In supervised Machine Learning, “hyperparameter tuning” is the process of choosing the settings of your model (its hyperparameters) to make it more effective. For example, if you’re trying to improve your model’s accuracy, you want to find the hyperparameter values that maximize its accuracy score.
One common way to tune your model is through a “grid search”, which basically means that you define a set of hyperparameter values you want to try out, and your model evaluation procedure (like cross-validation) scores every combination of those values to see which one works the best.
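Here’s a minimal sketch of a grid search in scikit-learn (the dataset, model, and parameter values are illustrative choices of mine, not from the newsletter):

```python
# A minimal grid search sketch using scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Every combination of these values is evaluated: 5 x 2 = 10 candidates,
# each scored with 5-fold cross-validation (50 model fits total).
param_grid = {
    "n_neighbors": [1, 3, 5, 7, 9],
    "weights": ["uniform", "distance"],
}

grid = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5, scoring="accuracy")
grid.fit(X, y)

print(grid.best_params_)   # the winning combination
print(grid.best_score_)    # its mean cross-validated accuracy
```

Note how the number of fits multiplies quickly: every extra value you add to the grid multiplies the total number of candidates.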
Sounds great, right?
Well, one big problem with grid search is that if your model is slow to train or you have a lot of parameters you want to try, this process can take a LONG TIME.
So what’s the solution? I've got two solutions for you:
1. If you’re using GridSearchCV in scikit-learn, use the “n_jobs” parameter to turn on parallel processing. Set it to -1 to use all processors, though be careful about using that setting in a shared computing environment!
🔗 2-minute demo of parallel processing
2. Also in scikit-learn, swap out GridSearchCV for RandomizedSearchCV. Whereas grid search checks every combination of parameters, “randomized search” checks random combinations of parameters. You specify how many combinations you want to try (based on how much time you have available), and it often finds the “almost best” set of parameters in far less time than grid search!
🔗 5-minute demo of randomized search
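Here’s a sketch that combines both speed-ups, using RandomizedSearchCV with n_jobs=-1 (again, the dataset, model, and parameter ranges are illustrative choices of mine):

```python
# A sketch combining both tips: RandomizedSearchCV samples only a fixed
# number of combinations, and n_jobs=-1 evaluates them in parallel.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# 3 x 3 x 2 = 18 possible combinations exist, but we only try 5 of them.
param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 3, 5],
    "max_features": ["sqrt", "log2"],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions,
    n_iter=5,       # number of random combinations to evaluate
    cv=5,
    n_jobs=-1,      # use all processors (careful on shared machines!)
    random_state=0, # makes the random sampling reproducible
)
search.fit(X, y)

print(search.best_params_)
```

You can also pass continuous distributions (like scipy.stats.uniform) in param_distributions instead of lists, which lets randomized search explore values a fixed grid would never try.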
How helpful was today’s tip?
If you enjoyed this issue, please forward it to a friend! 📬
See you next Tuesday!
- Kevin
P.S. Shout-out to my long-time pal, Ben Collins, who inspired and encouraged me to start this series. He has been sharing weekly Google Sheets tips for almost 5 years! Check out his site if you want to improve your Sheets skills!