Hi Reader!
Welcome to the first issue of “Tuesday Tips,” a new series in which I’ll share a data science tip with you every Tuesday!
These tips will come from all over the data science spectrum: Machine Learning, Python, data analysis, NLP, Jupyter, and much more!
I hope they will help you to learn something new, work more efficiently, or just motivate and inspire you ✨
In supervised Machine Learning, “hyperparameter tuning” is the process of searching for the hyperparameter values that make your model most effective. For example, if you’re trying to improve your model’s accuracy, you want to find the hyperparameter values that maximize its accuracy score.
One common way to tune your model is through a “grid search”, which basically means that you define the hyperparameter values you want to try out, and grid search checks every combination of those values (using your model evaluation procedure, like cross-validation) to see which one works best.
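Here’s a minimal sketch of what that looks like with scikit-learn’s GridSearchCV. The dataset, model, and parameter values are just placeholders I chose for illustration, not recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# toy dataset and model, purely for illustration
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(random_state=1)

# the hyperparameter values to try (3 x 3 x 3 = 27 combinations)
param_grid = {
    'n_estimators': [100, 200, 300],
    'max_depth': [None, 5, 10],
    'min_samples_leaf': [1, 2, 4],
}

# grid search trains and cross-validates the model for every combination
grid = GridSearchCV(model, param_grid, cv=5, scoring='accuracy')
grid.fit(X, y)

print(grid.best_params_)
print(grid.best_score_)
```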
Sounds great, right?
Well, one big problem with grid search is that if your model is slow to train or you have a lot of parameters you want to try, this process can take a LONG TIME.
So what’s the solution? I’ve got two for you:
1. If you’re using GridSearchCV in scikit-learn, use the “n_jobs” parameter to turn on parallel processing. Set it to -1 to use all processors, though be careful about using that setting in a shared computing environment!
🔗 2-minute demo of parallel processing
2. Also in scikit-learn, swap out GridSearchCV for RandomizedSearchCV. Whereas grid search checks every combination of parameters, “randomized search” checks random combinations of parameters. You specify how many combinations you want to try (based on how much time you have available), and it often finds the “almost best” set of parameters in far less time than grid search! (Both solutions are sketched in the code after this list.)
🔗 5-minute demo of randomized search
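To make both tips concrete, here’s a rough sketch that reuses the toy setup from above. The parameter values and n_iter=10 are arbitrary choices for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(random_state=1)

params = {
    'n_estimators': [100, 200, 300],
    'max_depth': [None, 5, 10],
    'min_samples_leaf': [1, 2, 4],
}

# Solution 1: n_jobs=-1 spreads the cross-validation runs across all
# available processors (use with care on a shared machine)
grid = GridSearchCV(model, params, cv=5, scoring='accuracy', n_jobs=-1)
grid.fit(X, y)

# Solution 2: RandomizedSearchCV tries only n_iter random combinations
# (10 here) instead of all 27, which is usually much faster
rand = RandomizedSearchCV(model, params, n_iter=10, cv=5,
                          scoring='accuracy', n_jobs=-1, random_state=1)
rand.fit(X, y)

print(rand.best_params_)
print(rand.best_score_)
```

With n_iter=10, randomized search fits roughly a third as many models as the full 27-combination grid, which is where the time savings come from.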
How helpful was today’s tip?
If you enjoyed this issue, please forward it to a friend! 📬
See you next Tuesday!
- Kevin
P.S. Shout-out to my long-time pal, Ben Collins, who inspired and encouraged me to start this series. He has been sharing weekly Google Sheets tips for almost 5 years! Check out his site if you want to improve your Sheets skills!