Tuesday Tip #49: How confident are your predictions? 🤔


Hi Reader,

I appreciate everyone who has emailed to check on me and my family post-Helene!

It has been more than 6 weeks since the hurricane, and most homes in Asheville (mine included) still don't have clean, running water. We're hopeful that water service will return within the next month.

In the meantime, we're grateful for all of the aid agencies providing free bottled water, free meals, places to shower, and so much more. ❤️

Thanks for allowing me to share a bit of my personal life with you!

Now, back to the Data Science. 😄


🔗 Link of the week

The Present Future: AI's Impact Long Before Superintelligence (Ethan Mollick)

A short, compelling article demonstrating the impact that today's multimodal models can achieve when interacting with the real world!


👉 Tip #49: Calculating the confidence of your classifier

Although Generative AI is the focus of everyone's attention (I'm even working on a GenAI course! 😲), supervised Machine Learning is still the optimal tool for solving most real-world predictive problems.

Today's tip answers the question: How certain is my classification model about its predictions?

It comes directly from my course, Master Machine Learning with scikit-learn.


Let's say you need to predict whether individual users are likely to buy your product. You might build a classifier that outputs "1" if they are likely to buy, and "0" otherwise:
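In the course, this step is shown as a screenshot. Here's a minimal sketch of that kind of setup, with invented feature values and names (site visits and days since signup are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per user, one column per feature
# (say, number of site visits and days since signup -- invented for illustration)
X_train = np.array([[8, 2], [1, 40], [6, 5], [2, 30], [9, 1], [3, 25]])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = bought, 0 = did not buy

clf = LogisticRegression()
clf.fit(X_train, y_train)

# Score some new users: predict() outputs "1" (likely to buy) or "0" (not likely)
X_new = np.array([[7, 3], [2, 35]])
predictions = clf.predict(X_new)
```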

But what if 50,000 users are likely to buy, and you can only afford to market to 500 of them?


In that case, you would use your marketing budget to reach the 500 users who are most likely to buy.

In Machine Learning terms, we're looking for the users with the highest "predicted probability" of buying. Here's how we would find these users:
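The course demo is a screenshot, so here's a self-contained sketch of the same idea (the tiny dataset is invented so the example runs end-to-end):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented mini-dataset: one row per user, 1 = bought, 0 = did not buy
X_train = np.array([[8, 2], [1, 40], [6, 5], [2, 30], [9, 1], [3, 25]])
y_train = np.array([1, 0, 1, 0, 1, 0])
clf = LogisticRegression().fit(X_train, y_train)

X_new = np.array([[7, 3], [2, 35], [5, 10]])

# predict_proba outputs a 2D array:
# column 0 is the probability of class "0", column 1 is the probability of class "1"
probs = clf.predict_proba(X_new)

# NumPy slicing extracts column 1: each user's predicted probability of buying
buy_prob = probs[:, 1]
```

Each row of `probs` sums to 1, since the two columns are the probabilities of the two classes.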

Here's what we did:

  • The "predict_proba" method output a 2-dimensional array, in which column 0 shows the probability of class "0" (not likely to buy) and column 1 shows the probability of class "1" (likely to buy).
  • We used NumPy's slicing notation to extract column 1.

Because we're using a well-calibrated classifier called logistic regression, these predicted probabilities can be directly interpreted as the model's confidence in each prediction.

In this example, the model thinks the 8th user is the most likely to buy since they have the highest predicted probability (0.612).

Conclusion:

If you had 50,000 users and you needed to choose which 500 users to target, you would calculate the predicted probability for all 50,000 and then select the 500 users with the highest probabilities!
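One way to sketch that final step is with NumPy's argsort (the probabilities below are made up, with 8 users standing in for the 50,000 and a top 3 standing in for the top 500):

```python
import numpy as np

# Made-up predicted probabilities of buying, one per user
buy_prob = np.array([0.21, 0.55, 0.08, 0.12, 0.47, 0.33, 0.02, 0.612])

# argsort sorts ascending, so reverse it and keep the first 3:
# the indices of the users with the highest predicted probabilities
top_users = np.argsort(buy_prob)[::-1][:3]
print(top_users)  # the 8th user (index 7, probability 0.612) ranks first
```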


Did you enjoy this short lesson? There are 148 more video lessons like this in my newest ML course, Master Machine Learning with scikit-learn!


👋 See you next week!

If you liked this week's tip, please share it with a friend! It really helps me out.

- Kevin

P.S. I spent my entire life savings on pasta 🍝

Did someone AWESOME forward you this email? Sign up here to receive more Data Science tips!
