
Learn Data Science from Data School πŸ“Š

Tuesday Tip #37: Use scikit-learn's magical model ✨

Published 3 months agoΒ β€’Β 1 min read

Hi Reader,

How's your January going? I've been playing a lot of PokΓ©mon (the card game) with my 7-year-old... actually I'm the one who's always bugging him to play πŸ˜‚

Are you a PokΓ©mon fan? Meowscarada ex, anyone?


πŸ”— Link of the week

​Introduction to Polars (Practical Business Python)

Have you heard of Polars? In short, it's a high-performance, memory-efficient alternative to pandas. If you're new to Polars, this blog post walks through basic Polars code and compares it to pandas.


πŸ‘‰ Tip #37: Simplify data preprocessing with this scikit-learn model

When performing supervised Machine Learning, one of the keys to success is effective data preprocessing, which can require a lot of thought and planning.

However, there's a scikit-learn model which has two magical properties that significantly reduce your preprocessing burden. (And you've probably never even heard of it!)

It's called Histogram-Based Gradient Boosted Trees (HGBT). Here are its magical properties:

  1. Native support for missing values
  2. Native support for categorical features

What exactly does that mean? I'll explain below! πŸ‘‡


1️⃣ Native support for missing values

When your training data contains missing values, normally you have to impute all missing values as part of the data preprocessing step. (The only alternative is to drop samples or features with missing values, which can mean losing valuable training data!)
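For context, the imputation step might look something like this (a minimal sketch with made-up toy data, using scikit-learn's SimpleImputer):

    import numpy as np
    from sklearn.impute import SimpleImputer

    # Toy feature matrix with a missing value (made-up data)
    X = np.array([[1.0, 2.0],
                  [np.nan, 3.0],
                  [7.0, 6.0]])

    # Replace each missing value with the mean of its column
    imputer = SimpleImputer(strategy="mean")
    X_imputed = imputer.fit_transform(X)
    print(X_imputed)  # the NaN becomes 4.0 (the mean of 1.0 and 7.0)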

But if you use HGBT, it will handle the missing values without imputation or dropping any data!

Here's a minimal example, based on the one in the scikit-learn documentation (sketched below with toy data; the key point is that the model is fit directly on data containing NaN):
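    import numpy as np
    from sklearn.ensemble import HistGradientBoostingClassifier

    # Toy data: one feature with a missing value (made-up data)
    X = np.array([0, 1, 2, np.nan]).reshape(-1, 1)
    y = [0, 0, 1, 1]

    # No imputation needed: HGBT learns how to route missing values at each split
    gbdt = HistGradientBoostingClassifier(min_samples_leaf=1).fit(X, y)
    print(gbdt.predict(X))  # expected: [0 0 1 1]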

Here's a longer example from one of my videos.

Note: Decision Trees also support missing values as of scikit-learn 1.3, and Random Forests support missing values as of scikit-learn 1.4.


2️⃣ Native support for categorical features

When your training data contains unordered categorical features, normally you have to one-hot encode them as part of the data preprocessing step.
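A typical encoding step might look something like this (a minimal sketch with made-up data, using scikit-learn's OneHotEncoder):

    import pandas as pd
    from sklearn.preprocessing import OneHotEncoder

    # Toy DataFrame with one unordered categorical feature (made-up data)
    X = pd.DataFrame({"color": ["red", "blue", "green", "red"]})

    # One-hot encode: each category becomes its own binary column
    encoder = OneHotEncoder(sparse_output=False)
    X_encoded = encoder.fit_transform(X)
    print(encoder.get_feature_names_out())  # ['color_blue' 'color_green' 'color_red']
    print(X_encoded)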

But if you use HGBT, it will handle the categorical features without encoding!

Starting in scikit-learn 1.4, HGBT will also infer which features are categorical directly from the data types of a pandas DataFrame!

Here's an example, based on the one in the scikit-learn documentation (sketched below with toy data, assuming scikit-learn 1.4+):
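    import pandas as pd
    from sklearn.ensemble import HistGradientBoostingClassifier

    # Toy DataFrame: a pandas categorical feature plus a numeric feature (made-up data)
    X = pd.DataFrame({
        "color": pd.Categorical(["red", "blue", "red", "green", "blue", "green"]),
        "size": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    })
    y = [0, 0, 0, 1, 1, 1]

    # "from_dtype" (new in scikit-learn 1.4) tells HGBT to treat pandas
    # categorical columns as categorical features -- no encoding needed
    clf = HistGradientBoostingClassifier(categorical_features="from_dtype",
                                         min_samples_leaf=1)
    clf.fit(X, y)
    print(clf.predict(X))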

Here's a longer comparison of native categorical support versus one-hot encoding and ordinal encoding.


Going further

If you want to learn more about HGBT, check out the scikit-learn user guide.

Or if you're new to scikit-learn, check out one of my FREE scikit-learn courses!


πŸ‘‹ Until next time

Did you like this week’s tip? Please send it to a friend or share this link on social. It really helps me out! πŸ™Œ

See you next Tuesday!

- Kevin

P.S. Microsoft Excel World Championship (with live commentary)

Did someone AWESOME forward you this email? Sign up here to receive Data Science tips every week!
