Tuesday Tip #37: Use scikit-learn's magical model ✨


Hi Reader,

How's your January going? I've been playing a lot of Pokémon (the card game) with my 7-year-old... actually, I'm the one who's always bugging him to play 😂

Are you a Pokémon fan? Meowscarada ex, anyone?


🔗 Link of the week

Introduction to Polars (Practical Business Python)

Have you heard of Polars? In short, it's a high-performance, memory-efficient alternative to pandas. If you're new to Polars, this blog post walks through basic Polars code and compares it to pandas.


👉 Tip #37: Simplify data preprocessing with this scikit-learn model

When performing supervised Machine Learning, one of the keys to success is effective data preprocessing, which can require a lot of thought and planning.

However, there's a scikit-learn model which has two magical properties that significantly reduce your preprocessing burden. (And you've probably never even heard of it!)

It's called Histogram-Based Gradient Boosted Trees (HGBT), available in scikit-learn as HistGradientBoostingClassifier and HistGradientBoostingRegressor. Here are its magical properties:

  1. Native support for missing values
  2. Native support for categorical features

What exactly does that mean? I'll explain below! 👇


1️⃣ Native support for missing values

When your training data contains missing values, normally you have to impute all missing values as part of the data preprocessing step. (The only alternative is to drop samples or features with missing values, which can mean losing valuable training data!)
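For example, that imputation step usually looks something like this (a minimal sketch with made-up data, filling each missing value with its column's mean):

```python
import numpy as np
from sklearn.impute import SimpleImputer

# toy feature matrix with missing values
X = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [7.0, np.nan]])

# replace each NaN with the mean of its column
imputer = SimpleImputer(strategy="mean")
X_imputed = imputer.fit_transform(X)
print(X_imputed)  # the NaNs become 4.0 and 2.5 (the column means)
```

It's an extra step you have to fit on training data only (to avoid leakage) and then apply to new data, which is exactly the burden HGBT lets you skip.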

But if you use HGBT, it will handle the missing values without imputation or dropping any data!

Here's a minimal example from the scikit-learn documentation:

Here's a longer example from one of my videos.

Note: Decision Trees also support missing values as of scikit-learn 1.3, and Random Forests support missing values as of scikit-learn 1.4.


2️⃣ Native support for categorical features

When your training data contains unordered categorical features, normally you have to one-hot encode them as part of the data preprocessing step.
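That encoding step typically means wiring up a ColumnTransformer, something like this sketch (the column names and data are made up for illustration):

```python
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression

# toy data; "color" is an unordered categorical feature
X = pd.DataFrame({"color": ["red", "blue", "red", "green"],
                  "price": [1.0, 2.0, 3.0, 4.0]})
y = [0, 0, 1, 1]

# one-hot encode "color", pass "price" through unchanged
preprocess = make_column_transformer(
    (OneHotEncoder(handle_unknown="ignore"), ["color"]),
    remainder="passthrough",
)
model = make_pipeline(preprocess, LogisticRegression()).fit(X, y)
```

It works, but you have to list the categorical columns yourself and keep the pipeline in sync as features change.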

But if you use HGBT, it will handle the categorical features without encoding!

Starting in scikit-learn 1.4, HGBT will also infer which features are categorical directly from the data types of a pandas DataFrame!

Here's an example from the scikit-learn documentation:

Here's a longer comparison of native categorical support versus one-hot encoding and ordinal encoding.


Going further

If you want to learn more about HGBT, check out the scikit-learn user guide.

Or if you're new to scikit-learn, check out one of my FREE scikit-learn courses!


👋 Until next time

Did you like this week’s tip? Please send it to a friend or share this link on social. It really helps me out! 🙌

See you next Tuesday!

- Kevin

P.S. Microsoft Excel World Championship (with live commentary)

Did someone AWESOME forward you this email? Sign up here to receive Data Science tips every week!

