Tuesday Tip #37: Use scikit-learn's magical model ✨


Hi Reader,

How's your January going? I've been playing a lot of Pokémon (the card game) with my 7-year-old... actually I'm the one who's always bugging him to play 😂

Are you a Pokémon fan? Meowscarada ex, anyone?


🔗 Link of the week

Introduction to Polars (Practical Business Python)

Have you heard of Polars? In short, it's a high-performance, memory-efficient alternative to pandas. If you're new to Polars, this blog post walks through basic Polars code and compares it to pandas.


👉 Tip #37: Simplify data preprocessing with this scikit-learn model

When performing supervised Machine Learning, one of the keys to success is effective data preprocessing, which can require a lot of thought and planning.

However, there's a scikit-learn model which has two magical properties that significantly reduce your preprocessing burden. (And you've probably never even heard of it!)

It's called Histogram-Based Gradient Boosted Trees (HGBT), implemented as HistGradientBoostingClassifier and HistGradientBoostingRegressor. Here are its magical properties:

  1. Native support for missing values
  2. Native support for categorical features

What exactly does that mean? I'll explain below! 👇


1๏ธโƒฃ Native support for missing values

When your training data contains missing values, normally you have to impute all missing values as part of the data preprocessing step. (The only alternative is to drop samples or features with missing values, which can mean losing valuable training data!)

But if you use HGBT, it will handle the missing values without imputation or dropping any data!

Here's a minimal example from the scikit-learn documentation:

Here's a longer example from one of my videos.

Note: Decision Trees also support missing values as of scikit-learn 1.3, and Random Forests support missing values as of scikit-learn 1.4.


2๏ธโƒฃ Native support for categorical features

When your training data contains unordered categorical features, normally you have to one-hot encode them as part of the data preprocessing step.

But if you use HGBT, it will handle the categorical features without encoding!

Starting in scikit-learn 1.4, HGBT will also infer which features are categorical directly from the data types of a pandas DataFrame!

Here's an example from the scikit-learn documentation:

Here's a longer comparison of native categorical support versus one-hot encoding and ordinal encoding.


Going further

If you want to learn more about HGBT, check out the scikit-learn user guide.

Or if you're new to scikit-learn, check out one of my FREE scikit-learn courses!


👋 Until next time

Did you like this week's tip? Please send it to a friend or share this link on social. It really helps me out! 🙌

See you next Tuesday!

- Kevin

P.S. Microsoft Excel World Championship (with live commentary)

Did someone AWESOME forward you this email? Sign up here to receive Data Science tips every week!
