Hi Reader,
How's your January going? I've been playing a lot of Pokémon (the card game) with my 7-year-old... actually I'm the one who's always bugging him to play 😂
Are you a Pokémon fan? Meowscarada ex, anyone?
Introduction to Polars (Practical Business Python)
Have you heard of Polars? In short, it's a high-performance, memory-efficient alternative to pandas. If you're new to Polars, this blog post walks through basic Polars code and compares it to pandas.
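For a quick taste of the syntax, here's a minimal sketch of my own (not from the blog post; the file name and column names are hypothetical):

```python
import polars as pl

# Read a CSV, filter rows, and aggregate by group (hypothetical file/columns)
df = pl.read_csv("sales.csv")
result = (
    df.filter(pl.col("amount") > 100)
      .group_by("region")
      .agg(pl.col("amount").sum())
)
print(result)

# Rough pandas equivalent:
# result = df[df["amount"] > 100].groupby("region")["amount"].sum()
```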
When performing supervised Machine Learning, one of the keys to success is effective data preprocessing, which can require a lot of thought and planning.
However, there's a scikit-learn model which has two magical properties that significantly reduce your preprocessing burden. (And you've probably never even heard of it!)
It's called Histogram-Based Gradient Boosted Trees (HGBT). Here are its magical properties:

1. It handles missing values natively
2. It handles unordered categorical features natively

What exactly does that mean? I'll explain below! 👇
When your training data contains missing values, normally you have to impute all missing values as part of the data preprocessing step. (The only alternative is to drop samples or features with missing values, which can mean losing valuable training data!)
But if you use HGBT, it will handle the missing values without imputing or dropping any data!
Here's a minimal example from the scikit-learn documentation:
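(The snippet below is a sketch in the same spirit as the documentation example, using my own toy data:)

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier

# Training data containing a missing value -- no imputation required
X = np.array([[0], [1], [2], [np.nan]])
y = [0, 0, 1, 1]

# min_samples_leaf=1 only because this toy dataset is so tiny
clf = HistGradientBoostingClassifier(min_samples_leaf=1).fit(X, y)
print(clf.predict(X))  # the NaN sample is handled natively at split time
```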
Here's a longer example from one of my videos.
Note: Decision Trees also support missing values as of scikit-learn 1.3, and Random Forests support missing values as of scikit-learn 1.4.
When your training data contains unordered categorical features, normally you have to one-hot encode them as part of the data preprocessing step.
But if you use HGBT, it will handle the categorical features without encoding!
Starting in scikit-learn 1.4, HGBT can also infer which features are categorical directly from the data types of a pandas DataFrame (by setting categorical_features="from_dtype")!
Here's an example from the scikit-learn documentation:
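(Along those lines, here's a minimal sketch with my own toy data, not the documentation's:)

```python
import pandas as pd
from sklearn.ensemble import HistGradientBoostingClassifier

# A toy DataFrame with a pandas Categorical column -- no encoding needed
X = pd.DataFrame({
    "color": pd.Categorical(["red", "blue", "green", "blue", "red", "green"]),
    "price": [5.0, 8.0, 12.0, 7.5, 4.5, 11.0],
})
y = [0, 0, 1, 0, 0, 1]

# "from_dtype" (scikit-learn 1.4+) infers categorical features from the dtypes
clf = HistGradientBoostingClassifier(
    categorical_features="from_dtype", min_samples_leaf=1
)
clf.fit(X, y)
print(clf.predict(X))
```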
Here's a longer comparison of native categorical support versus one-hot encoding and ordinal encoding.
If you want to learn more about HGBT, check out the scikit-learn user guide.
Or if you're new to scikit-learn, check out one of my FREE scikit-learn courses!
Did you like this week’s tip? Please send it to a friend or share this link on social. It really helps me out! 🙌
See you next Tuesday!
- Kevin
P.S. Microsoft Excel World Championship (with live commentary)
Did someone AWESOME forward you this email? Sign up here to receive Data Science tips every week!