Tip #60: Run AI models locally (it's private & free!)


Hi Reader,

Most of us access Large Language Models (LLMs) through a web interface, like ChatGPT or Claude. It’s highly convenient, though there are two potential drawbacks:

  • Cost: Some amount of usage is free, but heavy usage (or access to premium models) costs money.
  • Privacy: Depending on the service, your chats may be used to train future models. (Or at the very least, your chats may be accessed if ordered by a court.)

One solution is to run an LLM locally, which has gotten much easier with the release of Ollama’s new app. I’ll tell you all about it below! 👇


Sponsored by: Superhuman AI

Find out why 1M+ professionals read Superhuman AI daily.

AI won't take over the world. People who know how to use AI will.

Here's how to stay ahead with AI:

  1. Sign up for Superhuman AI. The AI newsletter read by 1M+ pros.
  2. Master AI tools, tutorials, and news in just 3 minutes a day.
  3. Become 10X more productive using AI.

Join 1 million pros and start learning AI


🧠 Run AI models locally with Ollama

Ollama is a popular open-source project that makes it easy to run LLMs directly on your local machine. Here are the main benefits of Ollama:

  • Cost: Models are free to download and use.
  • Privacy: Your data and prompts never leave your machine.
  • Offline access: Once models are downloaded, they can be used without Internet access.
  • Multiple providers: Hundreds of open-source models are available from dozens of providers.
  • Easy to switch: You can try out many different models from a single interface.

Until recently, Ollama was only available for the command line, but last week they released an app that gives you a graphical interface for chatting with models.

You can also chat with your files, and some models also support image understanding.

Here’s how to get started with Ollama:

  1. Download the app for Mac, Windows, or Linux.
  2. Open the app and start a chat with any of the models listed in the dropdown.
  3. Wait for the selected model to download, after which the model will respond.
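If you'd rather work programmatically, Ollama also exposes a local REST API (by default at http://localhost:11434). Here's a minimal sketch using only the Python standard library, assuming the Ollama app is running and the gemma3 model has been downloaded:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single complete response instead of
    a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return its reply."""
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires the Ollama app running locally):
# print(ask("gemma3", "Explain what a local LLM is in one sentence."))
```

Because the request goes to localhost, your prompt never leaves your machine, which is the whole point of running models locally.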

The only challenging aspect is selecting which model to use! There are many models to choose from, and most of them have multiple parameter sizes.

For example, take the model page for gemma3, which is available in several parameter sizes.

Model sizes can range from less than 1 GB to hundreds of GBs! Generally, larger models will be higher quality, but they will also require more resources to run.
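As a rough rule of thumb, a model's size is about its parameter count times the bytes stored per parameter at its quantization level (real downloads run somewhat larger due to embeddings and metadata). Here's a back-of-the-envelope sketch; the numbers are illustrative estimates, not exact download sizes:

```python
def approx_size_gb(params_billion: float, bits_per_param: int = 4) -> float:
    """Rough model size estimate: parameters x bits per parameter, in GB.

    Many Ollama models ship 4-bit quantized by default; full-precision
    weights (16-bit) are roughly 4x larger.
    """
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return round(bytes_total / 1e9, 1)

print(approx_size_gb(4))                       # 4B params, 4-bit -> ~2.0 GB
print(approx_size_gb(27))                      # 27B params, 4-bit -> ~13.5 GB
print(approx_size_gb(27, bits_per_param=16))   # 27B params, 16-bit -> ~54.0 GB
```

This is why a small model can fit comfortably on a laptop while the largest ones demand a serious GPU (or lots of patience).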

Just a warning: If you have older hardware, it may not be possible to find a model that runs at an acceptable speed while giving high-quality results.

But if you find one that works well for you, it’s pretty exciting to be able to run models locally and for free! 🎉


Thanks for reading, and feel free to share this newsletter with a friend! 🤝

New readers can subscribe here. 💌

- Kevin

P.S. Distract your LLM with cats 🐈

