I’ll never pay for AI again

Online AI tools are getting ridiculously good, but as soon as you start using them, you'll realize that their subscription costs quickly add up. The free tiers are limited and, more often than not, only give you access to inferior models.

When I built my local coding AI for VS Code, I thought I was done paying for GitHub Copilot. However, between privacy concerns and ridiculously high subscription costs, I realized that I don't need to pay for AI anymore if I can just run models locally.

Local AI models are getting quite good

Free tools that rival big-name AI services

The open-source AI landscape has exploded with some genuinely impressive models. While there's no one-size-fits-all solution like GPT-5, there are models like Llama, Mistral, Gemma, and DeepSeek that are incredibly good at specific tasks. And since you're running these models locally, you can pick and choose which model you want for a specific task without the hassle.

These models also stack up rather well against their paid counterparts. In many cases, a well-chosen open-source model running locally can match the performance of a web-based AI like ChatGPT or GitHub Copilot for specific tasks.

Local AI model running on VS Code.
Credit: Yadullah Abidi / MakeUseOf

The variety is staggering: you'll have hundreds of AI models to play around with, if not more. Just keep in mind that the greater the number of parameters, the higher an AI model's hardware requirements will be.

As a general rule of thumb, you need at least 8GB of RAM available to run the 7B models, 16GB for 13B models, and 32GB for 33B models. It also helps to have a ton of disk space, as some of these AI models can be quite big.
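
If you want a rough sanity check before downloading anything, multiply the parameter count by the bytes each parameter takes at your chosen quantization level, then add some overhead. Here's a minimal Python sketch of that estimate; the bytes-per-parameter figures are common approximations rather than exact numbers, and the rule of thumb above deliberately leaves headroom for your OS and the model's context.

# Back-of-envelope RAM estimate for a local LLM.
# Assumption: quantized weights dominate memory use; real usage adds
# context (KV cache) and runtime overhead, so treat this as a floor.
BYTES_PER_PARAM = {
    "fp16": 2.0,  # full half-precision weights
    "q8": 1.0,    # 8-bit quantization
    "q4": 0.5,    # 4-bit quantization, common for local use
}

def estimate_ram_gb(params_billions: float, quant: str = "q4",
                    overhead: float = 1.2) -> float:
    """Approximate RAM needed to load a model, in gigabytes."""
    weight_bytes = params_billions * 1e9 * BYTES_PER_PARAM[quant]
    return weight_bytes * overhead / 1e9

for size in (7, 13, 33):
    print(f"{size}B at 4-bit: ~{estimate_ram_gb(size):.1f} GB")

At 4-bit quantization, a 7B model works out to roughly 4GB of weights, which is why 8GB of available RAM is a comfortable floor rather than a hard minimum.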

You can run AI models locally on just about any modern machine. Models with lower parameter counts won't be able to handle more complex tasks, but will be faster to use. The key is picking the right model for your requirements.

Running local AI with LM Studio

A simple app that makes local AI effortless

Running these models is also surprisingly easy. You can enjoy the benefits of a local LLM with plenty of apps, including LM Studio and Ollama.

Both tools give you access to a wide range of open-source AI models that you can download and run to suit your requirements. I prefer LM Studio because it comes with an easy-to-use graphical interface, but feel free to try other options to find what works best for you.
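
If you'd rather script against a model than chat with it, Ollama exposes a local HTTP API (on port 11434 by default) while it's running. Here's a minimal Python sketch, assuming you've already downloaded a model with ollama pull mistral:

# Minimal sketch: query a local Ollama server (default port 11434).
# Assumes Ollama is running and `ollama pull mistral` has completed.
import json
import urllib.request

payload = json.dumps({
    "model": "mistral",
    "prompt": "Explain what a context window is in one sentence.",
    "stream": False,  # return one JSON object instead of a token stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])

Nothing in that exchange leaves your machine, which matters for the privacy argument later in this piece.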

Installing and setting up LM Studio with an AI model is a rather easy process. Download LM Studio from the official website and run the installer. Follow these steps after running LM Studio for the first time:

  1. You may be prompted to go through a setup wizard. This can be skipped by clicking the grey Skip button on the top-right.
  2. Once the main interface loads up, LM Studio should automatically start downloading any drivers or updates it needs. Wait for these to finish before proceeding.
  3. Click the magnifying glass icon to open the Discover tab and search for the model you want to download. Click the green Download button at the bottom left to proceed.
  4. Once the model is done downloading, head over to the Chat section and click the dropdown at the top of the display to load your downloaded model.

At this point, you should be able to start chatting with the downloaded AI model. Depending on the model you're using, you might be able to get it to read and analyze files as well. LM Studio clearly marks each model's capabilities, so you can also find models that can read or even generate images.
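
LM Studio can also serve whatever model you've loaded over an OpenAI-compatible local API (enabled from its server view, listening on http://localhost:1234 by default), so existing OpenAI-client tooling can simply point at it. Here's a minimal sketch using the openai Python package; the model name is a placeholder for whatever identifier LM Studio shows for your loaded model, and the API key is ignored locally but required by the client:

# Minimal sketch: chat with a model served by LM Studio's local server.
# Assumes the server is running in LM Studio at http://localhost:1234.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder: use the identifier LM Studio shows
    messages=[
        {"role": "user", "content": "Write a regex that matches an ISO 8601 date."},
    ],
)
print(response.choices[0].message.content)

This is also how a local model can stand in for a cloud coding assistant: any editor extension that accepts a custom OpenAI-compatible endpoint can point at the local server instead.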

I can finally use AI for sensitive work

Keep your data private

I already stacked free chatbots so I never pay a cent for AI, but those were still web tools that trained on my data, which meant using AI for sensitive work like coding was out of the picture. Besides, most AI chatbots have usage policies that prohibit you from using their output for commercial purposes unless you have the appropriate subscription.

Running AI locally frees me from such restrictions. The conversations with my AI chatbot never leave my PC, which means I can comfortably integrate AI into projects I never would have before. There's no data collection, no training on your inputs, and no corporate oversight of your creative process. There is an AI chatbot that protects your privacy, but it's not quite the same as running your AI locally.

LM Studio with a DeepSeek R1 chat.
Credit: Yadullah Abidi / MakeUseOf

This isn't just about data privacy either. Running AI models locally gives me creative freedom without worrying about data retention policies or potential misuse. I can use AI for sensitive work projects, personal writing, and any experimental ideas without worry. For professionals handling confidential information, the privacy aspect alone justifies the switch.

Break free from AI subscriptions

Own your AI setup

If you're tired of paying monthly subscriptions for AI tools, download LM Studio or one of its alternatives and spend an afternoon playing with AI models. Mistral 7B Instruct and Gemma 3 are good models to start with.

You don't need to go cold turkey with all your subscriptions right away. Keep the important ones as you experiment, then gradually reduce them as you find local models that meet your needs. I was able to cancel pretty much all of my subscriptions within the first month.

Local AI has benefits that online tools don't, and in my opinion, that's well worth the trade-off of losing some performance or intelligence. I won't pay for AI subscriptions because I don't have to. Neither do you.
