Mission RHW

Local vs cloud AI:
where should your data live?

Where your AI runs determines who can see your data, what you pay every month, and what happens when the internet goes down. Most people have not been told there is a choice.

Reading time · 6 minutes
For you if · you handle private client information and are wondering how AI fits with that

If your mental model of AI is “it is a thing on the internet that I talk to,” you are describing a specific version of it. There is another version that runs on your own computer, never sends anything anywhere, and works whether or not you have a connection. Most people do not know about the second version because the companies selling the first version have a large marketing budget.

The choice between them is one of the most consequential decisions in a business automation build. It determines your data exposure, your monthly costs, and whether you actually own the thing you paid for.

Chapter 01

What “cloud AI” actually means.

Cloud AI means the model runs on someone else's server. ChatGPT, Claude, Gemini — when you type a message, that message travels from your device to a data centre owned by a large company, is processed there, and an answer comes back. The model is not on your machine. You are borrowing computing power from someone else.

For a lot of tasks, this is fine. Drafting a tweet, summarising a news article, generating a product description. Nothing in those tasks is sensitive. You would not care if someone at the data centre read them.

The problem comes when you send something you would care about. A client's session notes. An asylum application. Medical records. Financial information under professional confidence. When that data travels to someone else's server, you have created an exposure that your client did not consent to and that you, depending on your profession, may be legally responsible for.

Chapter 02

What “local AI” means.

Local AI means the model runs on a computer you own — your laptop, a dedicated machine in your office, a server on your own network. The data never leaves the building. There is no internet transmission involved in the processing step. You can run it with the router unplugged and it still works.

The models that run locally are capable enough for most business tasks. Document reading, summarising, drafting, classification, answering questions about a specific set of files. The same core capabilities you would use cloud AI for, but on hardware you own.
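To make "on hardware you own" concrete, here is a minimal sketch of asking a local model to summarise a document. It assumes Ollama, a popular local-model runner, is installed on the machine with a model pulled (the model name "llama3" and the file path are placeholders). The endpoint is `localhost`: the prompt and the document never cross the network boundary.

```python
import json
import urllib.request

# Ollama serves models it has downloaded on your own machine at this
# address. Everything sent here stays on localhost.
LOCAL_ENDPOINT = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a request to the machine's own model server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def summarise_locally(text: str) -> str:
    """Send a summarisation prompt to the local model and return its answer."""
    req = build_request(f"Summarise the following in three sentences:\n\n{text}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Hypothetical file name for illustration; works with the router unplugged.
    with open("client_notes.txt") as f:
        print(summarise_locally(f.read()))
```

This is the processing step the chapter describes: the request goes to your own machine and nowhere else, so there is nothing for a third party to log or retain.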

Tom · Immigration adviser — when we built his case review system, there was no question about where the AI would run. His clients' evidence included asylum claims, family histories, and medical evidence. That data stays on his office machine. It is processed there. The only thing that leaves the building is the summary brief he chooses to share with his client. He can demonstrate data residency to any regulator who asks.

Chapter 03

The cost difference over time.

Cloud AI charges per use. Usually per chunk of text processed (billed in "tokens", roughly three-quarters of a word each), or per request, or per minute of audio. At low volumes this is negligible. At business volumes, especially for document-heavy workflows, it accumulates.

Local AI has no running cost beyond electricity. The model is on the machine. There is no API call. There is no monthly bill from an AI company. If you process ten documents or ten thousand, the cost is the same.

The hardware cost is the one-off investment. A machine capable of running a good local AI model costs roughly what a high-end laptop costs. For a business that processes significant volumes, the payback period is short. For a solo practice, the decision is sometimes as much about privacy as about cost.
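The payback claim is simple arithmetic. Here is a back-of-envelope sketch; every figure is an illustrative assumption, not a quote, so substitute your own volumes and supplier pricing.

```python
# Illustrative assumptions only — adjust to your own pricing and volumes.
CLOUD_COST_PER_DOC = 0.12      # assumed cloud fee per document processed (£)
LOCAL_HARDWARE_COST = 2500.00  # assumed one-off cost of a capable machine (£)
LOCAL_POWER_PER_DOC = 0.002    # assumed electricity per document (£)

def months_to_breakeven(docs_per_month: int) -> float:
    """Months until cumulative cloud fees exceed the one-off hardware cost."""
    saving_per_month = docs_per_month * (CLOUD_COST_PER_DOC - LOCAL_POWER_PER_DOC)
    return LOCAL_HARDWARE_COST / saving_per_month

for volume in (100, 1_000, 10_000):
    print(f"{volume:>6} docs/month → breakeven in {months_to_breakeven(volume):.1f} months")
```

Under these assumed numbers, a high-volume practice recovers the hardware cost within a few months, while a low-volume solo practice may take years — which is why, at low volume, privacy rather than cost tends to drive the decision.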

Chapter 04

When cloud makes sense.

Local AI is not the answer to every situation. The cloud has real advantages worth naming honestly.

  • The data is not sensitive. Marketing content, public research, general writing assistance. Nothing that would cause a problem if it appeared on someone else's screen. Cloud is fine here.
  • You need the most capable model available. The frontier models — the ones that perform best on complex reasoning — are cloud-only at present. For tasks requiring the highest capability, the cloud is where to look.
  • You don't have the hardware. Local AI needs a reasonably capable machine. If you are working on an old laptop, cloud is the practical option until you have the infrastructure for local.
  • You want speed without investment. Cloud AI is ready today with no setup. Local AI requires initial configuration. For a proof-of-concept or a short-term project, cloud makes sense.
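The checklist above can be condensed into a small decision sketch. The function name and the four questions are my own framing of the article's rules, not a formal policy; note that data sensitivity overrides every convenience factor.

```python
def where_should_it_run(
    data_is_sensitive: bool,
    needs_frontier_model: bool,
    has_capable_hardware: bool,
    short_term_project: bool,
) -> str:
    """Condense the checklist: sensitive data forces local; otherwise
    convenience factors point to cloud; with no constraint, decide on cost."""
    if data_is_sensitive:
        return "local"   # client records never leave the building
    if needs_frontier_model or not has_capable_hardware or short_term_project:
        return "cloud"   # convenience wins when nothing private is at stake
    return "either"      # no hard constraint; let the cost arithmetic decide
```

For example, an immigration adviser handling asylum evidence gets "local" regardless of the other answers, while a marketer drafting public copy on an old laptop gets "cloud".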

Chapter 05

Red flags in how builders talk about this.

  • “It's encrypted, so it's private.”

    Encryption protects data in transit. It does not change whose server the data lands on or who processes it. A builder who conflates the two is either confused or hoping you will be.

  • They cannot tell you which AI service processes your data.

    Every cloud AI build uses a specific service. If the builder cannot name it, link to its privacy policy, and point to the relevant clause on data usage, they have not done this work. That is a problem before you share anything.

  • They assume cloud without asking about your data.

    Cloud is the default because it is faster and easier to build against. A good builder asks about your data before they choose where the AI runs. A builder who proposes a cloud solution for sensitive data without discussing this has not thought about your situation specifically.

The short version.

If the data is public or not sensitive, cloud AI is convenient and capable. If the data is private — client records, legal files, health information, financial data under professional duty — local AI is the only defensible choice, both ethically and, in most jurisdictions, legally.

If you are not sure how your data should be classified, the article on where your data should live covers the three buckets in plain English.