
What Changes When AI Runs Locally Instead of in the Cloud

By John Britton

In healthcare, AI is often discussed as if it means one thing: a cloud-based tool that sends information to remote servers for processing. That makes sense, because those are the systems most people have encountered. But it leaves out an important distinction. Some AI tools run in the cloud, while others can run locally on the computer in front of you. From a privacy and HIPAA standpoint, those are not the same situation.

That difference affects whether protected health information has to travel over the internet, whether another company may be handling PHI on your behalf, and whether issues like vendor data practices and business associate agreements enter the picture. When local AI is not even on the radar, AI privacy in healthcare can start to sound like a single issue with a single answer, which it is not.

Cloud AI

People often hear “AI” and think of popular chatbots like ChatGPT or Claude. These are massive generative large language models that usually run in the cloud, meaning the model runs on remote hardware in a data center rather than on your own computer. Your prompt is sent over the internet to that remote system, and the response travels back to your chatbot window. The benefit is that these models are extremely smart and capable of a wide variety of tasks. The downsides include cost, privacy concerns, and questions about how user data is handled.
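To make that round trip concrete, here is a minimal sketch of what calling a cloud-hosted model can look like in code. It uses the OpenAI Python client purely as an illustration; the model name and prompt are placeholders, and the same pattern applies to other cloud AI vendors.

```python
# A minimal sketch of a cloud AI call: the prompt leaves your machine,
# is processed in a vendor's data center, and the response travels back.
# Assumes the openai package is installed and an API key is configured.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; runs on remote hardware
    messages=[{"role": "user", "content": "Rewrite this note in a professional tone: ..."}],
)

print(response.choices[0].message.content)  # the reply, after its round trip
```

Everything between sending the request and receiving the response happens on hardware you do not control, which is exactly where the questions below about data handling come in.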

In clinical work, that setup has specific implications for PHI and HIPAA. If information about our clients is traveling through the cloud, that raises important questions about where the data goes, who can access it, and whether a BAA may be required. Many HIPAA-compliant AI solutions address this by encrypting data as it travels through the cloud and by establishing business associate agreements where appropriate. Of course, no protection is guaranteed, but these steps are meant to reduce risk and clarify how client data is handled. Companies then decide whether they store, access, or otherwise handle that data, and they are expected to explain those practices to users in their privacy policies.

Local AI

Local AI means the AI “brains,” also known as the model weights, run on the computer in front of you rather than in a data center. The processing happens on your machine, so no information needs to be sent over the internet for the model to respond. It also means the model relies on your own computer’s hardware and available memory to run. This adds an extra layer of privacy to your interactions with the AI model. It does not automatically make a local AI tool HIPAA compliant, though; compliance depends not only on the technical environment but also on how the tool is used. It is also still important to understand the privacy policy of whatever software is being used to run the local AI model and whether any of your data is shared with that company.
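For contrast, here is a minimal sketch of local inference. It uses the open-source llama-cpp-python library as one example of a local runner; the model file path is a placeholder, and the weights would need to be downloaded to your machine before this runs.

```python
# A minimal sketch of local AI: the model weights live on your own disk,
# and the prompt never leaves this machine. Uses llama-cpp-python as one
# example of a local runner; the model path below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="./models/example-model.gguf")  # weights loaded from local disk

output = llm(
    "Rewrite this sentence in a more professional tone: ...",
    max_tokens=128,  # generation happens on your own CPU/GPU and memory
)

print(output["choices"][0]["text"])  # produced without any network call
```

The key difference from the cloud sketch above is the absence of a network request: once the weights are on disk, both the prompt and the response stay on the machine.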

As a side note, this same idea can apply to speech-to-text. In LocalScribe, dictation and session recording transcription also rely on local speech-to-text models whose “brains” live on the computer, which is why those model weights have to be downloaded in order for those features to work.
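To illustrate the general idea (this is a sketch of local speech-to-text in general, not LocalScribe’s actual implementation), here is what that can look like with the open-source whisper package; the audio filename is a placeholder.

```python
# A sketch of local speech-to-text: the model weights are downloaded once,
# then all transcription runs on this machine. Requires the openai-whisper
# package (and ffmpeg) installed locally; the filename is a placeholder.
import whisper

model = whisper.load_model("base")        # weights are fetched on first use, then cached locally
result = model.transcribe("session.wav")  # audio is processed entirely on-device

print(result["text"])
```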

In healthcare, local AI changes the privacy and compliance conversation. In cloud-based solutions, another company may be handling PHI on your behalf, typically with encryption and other safeguards in place, which is why a business associate agreement is often required. If the AI processing stays local and no PHI is being shared with a vendor through their servers, then a BAA may not be needed in the same way. For clinicians who are especially privacy conscious and want to reduce the possibility of data leakage, mishandling, or unnecessary outside access, keeping that processing on-device adds an extra layer of protection.

LocalScribe as an Example

Let’s use the analogy of a text editor. If you use a text editor on your computer to write a report, run spell check, and rely on other built-in tools to refine that documentation, a BAA would not typically be needed, because that information is not being handled by another company. As the user, you are still responsible for safeguarding where that document lives and how it is stored, such as in an EHR. LocalScribe can be viewed as a more powerful text editor that helps refine and shape a document on your device, with polished writing as the output. Because that processing stays on the clinician’s own computer rather than being handled through a vendor’s servers, a business associate agreement may not be needed in the same way it often is for a cloud-based AI service, though HIPAA considerations still depend on how information is handled, stored, and used within a given workflow.

If you are deciding whether to use AI in clinical practice, it is worth understanding where the AI “brains” live, where data has to travel, and whether anyone else has access to that data at any point. The question is not just whether a tool uses AI, but how that AI is run and where client information goes in the process.

Subscribe for future posts

If you want new writing at the intersection of AI and psychology, including the ethics and implementation of AI in clinical practice, subscribe on Substack.


The views expressed here are my own and do not necessarily reflect the views of any current or future employer, training site, academic institution, or affiliated organization.