How to Access a Local Ollama API Over LAN on Windows

Expose Ollama API to your local network on Windows by setting the host, allowing firewall ports, and verifying with curl.

If you want other devices on the same LAN to access your local Ollama API, follow these steps.

Set the listening host

By default, Ollama listens only on localhost (127.0.0.1). Set the OLLAMA_HOST environment variable so it listens on all network interfaces instead:

OLLAMA_HOST=0.0.0.0:11434
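On Windows you can set this variable from a terminal; this is a minimal sketch assuming the default Ollama port 11434:

```shell
# Persist OLLAMA_HOST for the current user (takes effect in newly opened terminals).
# 0.0.0.0 binds Ollama to all interfaces instead of only 127.0.0.1.
setx OLLAMA_HOST "0.0.0.0:11434"

# For the current Command Prompt session only:
set OLLAMA_HOST=0.0.0.0:11434
```

Restart the Ollama app or service after changing the variable so the new binding takes effect.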

Open the firewall

In Windows Firewall advanced settings, create an inbound rule that allows the Ollama port (11434 by default):

  1. Press Win + S, search for and open “Windows Defender Firewall”.
  2. Click “Advanced settings”.
  3. Select “Inbound Rules” -> “New Rule…”.
  4. Choose “Port”, then click “Next”.
  5. Select the protocol (TCP for Ollama), enter 11434 in “Specific local ports”, then click “Next”.
  6. Choose “Allow the connection”, then click “Next”.
  7. In “Profile”, select Domain, Private, and Public as needed (Private is usually enough for a home LAN), then click “Next”.
  8. Name the rule (for example OpenPort11434) and click “Finish”.
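The same rule can also be created from an elevated Command Prompt. A sketch using Windows' built-in netsh tool, assuming the default Ollama port 11434 (the rule name is arbitrary):

```shell
# Create an inbound firewall rule allowing TCP traffic on port 11434.
# Run from an elevated (Administrator) Command Prompt.
netsh advfirewall firewall add rule name="Ollama 11434" dir=in action=allow protocol=TCP localport=11434
```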

Run Ollama

ollama run <model>
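Once the server is up, you can confirm it is bound to all interfaces rather than only loopback. A quick check on Windows, assuming the default port:

```shell
# Look for a LISTENING entry on 0.0.0.0:11434 (not 127.0.0.1:11434).
netstat -ano | findstr 11434
```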

Access the model through the API

From another device on the LAN, replace 192.168.x.xxx with the host machine's LAN IP address:

curl http://192.168.x.xxx:11434/api/generate -d '{
  "model": "gemma4",
  "prompt": "What model is this?"
}'
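By default the generate endpoint streams the reply token by token as newline-delimited JSON. If you prefer a single JSON object, the API accepts a stream flag; a sketch (the IP placeholder and model name are taken from the example above, substitute your own):

```shell
# Disable streaming so the full reply arrives as one JSON object.
curl http://192.168.x.xxx:11434/api/generate -d '{
  "model": "gemma4",
  "prompt": "What model is this?",
  "stream": false
}'
```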