<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>macOS on KnightLi Blog</title>
        <link>https://www.knightli.com/en/tags/macos/</link>
        <description>Recent content in macOS on KnightLi Blog</description>
        <generator>Hugo -- gohugo.io</generator>
        <language>en</language>
        <lastBuildDate>Mon, 06 Apr 2026 09:38:00 +0800</lastBuildDate><atom:link href="https://www.knightli.com/en/tags/macos/index.xml" rel="self" type="application/rss+xml" /><item>
        <title>Ollama Default Model Storage Path and Migration Guide (Avoid Filling Up C Drive)</title>
        <link>https://www.knightli.com/en/2026/04/06/ollama-model-storage-path-and-migration/</link>
        <pubDate>Mon, 06 Apr 2026 09:38:00 +0800</pubDate>
        
        <guid>https://www.knightli.com/en/2026/04/06/ollama-model-storage-path-and-migration/</guid>
        <description>&lt;p&gt;When running local LLMs, the system drive is often the first thing to run out of space. Ollama stores models under user or system directories by default, so the C drive can fill up quickly unless you plan the storage path ahead of time.&lt;/p&gt;
&lt;h2 id=&#34;common-default-ollama-model-directories&#34;&gt;Common Default Ollama Model Directories
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;Windows: &lt;code&gt;C:\Users\&amp;lt;username&amp;gt;\.ollama\models&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;macOS: &lt;code&gt;~/.ollama/models&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Linux: &lt;code&gt;/usr/share/ollama/.ollama/models&lt;/code&gt; (may vary by installation method)&lt;/li&gt;
&lt;/ul&gt;
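On macOS or Linux you can confirm which directory is actually in effect, and how much space it already uses, before deciding to move anything. A minimal sketch, assuming the defaults above (the OLLAMA_MODELS variable, if set, overrides them):

```shell
# Sketch: resolve the model directory Ollama will use on macOS/Linux.
# OLLAMA_MODELS, if set, overrides the platform default.
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "model directory: $models_dir"
# Show how much space the models already take (if the directory exists).
if [ -d "$models_dir" ]; then
  du -sh "$models_dir"
else
  echo "no models downloaded yet"
fi
```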
&lt;h2 id=&#34;windows-move-the-model-directory-to-a-non-system-drive&#34;&gt;Windows: Move the Model Directory to a Non-System Drive
&lt;/h2&gt;&lt;p&gt;A practical choice is moving model storage to a path like &lt;code&gt;D:\OllamaModels&lt;/code&gt;. The key is setting the &lt;code&gt;OLLAMA_MODELS&lt;/code&gt; system environment variable.&lt;/p&gt;
&lt;h2 id=&#34;1-create-the-target-directory&#34;&gt;1. Create the Target Directory
&lt;/h2&gt;&lt;p&gt;For example, create: &lt;code&gt;D:\OllamaModels&lt;/code&gt;&lt;/p&gt;
&lt;h2 id=&#34;2-configure-the-system-environment-variable&#34;&gt;2. Configure the System Environment Variable
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;Variable name: &lt;code&gt;OLLAMA_MODELS&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Variable value: &lt;code&gt;D:\OllamaModels&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can set it in &amp;ldquo;System Properties -&amp;gt; Advanced -&amp;gt; Environment Variables&amp;rdquo;, or with an admin PowerShell command:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;div class=&#34;chroma&#34;&gt;
&lt;table class=&#34;lntable&#34;&gt;&lt;tr&gt;&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code&gt;&lt;span class=&#34;lnt&#34;&gt;1
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;
&lt;td class=&#34;lntd&#34;&gt;
&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-powershell&#34; data-lang=&#34;powershell&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;no&#34;&gt;System.Environment&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;]::&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;SetEnvironmentVariable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s2&#34;&gt;&amp;#34;OLLAMA_MODELS&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;D:\OllamaModels&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt; &lt;span class=&#34;s2&#34;&gt;&amp;#34;Machine&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;
&lt;/div&gt;
&lt;/div&gt;&lt;h2 id=&#34;3-restart-ollama-or-reboot-the-system&#34;&gt;3. Restart Ollama (or Reboot the System)
&lt;/h2&gt;&lt;p&gt;After setting the variable, restart the Ollama service/app. If you&amp;rsquo;re unsure whether it has taken effect, rebooting the PC is the most reliable option.&lt;/p&gt;
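A macOS note, since that is this tag&amp;rsquo;s topic: the Ollama app launched from Finder does not read shell profiles, so an export in ~/.zshrc only affects terminal sessions. A hedged sketch of setting the variable where GUI apps can see it (the target path here is a hypothetical example):

```shell
# GUI apps on macOS inherit their environment from launchd, not the shell,
# so set the variable there too; it applies to apps launched afterwards.
# Guarded so this is a no-op on other systems; path is a hypothetical example.
if command -v launchctl >/dev/null 2>/dev/null; then
  launchctl setenv OLLAMA_MODELS "$HOME/ollama-models"
fi
# For terminal sessions, also export it in your shell profile, e.g.:
# export OLLAMA_MODELS="$HOME/ollama-models"
```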
&lt;h2 id=&#34;4-verify-the-new-path-is-active&#34;&gt;4. Verify the New Path Is Active
&lt;/h2&gt;&lt;p&gt;Pull any model and check whether new files appear under &lt;code&gt;D:\OllamaModels&lt;/code&gt;.&lt;/p&gt;
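This check can be scripted: Ollama stores downloads as content-addressed blob files (named sha256-...), so after a pull the new directory should contain fresh blobs. A sketch assuming a Unix shell; on Windows, browse D:\OllamaModels in Explorer instead:

```shell
# Sketch: list model blob files under the active directory after a pull.
# Ollama names blob files sha256-...; adjust the path to your setup.
new_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
find "$new_dir" -type f -name 'sha256*' 2>/dev/null | sort | tail -n 5
```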
&lt;h2 id=&#34;5-clean-up-the-old-directory-after-confirmation&#34;&gt;5. Clean Up the Old Directory (After Confirmation)
&lt;/h2&gt;&lt;p&gt;Once models work correctly in the new location, remove old files to reclaim C drive space.&lt;/p&gt;
&lt;h2 id=&#34;faq&#34;&gt;FAQ
&lt;/h2&gt;&lt;h3 id=&#34;still-writing-to-c-drive-after-configuration&#34;&gt;Still Writing to C Drive After Configuration
&lt;/h3&gt;&lt;ul&gt;
&lt;li&gt;Confirm the variable was set at the system (&amp;ldquo;Machine&amp;rdquo;) scope, not just for the current terminal session.&lt;/li&gt;
&lt;li&gt;Confirm the Ollama process was restarted.&lt;/li&gt;
&lt;li&gt;Verify the variable name is exactly &lt;code&gt;OLLAMA_MODELS&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
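To rule out the first two items quickly, check what a running session actually sees. On Windows, open a new PowerShell window and run echo $env:OLLAMA_MODELS; a Unix sketch of the same check:

```shell
# Print the variable as the current session sees it; an empty result
# means any Ollama process started from here still uses the default path.
if [ -n "${OLLAMA_MODELS:-}" ]; then
  echo "OLLAMA_MODELS=$OLLAMA_MODELS"
else
  echo "OLLAMA_MODELS is not set in this session"
fi
```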
&lt;h3 id=&#34;do-i-need-to-migrate-existing-model-files&#34;&gt;Do I Need to Migrate Existing Model Files
&lt;/h3&gt;&lt;p&gt;If you want to avoid re-downloading, stop Ollama, copy the existing model files (including the &lt;code&gt;manifests&lt;/code&gt; and &lt;code&gt;blobs&lt;/code&gt; subdirectories) to the new directory, then restart Ollama and confirm the models still appear in &lt;code&gt;ollama list&lt;/code&gt;.&lt;/p&gt;
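The migration can be sketched as a plain copy, assuming the default macOS/Linux source path and a hypothetical target; on Windows, stop Ollama and copy the .ollama\models folder to the new location the same way:

```shell
# Sketch of a copy-based migration (hypothetical target path).
# Stop Ollama first: quit the app on macOS, or on Linux:
#   sudo systemctl stop ollama
src="$HOME/.ollama/models"
dst="$HOME/ollama-models"
mkdir -p "$dst"
# Copy everything, preserving the manifests/ and blobs/ layout.
if [ -d "$src" ]; then
  cp -a "$src/." "$dst/"
fi
# Then point OLLAMA_MODELS at the new path, restart Ollama, and verify
# the models still appear with: ollama list
```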
&lt;!-- ollama-related-links:start --&gt;
&lt;h2 id=&#34;related-posts&#34;&gt;Related Posts
&lt;/h2&gt;&lt;ul&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;https://www.knightli.com/en/2026/04/05/google-gemma-4-model-comparison/&#34; &gt;Gemma 4 Model Comparison and Selection&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;https://www.knightli.com/en/2026/04/05/llm-quantization-guide-fp16-q4-q2/&#34; &gt;LLM Quantization Guide (FP16/Q8/Q5/Q4/Q2)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;https://www.knightli.com/en/2026/04/06/uninstall-ollama-on-linux/&#34; &gt;Completely Uninstall Ollama on Linux&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class=&#34;link&#34; href=&#34;https://www.knightli.com/en/2026/04/06/check-ollama-model-loaded-on-gpu/&#34; &gt;How to Check Whether Ollama Uses GPU&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;!-- ollama-related-links:end --&gt;
</description>
        </item>
        
    </channel>
</rss>
