If you download a model with the Hugging Face command-line tool, for example:
```
huggingface-cli download <org>/<model>
```
the downloaded model files usually do not appear in the current directory. Instead, they are saved into Hugging Face’s default cache directory.
## Default cache locations
`huggingface-cli download` follows the Hugging Face Hub cache mechanism. The default paths are:
| System | Default cache directory |
|---|---|
| Linux / macOS | ~/.cache/huggingface/hub |
| Windows | C:\Users\<username>\.cache\huggingface\hub |
On Windows, if your username is knightli, the default path is roughly:
```
C:\Users\knightli\.cache\huggingface\hub
```
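Both defaults can be redirected with environment variables. Below is a minimal Python sketch of the lookup order used by `huggingface_hub` (simplified; an explicit `HF_HUB_CACHE` wins, then `HF_HOME`, which otherwise defaults to `~/.cache/huggingface`):

```python
import os
from pathlib import Path

def default_hub_cache() -> Path:
    """Resolve the Hub cache directory (simplified sketch):
    HF_HUB_CACHE if set, else <HF_HOME>/hub, where HF_HOME
    defaults to ~/.cache/huggingface."""
    if "HF_HUB_CACHE" in os.environ:
        return Path(os.environ["HF_HUB_CACHE"])
    hf_home = Path(os.environ.get("HF_HOME", str(Path.home() / ".cache" / "huggingface")))
    return hf_home / "hub"

print(default_hub_cache())
```

Setting either variable before running the CLI moves the cache accordingly.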
## Why the model is not in the current directory
This is expected behavior. By default, `huggingface-cli download` stores repository files in the cache directory so they can be reused later without downloading them again.
Inside the cache directory, repositories are usually stored in a structure like this:
```
models--<org>--<model>/
├── blobs/
├── refs/
└── snapshots/
```
For example, a repository with the id `org/model-name` would be cached as:

```
models--org--model-name
```
The actual model files are usually under the `snapshots` subdirectory inside that cache folder.
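The mapping from repo id to cache folder is mechanical (each `/` becomes `--`, prefixed with the repo type), so you can locate cached `.gguf` files with a few lines of standard-library Python. A sketch, where the repo id is purely illustrative:

```python
from pathlib import Path

def repo_cache_dir(cache_root: str, repo_id: str) -> Path:
    """Map a repo id like 'org/name' to its hub cache folder:
    'models--org--name' (the 'models--' prefix marks the repo type)."""
    return Path(cache_root).expanduser() / ("models--" + repo_id.replace("/", "--"))

def find_ggufs(cache_root: str, repo_id: str) -> list[Path]:
    """List any .gguf files under the repo's snapshots/ subdirectory."""
    snapshots = repo_cache_dir(cache_root, repo_id) / "snapshots"
    return sorted(snapshots.rglob("*.gguf")) if snapshots.is_dir() else []

# Hypothetical repo id, for illustration only.
print(repo_cache_dir("~/.cache/huggingface/hub", "org/model-name").name)
# → models--org--model-name
```

`find_ggufs` simply returns an empty list when the repository has not been cached yet.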
## How to download to a specific directory
If you want the model files to appear in a directory you choose, add `--local-dir` to `huggingface-cli download`:
```
huggingface-cli download <org>/<model> --local-dir ./my-model
```
This is more convenient when working with local inference tools such as llama.cpp, Ollama, or LM Studio, because the model files are easier to find later.
## Summary

- Default cache directory: `~/.cache/huggingface/hub`
- Default Windows directory: `C:\Users\<username>\.cache\huggingface\hub`
- Use `--local-dir` if you want to save files to a specific location
- If you are only looking for a `.gguf` file, first check the `snapshots` subdirectory under the corresponding cached model repository