张芷铭's Personal Blog

There are three ways to download Hugging Face models: the Transformers API, huggingface-cli, and manual download.

Method comparison

Method           | Use case               | Advantage
Transformers API | day-to-day development | automatic download + caching
huggingface-cli  | batch downloads        | resumable downloads, file filtering
Manual download  | restricted networks    | flexible and fully controlled

Transformers API

from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-chinese"

# Basic usage: downloads on first call, then reuses the local cache
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Custom cache path
model = AutoModel.from_pretrained(
    model_name,
    cache_dir="./my_cache",
    local_files_only=False  # set to True for offline mode (no network access)
)
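Inside the cache, each repository gets a folder named after its repo id. A minimal sketch of that naming scheme (the `models--` prefix and `--` separator follow the layout used by recent huggingface_hub versions; treat the exact convention as an assumption):

```python
def hub_cache_folder(repo_id: str) -> str:
    """Map a repo id such as "google/flan-t5-base" to its hub cache
    folder name, e.g. "models--google--flan-t5-base"."""
    return "models--" + repo_id.replace("/", "--")

print(hub_cache_folder("bert-base-chinese"))    # models--bert-base-chinese
print(hub_cache_folder("google/flan-t5-base"))  # models--google--flan-t5-base
```

This is handy when you need to check whether a model is already cached before going offline.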

huggingface-cli

pip install huggingface-hub

# Download a complete model
huggingface-cli download bert-base-chinese --local-dir ./bert-base-chinese

# Download only selected files
huggingface-cli download bert-base-chinese \
  --local-dir ./bert-base-chinese \
  --include "*.bin" "config.json"

# Log in for private models
huggingface-cli login
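To preview which files a pattern list such as `--include "*.bin" "config.json"` would select, the matching can be sketched with shell-style globbing (using fnmatch here is an assumption about the matching semantics, not the CLI's exact implementation):

```python
from fnmatch import fnmatch

def select_files(files, patterns):
    """Keep files that match at least one shell-style glob pattern."""
    return [f for f in files if any(fnmatch(f, p) for p in patterns)]

repo_files = ["pytorch_model.bin", "config.json", "vocab.txt", "tokenizer.json"]
print(select_files(repo_files, ["*.bin", "config.json"]))
# ['pytorch_model.bin', 'config.json']
```

Running the sketch against a repo's file listing first avoids surprises with large `.bin` weights you did not intend to fetch.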

Manual download

  1. Open the model page (e.g. https://huggingface.co/bert-base-chinese)
  2. Click "Files and versions" and download the files you need
  3. Load from the local path:

from transformers import AutoModel

model = AutoModel.from_pretrained("./bert-base-chinese", local_files_only=True)
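For scripted manual downloads, individual files can be fetched via the hub's resolve URLs. A minimal sketch (the `/resolve/{revision}/` pattern matches where the download links on the "Files and versions" tab point; defaulting `revision` to `main` is an assumption):

```python
def hf_file_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the direct download URL for one file in a model repo."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

print(hf_file_url("bert-base-chinese", "config.json"))
# https://huggingface.co/bert-base-chinese/resolve/main/config.json
```

The resulting URL can be passed to wget or curl on a machine that can only reach the site through a proxy.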

Key configuration

# Cache directory (hub cache defaults to ~/.cache/huggingface/hub)
export HF_HOME=/path/to/cache

# Route downloads through a proxy
export HTTP_PROXY=http://127.0.0.1:7890
export HTTPS_PROXY=http://127.0.0.1:7890
# Private models: pass an access token
model = AutoModel.from_pretrained(model_name, token="your_hf_token")
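How HF_HOME feeds into the cache location can be sketched as follows (a simplification: real resolution also honors more specific variables such as HF_HUB_CACHE, which this sketch ignores):

```python
import os

def hub_cache_dir(env=None):
    """Resolve the hub cache directory: HF_HOME if set,
    otherwise the default ~/.cache/huggingface, plus "hub"."""
    env = os.environ if env is None else env
    hf_home = env.get("HF_HOME", os.path.expanduser("~/.cache/huggingface"))
    return os.path.join(hf_home, "hub")

print(hub_cache_dir({"HF_HOME": "/path/to/cache"}))  # /path/to/cache/hub
```

Pointing HF_HOME at a large data disk keeps multi-gigabyte model weights off the system partition.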
