Native API: error when loading word2vec TENCENT_AILAB_EMBEDDING_LARGE_100

Loading word2vec from cache …
Failed to load https://ai.tencent.com/ailab/nlp/en/data/tencent-ailab-embedding-zh-d100-v0.2.0.tar.gz#tencent-ailab-embedding-zh-d100-v0.2.0.txt.
If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG below.
================================ERROR LOG BEGINS================================
OS: Windows-10-10.0.19044-SP0
Python: 3.9.5
PyTorch: 1.11.0+cpu
HanLP: 2.1.0-beta.26
Traceback (most recent call last):
File "C:\Users\admin\Desktop\word2vec.py", line 5, in <module>
word2vec = hanlp.load(hanlp.pretrained.word2vec.TENCENT_AILAB_EMBEDDING_LARGE_100)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\__init__.py", line 43, in load
return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\utils\component_util.py", line 171, in load_from_meta_file
raise e from None
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\utils\component_util.py", line 99, in load_from_meta_file
obj.load(save_dir, verbose=verbose, **kwargs)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\common\torch_component.py", line 178, in load
self.model = self.build_model(
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\layers\embeddings\word2vec.py", line 219, in build_model
model = embed.module(self.vocabs)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\layers\embeddings\word2vec.py", line 142, in module
embed = build_word2vec_with_vocab(self.embed,
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\layers\embeddings\util.py", line 103, in build_word2vec_with_vocab
embed = index_word2vec_with_vocab(embed, vocab, extend_vocab, unk, lowercase, init, normalize)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\layers\embeddings\util.py", line 36, in index_word2vec_with_vocab
pret_vocab, pret_matrix = load_word2vec_as_vocab_tensor(filepath)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\utils\torch_util.py", line 237, in load_word2vec_as_vocab_tensor
word2vec, dim = load_word2vec(path, delimiter, cache)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp\utils\torch_util.py", line 192, in load_word2vec
word2vec, dim = load_pickle(binpath)
File "C:\Users\admin\AppData\Local\Programs\Python\Python39\lib\site-packages\hanlp_common\io.py", line 18, in load_pickle
return pickle.load(f)
EOFError: Ran out of input
=================================ERROR LOG ENDS=================================
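
For reference, this is a minimal script that reproduces the failing call shown in the traceback (it is just line 5 of word2vec.py):

```python
import hanlp

# The call from the traceback above: load the Tencent AILab d100 embedding.
word2vec = hanlp.load(hanlp.pretrained.word2vec.TENCENT_AILAB_EMBEDDING_LARGE_100)
```

The `EOFError: Ran out of input` is raised by `pickle.load()` inside `load_pickle`, which usually means the cached binary copy of the embedding is empty or truncated (for example, an interrupted download or conversion). A possible workaround is to delete the cached files for this embedding and let the next `hanlp.load()` rebuild them. The sketch below assumes HanLP keeps its cache under `~/.hanlp` unless the `HANLP_HOME` environment variable overrides it; adjust the path if your installation stores data elsewhere:

```python
import os
import shutil
from pathlib import Path

# Assumption: HanLP caches downloaded resources under ~/.hanlp
# unless HANLP_HOME points somewhere else.
data_home = Path(os.environ.get('HANLP_HOME', str(Path.home() / '.hanlp')))

if data_home.exists():
    # Remove every cached file/directory for the d100 Tencent embedding so the
    # next hanlp.load() downloads and converts it again from scratch.
    for cached in sorted(data_home.rglob('tencent-ailab-embedding-zh-d100*')):
        if not cached.exists():  # parent directory may already have been removed
            continue
        print('Removing', cached)
        if cached.is_dir():
            shutil.rmtree(cached)
        else:
            cached.unlink()
```

After clearing the cache, re-running the reproduction script should trigger a fresh download; if the same EOFError reappears, the download itself is likely being cut off (proxy, firewall, or disk space), and downloading the tar.gz manually from the Tencent URL above and placing it in the cache directory may help.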