Previously, the deepseek-r1-distill models would output `<think>` tags, which were swallowed by gradio, so we applied `html.escape`.
<think>
inference/xinference/core/chat_interface.py
Lines 141 to 145 in 5abfc84
We need to look into how to resolve this.
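The trade-off above can be sketched as follows. This is not the actual `chat_interface.py` code, just a minimal illustration of the behavior described: escaping makes `<think>` survive gradio's HTML rendering as literal text, but it also turns the tag into visible `&lt;think&gt;` entities in the chat output.

```python
import html

def escape_model_output(text: str) -> str:
    # html.escape converts < and > to &lt; and &gt;, so gradio's
    # HTML renderer shows the tag as literal text instead of
    # parsing (and dropping) it as markup.
    return html.escape(text)

escaped = escape_model_output("<think>reasoning...</think>answer")
print(escaped)
# → &lt;think&gt;reasoning...&lt;/think&gt;answer
```

A possible direction would be to strip or specially render the `<think>...</think>` span before escaping the rest, rather than escaping the whole output unconditionally.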
System Info / 系統信息
CUDA version:
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2021 NVIDIA Corporation
Built on Thu_Nov_18_09:45:30_PST_2021
Cuda compilation tools, release 11.5, V11.5.119
Build cuda_11.5.r11.5/compiler.30672275_0
transformers version:
Name: transformers
Version: 4.45.2
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: [email protected]
License: Apache 2.0 License
Location: /anaconda3/envs/xinf/lib/python3.10/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: auto_gptq, autoawq, chattts, compressed-tensors, FlagEmbedding, nemo_text_processing, optimum, peft, sentence-transformers, transformers-stream-generator, vllm
Python version:
3.10.15
OS:
Linux
Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?
Version info / 版本信息
1.2.1
The command used to start Xinference / 用以启动 xinference 的命令
xinference-local --host 0.0.0.0 --port 9997
Reproduction / 复现过程
Expected behavior / 期待表现