🍉
Pinned
- ChatGLM2-6B (Public, forked from THUDM/ChatGLM2-6B)
  ChatGLM2-6B: An Open Bilingual Chat LLM (open-source bilingual dialogue language model)
  Python
- microsoft/DeepSpeed (Public)
  DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- microsoft/Megatron-DeepSpeed (Public, forked from NVIDIA/Megatron-LM)
  Ongoing research training transformer language models at scale, including: BERT & GPT-2
- huggingface/optimum-habana (Public)
  Easy and lightning-fast training of 🤗 Transformers on the Habana Gaudi processor (HPU)