---
license: mit
---
|
|
|
|
|
# ChatGLM-6B Mirror
|
|
ChatGLM-6B is an open-source, bilingual (Chinese-English) conversational language model based on the General Language Model (GLM) architecture, with 6.2 billion parameters. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards (as little as 6 GB of VRAM at the INT4 quantization level). ChatGLM-6B uses technology similar to ChatGPT and is optimized for Chinese Q&A and dialogue. Trained on roughly 1T tokens of Chinese and English text, and further refined with supervised fine-tuning, feedback bootstrap, and reinforcement learning with human feedback, the 6.2-billion-parameter ChatGLM-6B can generate answers that are fairly well aligned with human preferences.
|
|
|
|
|
## Usage
|
|
```python
from modelscope import snapshot_download

model_dir = snapshot_download('Genius-Society/chatglm_6b')
```
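Once the checkpoint has been downloaded, it can be loaded for chat inference. The sketch below assumes a CUDA GPU with enough memory for the FP16 weights and uses the `transformers` remote-code loading path that the ChatGLM-6B repository ships with; it is an illustration, not part of this mirror's official instructions.

```python
# Sketch: load the downloaded ChatGLM-6B checkpoint and run one chat turn.
# Assumes a CUDA GPU; the repo bundles custom modeling code, so
# trust_remote_code=True is required when loading.
from modelscope import snapshot_download
from transformers import AutoTokenizer, AutoModel

model_dir = snapshot_download('Genius-Society/chatglm_6b')
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# model.chat returns the reply string and the updated dialogue history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```

For the 6 GB INT4 deployment mentioned above, the model can be quantized before moving it to the GPU, e.g. `AutoModel.from_pretrained(model_dir, trust_remote_code=True).quantize(4).half().cuda()`.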
|
|
|
|
|
## Maintenance
|
|
```bash
git clone [email protected]:Genius-Society/chatglm_6b
cd chatglm_6b
```
|
|
|
|
|
## Mirror
|
|
<https://www.modelscope.cn/models/Genius-Society/chatglm_6b>
|
|
|
|
|
## Thanks
|
|
- <a href="https://www.modelscope.cn/models/ZhipuAI/ChatGLM-6B">ChatGLM-6B</a>