---
license: apache-2.0
pipeline_tag: text-generation
library_name: grok
tags:
- grok-1
---
# Grok-1
This repository contains the weights of the Grok-1 open-weights model. You can find the code in the [GitHub Repository](https://github.com/xai-org/grok-1/tree/main).
# Download instructions
Clone the repo & download the `int8` checkpoint to the `checkpoints` directory by executing these commands in the repo root directory:
```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install "huggingface_hub[hf_transfer]"
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
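
Note that `hf_transfer` only takes effect when it is explicitly enabled through an environment variable; a minimal sketch, assuming a bash-like shell:

```shell
# hf_transfer is only used by huggingface_hub when this variable is set
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```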
Then, you can run:
```shell
pip install -r requirements.txt
python run.py
```
You should see output from the language model.
Due to the large size of the model (314B parameters), a multi-GPU machine is required to test the model with the example code.
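
To confirm that your machine exposes enough accelerators before launching, you can list the devices JAX sees (a minimal sketch, assuming JAX was installed via `requirements.txt` and your GPUs are visible to it):

```shell
# Print the accelerators JAX can detect; the example code expects multiple GPUs
python -c "import jax; print(jax.devices())"
```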
p.s. we're hiring: https://x.ai/careers