---
license: apache-2.0
pipeline_tag: text-generation
library_name: grok
tags:
- grok-1
---

# Grok-1

This repository contains the weights of the Grok-1 open-weights model. You can find the code in the [GitHub repository](https://github.com/xai-org/grok-1).

## Download instructions

Clone the repo and download the `int8` checkpoint to the `checkpoints` directory by running these commands in the repo root:

```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
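Note that installing `hf_transfer` only provides the faster download backend; `huggingface_hub` uses it only when the `HF_HUB_ENABLE_HF_TRANSFER` environment variable is set. A minimal sketch of that, plus a size check, neither of which is part of the official steps:

```shell
# Optional: enable the Rust-based hf_transfer backend before downloading.
# Installing the package alone is not enough; the env var must be set.
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False

# Optional sanity check: the --include filter above implies the shards land in
# checkpoints/ckpt-0; at int8, 314B parameters work out to roughly 300 GB on disk.
du -sh checkpoints/ckpt-0
```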

Then, you can run:

```shell
pip install -r requirements.txt
python run.py
```

You should see output from the language model.

Due to the large size of the model (314B parameters), a multi-GPU machine is required to test the model with the example code.
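Before launching, you may want to confirm that multiple GPUs are actually visible on the machine; one way to do so (not part of the original instructions) is:

```shell
# Optional: list the GPUs visible on this machine before running run.py.
nvidia-smi --list-gpus
```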

p.s. we're hiring: https://x.ai/careers