From c929c7735ac31aa03ef9a1e8d72c5d2f62999e27 Mon Sep 17 00:00:00 2001
From: Gustavo de Rosa
Date: Mon, 29 Apr 2024 16:25:04 +0000
Subject: [PATCH] Update README.md

---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index aef4817..f48c5ac 100644
--- a/README.md
+++ b/README.md
@@ -17,9 +17,9 @@ Our model hasn't been fine-tuned through reinforcement learning from human feedb
 
 ## How to Use
 
-Phi-2 was integrated in `transformers` version 4.37. If you need to use an earlier version, you need to pass `trust_remote_code=True` to the `from_pretrained()` function.
+Phi-2 has been integrated in the `transformers` version 4.37.0, please ensure that you are using a version equal or higher than it.
 
-Phi-2 is known for having an attention overflow issue (with FP16). If you are facing this issue, please enable/disable autocast on the [PhiAttention.forward()](https://huggingface.co/microsoft/phi-2/blob/main/modeling_phi.py#L306) function.
+Phi-2 is known for having an attention overflow issue (with FP16). If you are facing this issue, please enable/disable autocast on the [PhiAttention.forward()](https://github.com/huggingface/transformers/blob/main/src/transformers/models/phi/modeling_phi.py#L306) function.
 
 ## Intended Uses
 
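
For reference, the two README sentences this patch rewrites describe a concrete workflow: load microsoft/phi-2 with `transformers` 4.37.0 or newer and, if the FP16 attention overflow appears, enable/disable autocast. Below is a minimal sketch of that workflow, assuming `transformers` >= 4.37, PyTorch with a CUDA device, and FP16 weights; toggling autocast around `generate()` (rather than inside `PhiAttention.forward()` as the README suggests), as well as the prompt and generation settings, are illustrative assumptions, not the README's exact prescription.

```python
# Minimal sketch (not the README's verbatim example): load microsoft/phi-2 in FP16
# with transformers >= 4.37 and toggle autocast around generation as one possible
# workaround for the FP16 attention overflow the README mentions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    torch_dtype=torch.float16,  # FP16 is the setting where the overflow can occur
).to("cuda")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

inputs = tokenizer("def print_prime(n):", return_tensors="pt").to("cuda")

# Toggling autocast here is an assumption made for illustration; the README points
# at enabling/disabling it inside PhiAttention.forward() in the modeling code instead.
with torch.autocast("cuda", enabled=False):
    output_ids = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If the overflow persists, the more targeted fix is the one the README links to: adjusting the autocast behavior directly inside `PhiAttention.forward()` at the referenced line of the `transformers` Phi modeling code.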