Update README.md

Gustavo de Rosa 2024-04-29 16:25:04 +00:00 committed by system
parent 710686f446
commit c929c7735a

@@ -17,9 +17,9 @@ Our model hasn't been fine-tuned through reinforcement learning from human feedback
## How to Use
-Phi-2 was integrated in `transformers` version 4.37. If you need to use an earlier version, you need to pass `trust_remote_code=True` to the `from_pretrained()` function.
+Phi-2 has been integrated in `transformers` version 4.37.0; please ensure that you are using that version or higher.
-Phi-2 is known for having an attention overflow issue (with FP16). If you are facing this issue, please enable/disable autocast on the [PhiAttention.forward()](https://huggingface.co/microsoft/phi-2/blob/main/modeling_phi.py#L306) function.
+Phi-2 is known for having an attention overflow issue (with FP16). If you are facing this issue, please enable/disable autocast on the [PhiAttention.forward()](https://github.com/huggingface/transformers/blob/main/src/transformers/models/phi/modeling_phi.py#L306) function.
## Intended Uses
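
For reference, a minimal usage sketch matching the updated instructions in the hunk above: it assumes `transformers` >= 4.37.0 (so `trust_remote_code=True` is no longer needed), a recent PyTorch, and a CUDA GPU; the prompt and generation length are illustrative and not part of the commit.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# transformers >= 4.37.0 ships Phi-2 natively, so trust_remote_code is not required.
torch.set_default_device("cuda")

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

prompt = 'def print_prime(n):\n    """Print all primes between 1 and n."""\n'
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.batch_decode(outputs)[0])
```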
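The attention-overflow note points at the autocast switch inside [PhiAttention.forward()](https://github.com/huggingface/transformers/blob/main/src/transformers/models/phi/modeling_phi.py#L306). As a rough, caller-side way to experiment with that switch without editing library code, generation can be wrapped in `torch.autocast`; whether this is sufficient depends on your setup, so treat it as a sketch rather than the fix itself.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load in FP16, the configuration in which the overflow has been reported.
model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2", torch_dtype=torch.float16)
model.to("cuda")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

inputs = tokenizer("Explain the attention mechanism in one paragraph.", return_tensors="pt").to("cuda")

# Flip `enabled` to compare runs with and without autocast; the README's actual
# suggestion is to adjust the autocast usage inside PhiAttention.forward() itself.
with torch.autocast(device_type="cuda", dtype=torch.float16, enabled=True):
    outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```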