Hugging Face
The `huggingface` provider allows you to use Hugging Face models via Text Generation Inference.

```js
script({ model: "huggingface:microsoft/Phi-3-mini-4k-instruct" })
```
To use Hugging Face models with GenAIScript, follow these steps:
Sign up for a Hugging Face account and obtain an API key from their console. If you are creating a fine-grained token, enable the **Make calls to the serverless Inference API** option.
Add your Hugging Face API key to the `.env` file as the `HUGGINGFACE_API_KEY`, `HF_TOKEN`, or `HUGGINGFACE_TOKEN` variable.

```txt
HUGGINGFACE_API_KEY=hf_...
```

Find the model that best suits your needs by visiting the Hugging Face models page.
Update your script to use the `model` you choose.

```js
script({
    ...
    model: "huggingface:microsoft/Phi-3-mini-4k-instruct",
})
```
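Putting the steps together, a minimal script file might look like this; the file name, prompt, and `temperature` value are illustrative:

```js
// genaisrc/poem.genai.mjs -- a minimal sketch
script({
    model: "huggingface:microsoft/Phi-3-mini-4k-instruct",
    temperature: 0.5, // illustrative sampling setting
})
$`Write a short poem about open-source models.`
```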
Logging

You can enable the `genaiscript:huggingface` and `genaiscript:huggingface:msg` logging namespaces for more information about the requests and responses:
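These namespaces follow the Node `debug` package convention, so a sketch of enabling them is to set the `DEBUG` environment variable when invoking the CLI; the script name below is illustrative:

```shell
DEBUG=genaiscript:huggingface* npx genaiscript run my-script
```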
Aliases
The following model aliases are attempted by default in GenAIScript.
| Alias      | Model identifier                          |
| ---------- | ----------------------------------------- |
| large      | meta-llama/Llama-3.3-70B-Instruct         |
| small      | microsoft/phi-4                           |
| vision     | meta-llama/Llama-3.2-11B-Vision-Instruct  |
| embeddings | nomic-ai/nomic-embed-text-v1.5            |
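A script can reference an alias instead of the full model identifier. This is a minimal sketch; how the alias resolves to a provider depends on your configuration:

```js
script({
    // with the Hugging Face provider selected, "small" maps to microsoft/phi-4
    model: "small",
})
```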
Limitations
- Uses the OpenAI compatibility layer
- `listModels` is not supported
- Prediction of output tokens is ignored