HuggingFace

Access thousands of open-source models via HuggingFace’s Inference API.

Requirements

A HuggingFace API token is required. You can create one at https://huggingface.co/settings/tokens; it is passed as api_key in Python and apiKey in JavaScript.

API Routes

Type    URL
Proxy   https://g4f.space/api/huggingface
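
The proxy route can also be called without the g4f client. Below is a minimal sketch using the official openai Python package pointed at the proxy as its base URL; that the proxy exposes an OpenAI-compatible chat completions endpoint is an assumption, not something this page confirms.

from openai import OpenAI

# Assumption: the proxy speaks the OpenAI-compatible chat completions protocol.
client = OpenAI(
    base_url="https://g4f.space/api/huggingface",
    api_key="YOUR_HF_TOKEN",  # HuggingFace API token
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-70B-Instruct",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(response.choices[0].message.content)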

Available Models

HuggingFace provides access to many models. Use client.models.get_all() to list available models.

Popular models include meta-llama/Llama-3.1-70B-Instruct, which is used in the examples below; the sketch after this paragraph shows how to filter the full list returned by client.models.get_all().
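
A minimal sketch, assuming client.models.get_all() returns a list of model IDs (the Python example below slices and prints the same list):

from g4f.client import Client
from g4f.Provider import HuggingFace

client = Client(provider=HuggingFace, api_key="YOUR_HF_TOKEN")

# Fetch every model ID exposed by the provider and keep only the Llama variants.
# str() is used defensively in case entries are model objects rather than plain strings.
models = client.models.get_all()
llama_models = [m for m in models if "llama" in str(m).lower()]
print(llama_models[:10])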

Examples

Python

from g4f.client import Client
from g4f.Provider import HuggingFace

# API key is required
client = Client(provider=HuggingFace, api_key="YOUR_HF_TOKEN")

# List available models
models = client.models.get_all()
print(f"Available models: {models[:5]}")

# Chat completion
response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-70B-Instruct",
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ],
)

print(response.choices[0].message.content)

JavaScript

import { HuggingFace } from '@gpt4free/g4f.dev';

// API key is required, as in the Python example
const client = new HuggingFace({ apiKey: 'YOUR_HF_TOKEN' });

const response = await client.chat.completions.create({
    model: "meta-llama/Llama-3.1-70B-Instruct",
    messages: [
        { role: "user", content: "Hello, how are you?" }
    ],
});

console.log(response.choices[0].message.content);