Clarifai
Anthropic, OpenAI, Qwen, xAI, Gemini, and most open-source LLMs are supported on Clarifai.
| Property | Details | 
|---|---|
| Description | Clarifai is a powerful AI platform that provides access to a wide range of LLMs through a unified API. LiteLLM enables seamless integration with Clarifai's models using an OpenAI-compatible interface. | 
| Provider Doc | Clarifai | 
| OpenAI compatible Endpoint for Provider | https://api.clarifai.com/v2/ext/openai/v1 | 
| Supported Endpoints | /chat/completions | 
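Because the endpoint above is OpenAI-compatible, a plain HTTP POST also works without any SDK. The sketch below is illustrative only (`build_chat_request` and `send_chat` are our own helpers, not a Clarifai or LiteLLM API) and assumes the PAT is accepted as a bearer token:

```python
import json
import os
import urllib.request

# Clarifai's OpenAI-compatible chat-completions endpoint
ENDPOINT = "https://api.clarifai.com/v2/ext/openai/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completion request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def send_chat(model: str, prompt: str) -> str:
    """POST the request and return the reply text (requires CLARIFAI_PAT)."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['CLARIFAI_PAT']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# e.g. print(send_chat("openai.chat-completion.gpt-oss-20b", "Hello!"))
```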
Pre-Requisites
pip install litellm
Required Environment Variables
To obtain your Clarifai Personal Access Token (PAT), follow this link.
os.environ["CLARIFAI_PAT"] = "YOUR_CLARIFAI_PAT"
Usage
import os
from litellm import completion
os.environ["CLARIFAI_API_KEY"] = ""
response = completion(
  model="clarifai/openai.chat-completion.gpt-oss-20b",
  messages=[{"role": "user", "content": "Tell me a joke about physics?"}]
)
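The response follows the OpenAI chat-completion shape, so the reply text lives at `choices[0].message.content`. A small illustrative helper (ours, not part of LiteLLM) that reads it from either object- or dict-shaped responses:

```python
# Illustrative helper (not a LiteLLM API): extract the reply text from an
# OpenAI-shaped chat-completion response, whether object- or dict-shaped.
def reply_text(response) -> str:
    choice = response["choices"][0] if isinstance(response, dict) else response.choices[0]
    message = choice["message"] if isinstance(choice, dict) else choice.message
    return message["content"] if isinstance(message, dict) else message.content

# e.g. print(reply_text(response)) after the completion() call above
```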
Streaming Support
LiteLLM supports streaming responses with Clarifai models:
import litellm
for chunk in litellm.completion(
    model="clarifai/openai.chat-completion.gpt-oss-20b",
    api_key="YOUR_CLARIFAI_PAT",
    messages=[
        {"role": "user", "content": "Tell me a fun fact about space."}
    ],
    stream=True,
):
    print(chunk.choices[0].delta)
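Each streamed chunk carries an incremental delta; joining the non-empty `delta.content` values reconstructs the full reply. A sketch (`join_stream` is our own helper, not a LiteLLM API):

```python
# Illustrative helper: accumulate streamed chunks into the full reply text.
def join_stream(chunks) -> str:
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        content = getattr(delta, "content", None)
        if content:  # skip None/empty deltas (e.g. role-only or final chunks)
            parts.append(content)
    return "".join(parts)

# e.g. full_text = join_stream(litellm.completion(..., stream=True))
```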
Tool Calling (Function Calling)
Clarifai models accessed via LiteLLM support function calling:
import litellm
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Tokyo, Japan"
                }
            },
            "required": ["location"],
            "additionalProperties": False
        },
    }
}]
response = litellm.completion(
    model="clarifai/openai.chat-completion.gpt-oss-20b",
    api_key="YOUR_CLARIFAI_PAT",
    messages=[{"role": "user", "content": "What is the weather in Paris today?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)
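`tool_calls` holds the model's requested invocations; to complete the loop, execute each function locally and feed the results back as `role: "tool"` messages before calling `completion()` again. A sketch of that second half (our own illustration, not a LiteLLM API; `get_weather` is a placeholder implementation of the tool declared above):

```python
import json

def get_weather(location: str) -> str:
    # Placeholder implementation of the tool declared in the tools list above.
    return f"Sunny in {location}"

def tool_result_messages(tool_calls) -> list:
    """Build the role="tool" follow-up message for each returned tool call."""
    messages = []
    for call in tool_calls:
        # Arguments arrive as a JSON string; parse before invoking the function.
        args = json.loads(call.function.arguments)
        result = get_weather(**args)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": result,
        })
    return messages
```

Append the assistant message and these tool messages to the conversation, then call `litellm.completion()` again to get the final answer.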
Clarifai models
LiteLLM supports all models in the Clarifai Community.
OpenAI Models
- gpt-oss-20b
- gpt-oss-120b
- gpt-5-nano
- gpt-5-mini
- gpt-5
- gpt-4o
- o3
- Many more...
Anthropic Models
- claude-sonnet-4
- claude-opus-4
- claude-3_5-haiku
- claude-3_7-sonnet
- Many more...
xAI Models
- grok-3
- grok-2-vision-1212
- grok-2-1212
- grok-code-fast-1
- grok-2-image-1212
- Many more...
Google Gemini Models
- Many more...
Qwen Models
- Qwen3-30B-A3B-Instruct-2507
- Qwen3-30B-A3B-Thinking-2507
- Qwen3-14B
- QwQ-32B-AWQ
- Qwen2_5-VL-7B-Instruct
- Qwen3-Coder-30B-A3B-Instruct
- Many more...
MiniCPM (OpenBMB) Models
- MiniCPM-o-2_6-language
- MiniCPM3-4B
- MiniCPM4-8B
- Many more...
Microsoft Phi Models
- Phi-4-reasoning-plus
- phi-4
- Many more...
Meta Llama Models
- Llama-3_2-3B-Instruct
- Many more...
DeepSeek Models
- DeepSeek-R1-0528-Qwen3-8B
- Many more...
Usage with LiteLLM Proxy
Here's how to call Clarifai with the LiteLLM Proxy Server:
1. Save key in your environment
export CLARIFAI_PAT="YOUR_CLARIFAI_PAT"
2. Start the proxy
- config.yaml
model_list:
  - model_name: clarifai-model
    litellm_params:
      model: clarifai/openai.chat-completion.gpt-oss-20b
      api_key: os.environ/CLARIFAI_PAT
litellm --config /path/to/config.yaml
# Server running on http://0.0.0.0:4000
3. Test it
- Curl Request
- OpenAI v1.0.0+
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data ' {
      "model": "clarifai-model",
      "messages": [
        {
          "role": "user",
          "content": "what llm are you"
        }
      ]
    }
'
import openai
client = openai.OpenAI(
    api_key="anything",
    base_url="http://0.0.0.0:4000"
)
response = client.chat.completions.create(
    model="clarifai-model",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem"
        }
    ]
)
print(response)
Important Notes
- Always prefix Clarifai model IDs with clarifai/ when specifying the model name
- Use your Clarifai Personal Access Token (PAT) as the API key
- Usage is tracked and billed through Clarifai
- API rate limits are subject to your Clarifai account settings
- Most OpenAI parameters are supported, but some advanced features may vary by model
FAQs
| Question | Answer | 
|---|---|
| Can I use all Clarifai models with LiteLLM? | Most chat-completion models are supported. Use the Clarifai model URL as the model. | 
| Do I need a separate Clarifai PAT? | Yes, you must use a valid Clarifai Personal Access Token. | 
| Is tool calling supported? | Yes, provided the underlying Clarifai model supports function/tool calling. | 
| How is billing handled? | Clarifai usage is billed independently via Clarifai. |