Available Models

Firebender supports a wide range of AI models from different providers. Model names are case-insensitive, so you can use GPT-5, gpt-5, or Gpt-5 interchangeably.

Regular Models

  • default - Uses the fastest or smartest model for the task (Recommended)
  • quick - Lightning fast, but less accurate
  • gpt-5 - OpenAI’s latest frontier model
  • gpt-5 (priority) - GPT-5 via OpenAI Priority processing (lower latency, premium pricing)
  • qwen-3-coder-480b - Lightning fast, great for small to medium features
  • claude-4 sonnet - Best general purpose coding agent
  • gpt-5-mini - Smaller, faster variant of OpenAI’s latest frontier model
  • gpt-oss-120b - Lightning fast; OpenAI’s largest open source model
  • gemini 2.5 pro - Best for large code inputs
  • grok-4 - xAI flagship model
  • gpt-5 (low reasoning) - OpenAI’s latest frontier model with lower reasoning effort
  • gpt-5 (low reasoning, priority) - GPT-5 Low via OpenAI Priority processing (lower latency, premium pricing)
  • gpt-5 (high reasoning) - OpenAI’s latest frontier model with higher reasoning effort
  • gpt-5 (high reasoning, priority) - GPT-5 High via OpenAI Priority processing (lower latency, premium pricing)
  • claude-4 sonnet (1m) - Best general purpose coding agent with 1M context (2x pricing over 200k tokens)

Research Models

These models are optimized for deep analysis and complex reasoning tasks:
  • claude-4.1 opus - Latest version with improved performance for deep analysis
  • claude-4 opus - Better for deep analysis than sonnet, but slower coding agent
  • openai-o3-pro - Great for single queries that require deep thinking

Legacy Models

Older model versions that are still available:
  • gpt-4.1 - GPT-4.1 with a large context window
  • claude-3.7 sonnet - Claude-3.7 Sonnet
  • claude-3.5 sonnet - Claude-3.5 Sonnet
  • openai-o3 - OpenAI-o3 reasoning model
  • openai-o4-mini - OpenAI-o4-mini reasoning model
  • openai-o3-mini - OpenAI-o3-mini reasoning model
  • gpt-4o - GPT-4o multimodal model
  • grok-3 - Grok-3 model
  • deepseek v3 - DeepSeek V3
  • deepseek r1 - DeepSeek R1 reasoning model

Instructions

You can specify models using one of the following methods:
  • Deep Links: jetbrains://idea/firebender/chat?model=claude-4+sonnet (spaces become + in URLs)
  • Commands: Configure in firebender.json with "model": "claude-4 sonnet" (see the sketch after this list)
  • Plugin Interface: Select from the model dropdown in the Firebender chat interface
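For illustration, a minimal firebender.json sketch that pins the model might look like the following. This assumes "model" is accepted as a top-level key; depending on your setup it may instead belong inside a command definition, so check the configuration reference for the exact placement:

  {
    "model": "claude-4 sonnet"
  }

Model names in the config follow the same case-insensitive matching described above, so "Claude-4 Sonnet" would resolve to the same model.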