Plugin Settings
Local LLMs
Configure custom API keys and endpoints
Configure your own API key or base URL for any self-hosted model that is compatible with the OpenAI chat completions protocol.
Inline Edit and Read-only Agent behavior send requests directly to your self-hosted model, with no proxy in between.
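To check that a self-hosted model is compatible, you can verify that it serves the standard OpenAI chat completions endpoint. The sketch below (Python, using the `requests` library) sends one chat request; the base URL, API key, and model name are placeholder examples, not values used by the plugin.

```python
# Minimal compatibility check for an OpenAI-style chat completions endpoint.
# BASE_URL, API_KEY, and MODEL are placeholders; substitute your own values.
import requests

BASE_URL = "http://localhost:8000/v1"  # example: a local OpenAI-compatible server
API_KEY = "sk-your-key"                # whatever token your server expects (may be unused)
MODEL = "my-local-model"               # the model name your server exposes

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this request returns a normal chat completion, the same base URL and API key should work when entered in the plugin settings.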
Some features use a custom model:
- Apply
- Autocomplete
- Auto/Write modes
- Commit name generation
- Chat name generation
For these features, your custom endpoint settings do not apply, and your code is sent to Firebender servers for processing.
- We do not use your code data to improve our product. This includes model training, and we only work with model providers that uphold the same commitment.
- We do not store your code data beyond the lifetime of the request.
See the Firebender Code Policy for more details.