Plugin Settings
Local LLMs
Configure custom API keys and endpoints
Configure your own API key or base URL for any self-hosted model that is compatible with the OpenAI chat protocol.
Chat, Edit, and Agent behavior will send requests directly to your self-hosted model, with no proxy in between.
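To be compatible, your self-hosted server must accept OpenAI-style chat completion requests. The sketch below shows the minimal shape of such a request body; the base URL, model name, and messages are placeholder assumptions, not Firebender's actual payload:

```python
import json

# Hypothetical base URL for a self-hosted OpenAI-compatible server
# (e.g. vLLM, llama.cpp, or Ollama in OpenAI-compatible mode).
BASE_URL = "http://localhost:8000/v1"

# Minimal OpenAI chat-completions request body the server must accept.
payload = {
    "model": "my-local-model",  # model name is server-specific
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Explain this Kotlin function."},
    ],
    "stream": True,  # chat clients typically stream tokens
}

print(json.dumps(payload, indent=2))
```

You can verify your server responds to this shape by POSTing it to `BASE_URL` + `/chat/completions` with your API key in an `Authorization: Bearer` header.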
Some features, such as Apply, Autocomplete, and Agent, may use a custom model. These settings won't apply to those features, and your code will be sent to Firebender servers for processing.
- We do not use your code data to improve our product (including training models), and we only use model providers that uphold the same policy
- We do not store your code data beyond the life of the request
See more in the Firebender Code Policy.