Codegen offers flexibility in choosing the Large Language Model (LLM) that powers your agent: you can select from several providers and their specific models, and you can configure custom API keys and base URLs if you have your own provider agreements or need to use self-hosted models.

Configure Model Settings

Choose your LLM provider, select models, and configure custom API keys for your organization.

Accessing LLM Configuration

LLM Configuration settings are applied globally for your entire organization. You can access and modify them at codegen.com/settings/model.

This central setting ensures that all agents in your organization use the selected LLM provider and model, unless per-repository or per-agent overrides are explicitly configured (where supported by your plan).
[Screenshot: LLM Configuration UI at codegen.com/settings/model]

As shown in the UI, you can generally configure the following:
  • LLM Provider: Select the primary LLM provider you wish to use. Codegen supports major providers such as:
    • Anthropic
    • OpenAI
    • Google (Gemini)
  • LLM Model: Once a provider is selected, you can choose a specific model from that provider’s offerings (e.g., Claude 4 Sonnet, GPT-4, Gemini Pro); an illustrative mapping to provider model identifiers is sketched below.
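Under the hood, the provider and model you select correspond to provider-specific model identifiers. As an illustration only, here is roughly what an Anthropic / Claude 4 Sonnet selection maps to at the API level, using Anthropic’s Python SDK. Codegen issues calls like this on your behalf; the model identifier shown is an example Anthropic model ID, not necessarily the exact string Codegen uses internally.

```python
# Illustrative sketch: Codegen makes calls like this for you once a provider
# and model are selected in the settings UI. The model ID below is an example
# Anthropic identifier, not necessarily what Codegen uses internally.
from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # "Claude 4 Sonnet" in the settings UI
    max_tokens=1024,
    messages=[{"role": "user", "content": "Summarize this repository's README."}],
)
print(response.content[0].text)
```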

Enhanced Agent Modes

For improved agent performance, you can enable Claude Code mode, which runs agents in Anthropic’s specialized coding environment:

Claude Code Mode

Configure agents to run in the Claude Code harness for enhanced coding capabilities and superior development assistance.

Model Recommendation

While Codegen provides access to a variety of models for experimentation and specific use cases, we highly encourage the use of Anthropic’s Claude 4 Sonnet. Our internal testing and prompt engineering are heavily optimized for Claude 4 Sonnet, and it consistently delivers the best performance, reliability, and cost-effectiveness for most software engineering tasks undertaken by Codegen agents. Other models are made available primarily for users who are curious or have unique, pre-existing workflows.

Custom API Keys and Base URLs

For advanced users or those with specific enterprise agreements with LLM providers, Codegen allows you to use your own API keys and, in some cases, custom base URLs (e.g., for Azure OpenAI deployments or other proxy/gateway services).

Configure API Keys

Set up custom API keys for OpenAI, Anthropic, Google, and Grok models.
We currently support custom API keys for:
  • OpenAI - GPT-4, GPT-4 Turbo, and other OpenAI models
  • Anthropic - Claude 4 Sonnet, Claude 4 Opus, and Claude 4 Haiku
  • Google - Gemini Pro and other Google AI models
  • Grok - Grok models from xAI
How these options work:
  • Custom API Key: If you provide your own API key, usage is billed directly to your account with the respective LLM provider.
  • Custom Base URL: Routes Codegen’s LLM requests through an endpoint other than the provider’s default API, such as an Azure OpenAI deployment or an internal gateway (see the sketch below).
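To make these two options concrete, here is a minimal sketch of what a custom API key plus a custom base URL mean in practice, using the OpenAI Python client with a hypothetical gateway URL. Codegen performs the equivalent routing internally once you save these values in the settings UI; you would not normally write this code yourself.

```python
# Minimal sketch: what "custom API key" and "custom base URL" amount to.
# The gateway URL is hypothetical; substitute your Azure OpenAI deployment
# or proxy endpoint. Codegen does the equivalent routing internally.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],           # usage bills to your own provider account
    base_url="https://llm-gateway.example.com/v1",  # hypothetical proxy/gateway endpoint
)

response = client.chat.completions.create(
    model="gpt-4",  # model name as exposed by your endpoint
    messages=[{"role": "user", "content": "Hello from a custom gateway."}],
)
print(response.choices[0].message.content)
```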
Using the default Codegen-managed LLM configuration (especially with Claude 4 Sonnet) is recommended for most users to ensure optimal performance and to benefit from our continuous prompt improvements.
The availability of specific models, providers, and custom configuration options may vary based on your Codegen plan and the current platform capabilities.