AI Providers & Models

Genum gives you full control over which LLM models your organization can use — and lets you connect your own providers alongside native ones.


Model Whitelist

The Model Whitelist lets you define exactly which LLM models are allowed in your workspace. Once configured, projects across the organization can use only approved models — including models from custom LLM providers.

Why Use a Whitelist?

  • Governance: Ensure teams only use vetted, approved models.
  • Cost control: Prevent usage of expensive models without explicit approval.
  • Compliance: Restrict models to those that meet your security or data-residency requirements.

How It Works

  1. Open Organization Settings.
  2. Go to the Model Whitelist section.
  3. Select the models you want to allow from the list of all available models (native and custom).
  4. Save the whitelist.

Once saved, only whitelisted models can be selected in prompts across all projects in the organization. Models that are not on the whitelist will not appear as options for project members.
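Genum enforces this rule itself; as a rough sketch of the effective behaviour (the model names below are placeholders, not real catalogue entries), the filtering amounts to:

```python
def selectable_models(available, whitelist):
    """Return the models project members can pick in a prompt:
    a model is selectable only if it is on the organization's
    whitelist, regardless of which provider supplies it."""
    allowed = set(whitelist)
    return [m for m in available if m in allowed]

available = [
    "gpt-4o",            # native provider
    "gpt-4o-mini",       # native provider
    "acme/llama-3-70b",  # custom provider, still needs whitelisting
]
whitelist = ["gpt-4o", "acme/llama-3-70b"]

print(selectable_models(available, whitelist))
# ['gpt-4o', 'acme/llama-3-70b']
```

Note that `gpt-4o-mini` is hidden even though its provider is connected: connecting a provider and whitelisting its models are two separate steps.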

Key Details

  • The whitelist applies organization-wide — it covers every project within the organization.
  • Custom provider models are also subject to the whitelist. Adding a custom provider does not automatically make its models available; they must be explicitly whitelisted.
  • If a previously used model is removed from the whitelist, existing prompts referencing it will need to be updated before they can be executed again.
  • Only organization admins and the owner can manage the whitelist.

Custom LLM Providers

Genum supports custom providers that implement an OpenAI-like API. This lets you connect any compatible service and manage models alongside native providers. Custom models are added at the organization level and become available in all projects within that organization.

Add a Provider

  1. Open Organization Settings.
  2. Go to LLM API Keys and select the Custom tab.
  3. Click Add Provider.
  4. Fill in the connection details:
    • Base URL of your provider
    • API Key (optional)
  5. Click Test Connection.

If the connection is successful, Genum will display the list of available models from your provider.
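OpenAI-compatible providers typically publish their catalogue at `GET {base_url}/v1/models`, which is presumably what the connection test queries. The sketch below parses a response in the standard OpenAI list shape; the model IDs and owner are placeholders.

```python
import json

# Typical body returned by GET {base_url}/v1/models on an
# OpenAI-compatible provider (IDs here are placeholders).
raw = """
{
  "object": "list",
  "data": [
    {"id": "my-model-small", "object": "model", "owned_by": "acme"},
    {"id": "my-model-large", "object": "model", "owned_by": "acme"}
  ]
}
"""

payload = json.loads(raw)
model_ids = [m["id"] for m in payload["data"]]
print(model_ids)
# ['my-model-small', 'my-model-large']
```

If your provider requires authentication, the API Key you entered is sent as a bearer token (`Authorization: Bearer <key>`) in the OpenAI convention.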

Configure Models

After adding a provider, the same page shows all discovered models. For each model you can:

  • Rename the model for easier identification
  • Tune parameters (temperature, max tokens, and other supported settings)

These settings are stored per model and used in prompt executions.
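As an illustration (the storage shape and field names here are assumptions, not Genum's actual schema), the per-model settings can be thought of as a record that is merged into an OpenAI-style request body at execution time:

```python
# Hypothetical per-model configuration stored at the organization level.
model_settings = {
    "acme/llama-3-70b": {
        "display_name": "Llama 3 70B (on-prem)",
        "params": {"temperature": 0.2, "max_tokens": 1024},
    },
}

def build_request(model_id, messages):
    """Merge the model's tuned parameters into a chat-completions payload."""
    params = model_settings[model_id]["params"]
    return {"model": model_id, "messages": messages, **params}

body = build_request(
    "acme/llama-3-70b",
    [{"role": "user", "content": "Hello"}],
)
print(body["temperature"], body["max_tokens"])
# 0.2 1024
```

The rename only affects how the model is labelled in the UI; prompt executions still address the provider by the model's original ID.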

Delete a Provider

You can remove a provider from the same LLM API Keys page. Deletion is allowed only when no prompts or productive commits reference any model from that provider.

If deletion is blocked, first update prompts and commits to use a different model, then try again.
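The deletion guard described above can be sketched as a reference check (function and variable names are illustrative, not Genum internals):

```python
def can_delete_provider(provider_models, prompt_refs, commit_refs):
    """A provider is deletable only when no prompt and no productive
    commit references any of its models. Returns (ok, blocking_models)."""
    used = set(prompt_refs) | set(commit_refs)
    blocking = used & set(provider_models)
    return len(blocking) == 0, sorted(blocking)

ok, blocking = can_delete_provider(
    provider_models=["acme/llama-3-70b", "acme/llama-3-8b"],
    prompt_refs=["gpt-4o", "acme/llama-3-8b"],  # a prompt still uses the provider
    commit_refs=["gpt-4o"],
)
print(ok, blocking)
# False ['acme/llama-3-8b']
```

In this example deletion is blocked by the prompt still pointing at `acme/llama-3-8b`; switching that prompt to another model clears the reference and allows the provider to be removed.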