OpenClaw Feature February 10, 2026 · PR #11106

OpenClaw Adds Custom Provider Onboarding: Local AI Gets First-Class Support

A new configuration flow in OpenClaw makes it dramatically easier to connect self-hosted and custom AI providers. For privacy-conscious users and enterprises running local models, this removes a significant barrier to adoption.

About the Contributor

Blossom contributed the custom/local API configuration flow, while Gustavo Madeira Santana followed up with UX improvements renaming "Custom API Endpoint" to the clearer "Custom Provider" terminology. Together, these changes represent a significant investment in supporting diverse AI infrastructure.

Why This Matters

OpenClaw has grown to nearly 179,000 stars by making AI assistants accessible across platforms. But until now, connecting to anything beyond the major cloud providers (Anthropic, OpenAI, etc.) required manual configuration file editing—a barrier for many users.
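For context, "manual configuration file editing" typically meant hand-authoring a settings file before the app would talk to a local server. A hypothetical sketch of such a file (the path and every key name here are illustrative, not OpenClaw's actual schema):

```json
{
  "provider": "custom",
  "baseUrl": "http://localhost:11434",
  "apiKey": "",
  "model": "llama3"
}
```

Getting any one of these fields wrong, say, a stray trailing slash or the wrong port, produced opaque failures with no guided way to diagnose them.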

This PR introduces a guided onboarding flow for custom providers, bringing the same polish that exists for Anthropic or OpenAI to self-hosted setups such as a local Ollama server.

Technical Implementation

The Configuration Flow

The new onboarding flow guides users through:

  1. Provider selection — Choose "Custom Provider" from the provider list
  2. Endpoint configuration — Enter the base URL for your API (e.g., http://localhost:11434 for Ollama)
  3. Authentication — Optional API key if your provider requires it
  4. Model selection — List available models or manually specify model names
  5. Validation — Test the connection before saving
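Under the hood, a wizard like this reduces to collecting and normalizing a small provider record. A minimal TypeScript sketch of that shape (the interface and field names are illustrative assumptions, not OpenClaw's actual internals):

```typescript
// Hypothetical shape of what the onboarding wizard collects.
interface CustomProviderConfig {
  name: string;       // display name, e.g. "Local Ollama"
  baseUrl: string;    // API root, e.g. http://localhost:11434
  apiKey?: string;    // optional: many local servers need none
  models: string[];   // listed from the server or entered manually
}

// Normalize user input: trim whitespace and trailing slashes so
// "http://localhost:11434/" and "http://localhost:11434" behave the same.
function buildProviderConfig(
  name: string,
  baseUrl: string,
  models: string[],
  apiKey?: string,
): CustomProviderConfig {
  const cleaned = baseUrl.trim().replace(/\/+$/, "");
  return { name, baseUrl: cleaned, models, ...(apiKey ? { apiKey } : {}) };
}
```

For example, `buildProviderConfig("Local Ollama", "http://localhost:11434/", ["llama3"])` yields a record whose `baseUrl` has the trailing slash stripped, so later path concatenation is predictable.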

Connection Validation

The flow includes automatic endpoint testing, catching common issues like incorrect ports, missing authentication, or unreachable hosts before users save their configuration.
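A connection check of this kind usually splits into a cheap local sanity pass and a live network probe. A hedged TypeScript sketch, not OpenClaw's actual code (the `/v1/models` path assumes an OpenAI-compatible server such as Ollama's compatibility endpoint):

```typescript
// Local sanity check: catch malformed input before any network call.
// Returns an error message, or null if the URL looks plausible.
function checkBaseUrl(raw: string): string | null {
  let url: URL;
  try {
    url = new URL(raw.trim());
  } catch {
    return "not a valid URL";
  }
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    return "expected http:// or https://";
  }
  return null; // plausible, but still needs a live probe
}

// Network probe: ask the server to list its models. Unreachable hosts
// and wrong ports surface as thrown errors; missing auth as a 401.
async function probeEndpoint(baseUrl: string, apiKey?: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/v1/models`, {
    headers: apiKey ? { Authorization: `Bearer ${apiKey}` } : {},
  });
  if (res.status === 401) throw new Error("authentication required");
  if (!res.ok) throw new Error(`unexpected status ${res.status}`);
  const body = await res.json();
  return body.data.map((m: { id: string }) => m.id);
}
```

Note that `checkBaseUrl("localhost:11434")` fails the protocol check: without an explicit `http://`, the URL parser treats `localhost:` as a scheme, which is exactly the kind of mistake worth catching before saving.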

Terminology Refinement

The follow-up commit from Gustavo renames "Custom API Endpoint" to "Custom Provider" throughout the UI. This subtle change clarifies that users are adding a provider (like Anthropic or OpenAI), not just an endpoint—reinforcing that custom providers are first-class citizens in OpenClaw.

Enterprise Implications

For enterprise deployments, this feature is particularly significant.

Combined with the GitHub Enterprise Cloud support for the Copilot provider, also merged today, OpenClaw is clearly prioritizing enterprise deployment scenarios.

The Broader Context

This change reflects a maturing understanding of how AI assistants will be deployed. The early assumption—that everyone would call cloud APIs—is giving way to a more nuanced reality in which models increasingly run on local machines and self-hosted infrastructure.

By making custom providers a first-class experience, OpenClaw positions itself as infrastructure-agnostic—a key differentiator as the AI assistant market matures.

What's Next