Claude models are typically accessed directly through Anthropic's API. But for enterprise customers, that creates friction: API keys to manage, traffic leaving their cloud environment, separate billing relationships to maintain. This commit changes that equation.
With Vertex AI routing, organizations running on Google Cloud can now access Claude models through Google's infrastructure. The traffic stays within GCP, billing consolidates to a single vendor, and existing IAM policies apply. For enterprises already invested in Google Cloud, this removes a significant adoption barrier.
The implementation adds a new routing layer to OpenClaw's provider system:
```typescript
// Provider routing now checks for Vertex AI configuration
if (model.startsWith('claude-') && config.vertexAI?.enabled) {
  return routeThroughVertex(request);
}
```
The key technical change is this conditional check in the provider layer: requests for Claude models are diverted to Vertex AI whenever the configuration enables it, and fall back to the direct Anthropic API otherwise.
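As a rough sketch of what `routeThroughVertex` has to target: Google addresses Anthropic publisher models on Vertex AI through a region- and project-scoped URL, with the model name in the path rather than the request body. The helper below builds that URL; the function and config shape are illustrative, not OpenClaw's actual internals.

```typescript
// Sketch: construct the Vertex AI endpoint for a Claude model request.
// The URL pattern follows Google's documented format for Anthropic
// publisher models on Vertex AI; the config interface is illustrative.
interface VertexConfig {
  projectId: string;
  region: string;
}

function vertexEndpoint(cfg: VertexConfig, model: string, stream: boolean): string {
  // Vertex AI exposes rawPredict for unary calls and streamRawPredict
  // for streaming responses.
  const verb = stream ? 'streamRawPredict' : 'rawPredict';
  return (
    `https://${cfg.region}-aiplatform.googleapis.com/v1` +
    `/projects/${cfg.projectId}/locations/${cfg.region}` +
    `/publishers/anthropic/models/${model}:${verb}`
  );
}
```

Authentication then rides on standard Google Cloud credentials (an OAuth access token from the environment's service account) instead of an Anthropic API key, which is what lets existing IAM policies apply.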
This capability matters most for organizations with existing cloud commitments:
"We have enterprise agreements with Google Cloud. Adding another vendor for AI creates procurement overhead, compliance questions, and billing complexity. Vertex AI routing eliminates all of that." — Enterprise user feedback
The pattern extends beyond Google Cloud. OpenClaw already supports Azure OpenAI for accessing OpenAI models through Microsoft's infrastructure. This Vertex AI addition brings Claude models to the same level of enterprise integration.
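One way to picture that shared abstraction is a routing decision keyed on model family plus per-cloud toggles. All names below are hypothetical; OpenClaw's real provider system may be structured differently.

```typescript
// Sketch of a provider routing decision: each model family has a
// direct API route and an optional cloud-hosted route. Hypothetical
// names; OpenClaw's actual internals may differ.
type Route = 'anthropic-api' | 'vertex-ai' | 'azure-openai' | 'openai-api';

interface RoutingOptions {
  vertexEnabled: boolean; // route Claude models via Google Cloud
  azureEnabled: boolean;  // route OpenAI models via Azure
}

function selectRoute(model: string, opts: RoutingOptions): Route {
  if (model.startsWith('claude-')) {
    return opts.vertexEnabled ? 'vertex-ai' : 'anthropic-api';
  }
  if (model.startsWith('gpt-')) {
    return opts.azureEnabled ? 'azure-openai' : 'openai-api';
  }
  throw new Error(`unknown model family: ${model}`);
}
```

The symmetry is the point: callers pick a model, and the abstraction picks the wire.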
Enabling Vertex AI routing requires minimal configuration:
```yaml
# In your OpenClaw config
providers:
  anthropic:
    vertexAI:
      enabled: true
      projectId: "your-gcp-project"
      region: "us-central1"
```
Once configured, requests for Claude models automatically route through Vertex AI. No code changes needed — the provider abstraction handles it transparently.
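One concrete detail the abstraction has to absorb, per the Vertex AI request format for Anthropic models: because the model lives in the URL, the request body drops its `model` field and instead carries an `anthropic_version` marker. A minimal sketch (helper name is illustrative):

```typescript
// Sketch: adapt a direct-Anthropic-API request body for Vertex AI.
// On Vertex, the model is addressed in the endpoint URL, and the body
// carries anthropic_version instead of model. Helper is illustrative.
function toVertexBody(body: Record<string, unknown>): Record<string, unknown> {
  const { model, ...rest } = body; // model moves into the endpoint URL
  return { anthropic_version: 'vertex-2023-10-16', ...rest };
}
```

This is the kind of translation the provider layer performs so that calling code never notices which route was taken.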
Beyond the immediate integration, this commit opens further enterprise deployment paths.
As AI assistants become enterprise infrastructure, the ability to deploy within existing cloud environments — rather than alongside them — becomes a competitive necessity.