One place for provider access
Providers stay close to the stack instead of being copied into each app environment by hand.
This is the stack-wide hub for provider connections. It brings together AI providers, chat integrations, private networking, repository providers and CI/CD, product integrations and secrets, storage and registries, and email delivery used by instances created from this stack.
Keeping providers at the stack level makes new environments and instances faster to launch: the same integration model is reused across environments instead of being rebuilt from scratch each time.
Reuse the same setup across environments
Keep one provider model for development, staging, and production instead of reconnecting services every time.
Provider pages can go deeper later
Each provider can have its own stack-specific page without bloating the main stack page.
Keep the OpenClaw gateway connected to model providers, chat channels, and private networking from one reusable stack-level integration layer.
Attach your OpenAI API credentials to the OpenClaw gateway without hardcoding secrets.
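Avoiding hardcoded secrets usually means injecting the key at deploy time. A minimal sketch, assuming the platform mounts the secret as a file (the file path here is a stand-in, not the stack's actual secret store; `OPENAI_API_KEY` is the variable OpenAI's SDKs read by convention):

```shell
# Hypothetical sketch: load the OpenAI key from a mounted secret file
# instead of hardcoding it in the image or in version control.
secret_file="$(mktemp)"                       # stand-in for a mounted secret
printf 'sk-example-not-a-real-key' > "$secret_file"

# Export for the gateway process to inherit; the key never appears
# in the codebase or in the container image.
export OPENAI_API_KEY="$(cat "$secret_file")"

[ -n "$OPENAI_API_KEY" ] && echo "credentials loaded"
```

The same pattern applies to the Anthropic and Gemini keys: each arrives as a runtime secret, never as a literal in configuration files.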
Use Anthropic as an OpenClaw provider through a reusable integration in Wodby.
Connect Gemini credentials to the gateway when you want Google models available in OpenClaw.
Wire Telegram into the stack for chat-based workflows and notifications where supported.
Connect Discord credentials for OpenClaw-related messaging and bot workflows.
Keep the OpenClaw gateway private by default and reachable only inside your tailnet.
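With Tailscale, tailnet-only exposure can be sketched roughly as below. This assumes `tailscaled` is already running on the host; the hostname and port are illustrative, not taken from the stack's actual configuration:

```shell
# Hypothetical sketch: make the gateway reachable only by tailnet peers.
tailscale up --hostname=openclaw-gateway   # join the tailnet under a readable name
tailscale serve --bg 3000                  # proxy localhost:3000 to tailnet peers only
tailscale serve status                     # confirm nothing is publicly exposed via Funnel
```

Because `tailscale serve` (unlike `tailscale funnel`) never publishes the service to the open internet, the gateway stays private by default.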