Stacks/OpenClaw/Integrations

Integrations for the OpenClaw stack

This is the stack-wide hub for provider connections. It brings together AI providers, chat integrations, private networking, repository providers and CI/CD, product integrations and secrets, storage and registries, and email delivery used by instances created from this stack.

Keeping these connections at the stack level makes new environments and instances easier to launch: the same integration model is reused across environments instead of being rebuilt from scratch.

One place for provider access

Providers stay close to the stack instead of being copied into each app environment by hand.

Reuse the same setup across environments

Keep one provider model for development, staging, and production instead of reconnecting services every time.

Provider pages can go deeper later

Each provider can have its own stack-specific page without bloating the main stack page.

AI, chat, and private networking integrations

Keep the OpenClaw gateway connected to model providers, chat channels, and private networking from one reusable stack-level integration layer.

OpenAI

Attach your OpenAI API credentials to the OpenClaw gateway without hardcoding secrets.

Anthropic

Use Anthropic as an OpenClaw provider through a reusable integration in Wodby.

Gemini

Connect Gemini credentials to the gateway when you want Google models available in OpenClaw.
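The three model providers above authenticate differently at the HTTP level. As a minimal sketch of keeping keys out of code, the function below builds each provider's auth headers from environment variables; the variable names (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY`) are the providers' common conventions, not something this stack mandates.

```python
import os

def provider_headers(provider: str) -> dict:
    """Build HTTP auth headers for a model provider from environment
    variables so no key is ever hardcoded. Variable names follow the
    providers' usual conventions, not a stack requirement."""
    if provider == "openai":
        # OpenAI uses a standard Bearer token.
        return {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}
    if provider == "anthropic":
        # Anthropic uses x-api-key plus a required API version header.
        return {
            "x-api-key": os.environ["ANTHROPIC_API_KEY"],
            "anthropic-version": "2023-06-01",
        }
    if provider == "gemini":
        # Gemini accepts the key via the x-goog-api-key header.
        return {"x-goog-api-key": os.environ["GEMINI_API_KEY"]}
    raise ValueError(f"unknown provider: {provider}")
```

A stack-level integration layer can hold these values once and inject them into each environment, so development, staging, and production all read the same variable names.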

Telegram

Wire Telegram into the stack for chat-based workflows and notifications where supported.
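Telegram's Bot API embeds the bot token in the request URL, which makes hardcoding especially easy to do by accident. A small sketch of building a `sendMessage` request with the token read from the environment (`TELEGRAM_BOT_TOKEN` is an assumed variable name, not one this stack defines):

```python
import os

TELEGRAM_API = "https://api.telegram.org"

def telegram_send_message_request(chat_id: str, text: str) -> tuple:
    """Build the Bot API sendMessage URL and JSON payload. The bot
    token comes from TELEGRAM_BOT_TOKEN (an assumed variable name)
    rather than being hardcoded into the URL."""
    token = os.environ["TELEGRAM_BOT_TOKEN"]
    url = f"{TELEGRAM_API}/bot{token}/sendMessage"
    return url, {"chat_id": chat_id, "text": text}
```

The returned URL and payload can then be sent with any HTTP client; keeping the token in the integration layer means rotating it never touches application code.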

Discord

Connect Discord credentials for OpenClaw-related messaging and bot workflows.
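For simple notifications, Discord's incoming webhooks accept a JSON POST with a `content` field. A hedged sketch of constructing that request with the webhook URL supplied by the environment (`DISCORD_WEBHOOK_URL` is an illustrative variable name):

```python
import json
import os
import urllib.request

def discord_webhook_request(content: str) -> urllib.request.Request:
    """Build a POST request to a Discord incoming webhook. The webhook
    URL is read from DISCORD_WEBHOOK_URL (an assumed variable name);
    the webhook API accepts a JSON body with a "content" field."""
    url = os.environ["DISCORD_WEBHOOK_URL"]
    body = json.dumps({"content": content}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Full bot workflows use token-authenticated API calls instead, but the same principle applies: credentials live in the integration, not in the code.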

Tailscale

Keep the OpenClaw gateway private by default and reachable only inside your tailnet.
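As an operational sketch (not a prescribed setup), "private by default" can mean the gateway binds only to localhost and is exposed to tailnet peers via Tailscale, with no public ingress. The hostname, auth key variable, and port below are illustrative assumptions:

```shell
# Join the tailnet non-interactively with a pre-generated auth key.
# TS_AUTHKEY is an illustrative variable; keys are issued from the
# Tailscale admin console.
tailscale up --authkey="${TS_AUTHKEY}" --hostname=openclaw-gateway

# Expose the gateway's local port to tailnet peers only.
# Port 8080 is an assumed local port for the gateway.
tailscale serve --bg http://127.0.0.1:8080
```

With this shape, anyone outside the tailnet sees nothing listening, while authorized devices reach the gateway by its tailnet hostname.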