This page is the complete development guide for OpenClaw Provider plugins (model providers), aimed at developers who need to connect custom LLMs, OpenAI-compatible proxies, or private inference endpoints to OpenClaw. Starting from creating the `package.json` and manifest, then registering the provider, adding dynamic model resolution, wiring up runtime hooks, and finally attaching extra capabilities such as speech and image generation, you can build a working Provider plugin in six steps. The lobster's entire model lineup is held up by Provider plugins.
# Building an OpenClaw Provider Plugin
This guide walks you step by step through building a Provider plugin for OpenClaw, connecting any LLM to the model selection system.
If you have never built an OpenClaw plugin before, read the plugin development introduction first to learn the basic package structure and manifest configuration.
## Development steps

### Step 1: Create the package and manifest
`package.json`:

```json
{
  "name": "@myorg/openclaw-acme-ai",
  "version": "1.0.0",
  "type": "module",
  "openclaw": {
    "extensions": ["./index.ts"],
    "providers": ["acme-ai"],
    "compat": {
      "pluginApi": ">=2026.3.24-beta.2",
      "minGatewayVersion": "2026.3.24-beta.2"
    },
    "build": {
      "openclawVersion": "2026.3.24-beta.2",
      "pluginSdkVersion": "2026.3.24-beta.2"
    }
  }
}
```

`openclaw.plugin.json`:

```json
{
  "id": "acme-ai",
  "name": "Acme AI",
  "description": "Acme AI model provider",
  "providers": ["acme-ai"],
  "modelSupport": {
    "modelPrefixes": ["acme-"]
  },
  "providerAuthEnvVars": {
    "acme-ai": ["ACME_AI_API_KEY"]
  },
  "providerAuthChoices": [
    {
      "provider": "acme-ai",
      "method": "api-key",
      "choiceId": "acme-ai-api-key",
      "choiceLabel": "Acme AI API key",
      "groupId": "acme-ai",
      "groupLabel": "Acme AI",
      "cliFlag": "--acme-ai-api-key",
      "cliOption": "--acme-ai-api-key <key>",
      "cliDescription": "Acme AI API key"
    }
  ],
  "configSchema": {
    "type": "object",
    "additionalProperties": false
  }
}
```

`providerAuthEnvVars` in the manifest lets OpenClaw probe auth status without loading any plugin code; `modelSupport` lets OpenClaw auto-activate the plugin from a shorthand model id (such as `acme-large`) before runtime hooks even start. When publishing to ClawHub, the `openclaw.compat` and `openclaw.build` fields are required.
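The prefix-based activation described above can be illustrated with a small sketch. The helper below is hypothetical; it only mirrors the matching behavior of `modelSupport.modelPrefixes`, not OpenClaw's actual implementation:

```typescript
// Hypothetical sketch of prefix-based plugin activation. This mirrors the
// `modelSupport.modelPrefixes` matching described above; it is NOT OpenClaw's
// real activation code.
type PluginManifest = {
  id: string;
  modelSupport?: { modelPrefixes?: string[] };
};

// Returns the ids of plugins whose declared prefixes match a shorthand model id.
function pluginsForModelId(manifests: PluginManifest[], modelId: string): string[] {
  return manifests
    .filter((m) =>
      (m.modelSupport?.modelPrefixes ?? []).some((p) => modelId.startsWith(p)),
    )
    .map((m) => m.id);
}

// A shorthand id like "acme-large" activates the acme-ai plugin.
console.log(
  pluginsForModelId(
    [{ id: "acme-ai", modelSupport: { modelPrefixes: ["acme-"] } }],
    "acme-large",
  ),
);
```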
### Step 2: Register the provider

A minimal provider needs an `id`, `label`, `auth`, and `catalog`:
```typescript
// index.ts
import { definePluginEntry } from "openclaw/plugin-sdk/plugin-entry";
import { createProviderApiKeyAuthMethod } from "openclaw/plugin-sdk/provider-auth";

export default definePluginEntry({
  id: "acme-ai",
  name: "Acme AI",
  description: "Acme AI model provider",
  register(api) {
    api.registerProvider({
      id: "acme-ai",
      label: "Acme AI",
      docsPath: "/providers/acme-ai",
      envVars: ["ACME_AI_API_KEY"],
      auth: [
        createProviderApiKeyAuthMethod({
          providerId: "acme-ai",
          methodId: "api-key",
          label: "Acme AI API key",
          hint: "API key from your Acme AI dashboard",
          optionKey: "acmeAiApiKey",
          flagName: "--acme-ai-api-key",
          envVar: "ACME_AI_API_KEY",
          promptMessage: "Enter your Acme AI API key",
          defaultModel: "acme-ai/acme-large",
        }),
      ],
      catalog: {
        order: "simple",
        run: async (ctx) => {
          const apiKey = ctx.resolveProviderApiKey("acme-ai").apiKey;
          if (!apiKey) return null;
          return {
            provider: {
              baseUrl: "https://api.acme-ai.com/v1",
              apiKey,
              api: "openai-completions",
              models: [
                {
                  id: "acme-large",
                  name: "Acme Large",
                  reasoning: true,
                  input: ["text", "image"],
                  cost: { input: 3, output: 15, cacheRead: 0.3, cacheWrite: 3.75 },
                  contextWindow: 200000,
                  maxTokens: 32768,
                },
                {
                  id: "acme-small",
                  name: "Acme Small",
                  reasoning: false,
                  input: ["text"],
                  cost: { input: 1, output: 5, cacheRead: 0.1, cacheWrite: 1.25 },
                  contextWindow: 128000,
                  maxTokens: 8192,
                },
              ],
            },
          };
        },
      },
    });
  },
});
```

Users can now run `openclaw onboard --acme-ai-api-key <key>` and select `acme-ai/acme-large` as their model.
For the simplified case of registering a single text provider with one catalog, you can use the narrower `defineSingleProviderPluginEntry` helper:
```typescript
import { defineSingleProviderPluginEntry } from "openclaw/plugin-sdk/provider-entry";

export default defineSingleProviderPluginEntry({
  id: "acme-ai",
  name: "Acme AI",
  description: "Acme AI model provider",
  provider: {
    label: "Acme AI",
    docsPath: "/providers/acme-ai",
    auth: [
      {
        methodId: "api-key",
        label: "Acme AI API key",
        optionKey: "acmeAiApiKey",
        flagName: "--acme-ai-api-key",
        envVar: "ACME_AI_API_KEY",
        promptMessage: "Enter your Acme AI API key",
        defaultModel: "acme-ai/acme-large",
      },
    ],
    catalog: {
      buildProvider: () => ({
        api: "openai-completions",
        baseUrl: "https://api.acme-ai.com/v1",
        models: [{ id: "acme-large", name: "Acme Large" }],
      }),
    },
  },
});
```

### Step 3: Add dynamic model resolution
If your provider accepts arbitrary model IDs (for example, a proxy or router), add `resolveDynamicModel`:
```typescript
api.registerProvider({
  // ... id, label, auth, catalog
  resolveDynamicModel: (ctx) => ({
    id: ctx.modelId,
    name: ctx.modelId,
    provider: "acme-ai",
    api: "openai-completions",
    baseUrl: "https://api.acme-ai.com/v1",
    reasoning: false,
    input: ["text"],
    cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
    contextWindow: 128000,
    maxTokens: 8192,
  }),
});
```

If resolution requires a network call, use `prepareDynamicModel` for async prefetching; `resolveDynamicModel` runs again after it completes.
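The two-phase prefetch/resolve flow can be sketched with plain functions. The cache, metadata shape, and fetch step below are hypothetical stand-ins; in a real plugin OpenClaw supplies the hook contexts and drives the re-run itself:

```typescript
// Illustrative sketch of the prepareDynamicModel -> resolveDynamicModel flow.
// The cache and the "fetch" are hypothetical; this is not OpenClaw's hook API.
type ModelMeta = { contextWindow: number; maxTokens: number };

const metaCache = new Map<string, ModelMeta>();

// Phase 1: async prefetch (stands in for a network call to the upstream API).
async function prepareDynamicModel(modelId: string): Promise<void> {
  if (!metaCache.has(modelId)) {
    // A real plugin would query the provider's model-listing endpoint here.
    metaCache.set(modelId, { contextWindow: 128000, maxTokens: 8192 });
  }
}

// Phase 2: synchronous resolution, re-run once the prefetch has completed.
function resolveDynamicModel(modelId: string) {
  const meta = metaCache.get(modelId) ?? { contextWindow: 128000, maxTokens: 8192 };
  return { id: modelId, provider: "acme-ai", ...meta };
}

prepareDynamicModel("acme-beta-v3").then(() => {
  console.log(resolveDynamicModel("acme-beta-v3"));
});
```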
### Step 4: Add runtime hooks (as needed)

Most providers only need `catalog` + `resolveDynamicModel`. Shared helper builders cover the most common replay and tool-compat families, so plugins usually do not need to wire up every hook by hand:
```typescript
import { buildProviderReplayFamilyHooks } from "openclaw/plugin-sdk/provider-model-shared";
import { buildProviderStreamFamilyHooks } from "openclaw/plugin-sdk/provider-stream";
import { buildProviderToolCompatFamilyHooks } from "openclaw/plugin-sdk/provider-tools";

const GOOGLE_FAMILY_HOOKS = {
  ...buildProviderReplayFamilyHooks({ family: "google-gemini" }),
  ...buildProviderStreamFamilyHooks("google-thinking"),
  ...buildProviderToolCompatFamilyHooks("gemini"),
};

api.registerProvider({
  id: "acme-gemini-compatible",
  // ...
  ...GOOGLE_FAMILY_HOOKS,
});
```

#### Available replay families
| Family | What it wires up |
|---|---|
| `openai-compatible` | Shared OpenAI-style replay policy, including tool-call-id cleanup and assistant-first ordering fixes |
| `anthropic-by-model` | Selects a Claude-aware replay policy by `modelId`; strips thinking blocks only when the model really is a Claude id |
| `google-gemini` | Native Gemini replay policy plus bootstrap cleanup and flagged reasoning output mode |
| `passthrough-gemini` | Thought-signature cleanup for Gemini models running behind OpenAI-compatible proxies |
| `hybrid-anthropic-openai` | Providers that mix Anthropic and OpenAI-compatible model surfaces |

Built-in examples in practice: `google` uses `google-gemini`; `openrouter` and `kilocode` use `passthrough-gemini`; `amazon-bedrock` and `anthropic-vertex` use `anthropic-by-model`; `moonshot`, `ollama`, and `xai` use `openai-compatible`.
#### Available stream families
| Family | What it wires up |
|---|---|
| `google-thinking` | Gemini thinking payload normalization |
| `kilocode-thinking` | Kilo reasoning wrapper (proxy stream path); automatically skips `kilo/auto` and unsupported proxy reasoning ids |
| `moonshot-thinking` | Moonshot native thinking payload mapping (from config + `/think` levels) |
| `minimax-fast-mode` | MiniMax fast-mode model rewriting |
| `openai-responses-defaults` | OpenAI/Codex Responses native wrapper: attribution headers, `/fast`/`serviceTier`, text verbosity, native Codex web search, reasoning-compat payload shaping, and Responses context management |
| `openrouter-thinking` | OpenRouter reasoning wrapper (proxy routing); centralizes skipping of unsupported models and `auto` |
| `tool-stream-default-on` | Enables tool streaming by default (e.g. Z.AI) |

Built-in examples in practice: `google` uses `google-thinking`; `kilocode` uses `kilocode-thinking`; `moonshot` uses `moonshot-thinking`; `minimax` and `minimax-portal` use `minimax-fast-mode`; `openai` and `openai-codex` use `openai-responses-defaults`; `openrouter` uses `openrouter-thinking`; `zai` uses `tool-stream-default-on`.
#### Other common hooks

Token exchange (before every inference call):
```typescript
prepareRuntimeAuth: async (ctx) => {
  const exchanged = await exchangeToken(ctx.apiKey);
  return {
    apiKey: exchanged.token,
    baseUrl: exchanged.baseUrl,
    expiresAt: exchanged.expiresAt,
  };
},
```

Custom request headers:
```typescript
wrapStreamFn: (ctx) => {
  if (!ctx.streamFn) return undefined;
  const inner = ctx.streamFn;
  return async (params) => {
    params.headers = { ...params.headers, "X-Acme-Version": "2" };
    return inner(params);
  };
},
```

Usage queries:
```typescript
resolveUsageAuth: async (ctx) => {
  const auth = await ctx.resolveOAuthToken();
  return auth ? { token: auth.token } : null;
},
fetchUsageSnapshot: async (ctx) => {
  return await fetchAcmeUsage(ctx.token, ctx.timeoutMs);
},
```

#### All provider hooks at a glance

OpenClaw calls hooks in the following order (most providers use only two or three of them):
| Slot | Hook | When to use it |
|---|---|---|
| 1 | `catalog` | Model catalog or `baseUrl` defaults |
| 2 | `applyConfigDefaults` | Provider-wide defaults at config materialization |
| 3 | `normalizeModelId` | Legacy/preview model id alias cleanup |
| 4 | `normalizeTransport` | Provider-family `api`/`baseUrl` cleanup |
| 5 | `normalizeConfig` | Normalize the `models.providers.<id>` config |
| 10 | `resolveDynamicModel` | Accept arbitrary upstream model ids |
| 11 | `prepareDynamicModel` | Async metadata prefetch before resolution |
| 12 | `normalizeResolvedModel` | Transport rewriting before the runner |
| 19 | `createStreamFn` | Fully custom `StreamFn` transport |
| 20 | `wrapStreamFn` | Custom header/body wrapping on the normal streaming path |
| 36 | `prepareRuntimeAuth` | Token exchange before inference |
| 38 | `fetchUsageSnapshot` | Custom usage endpoints |
| 40 | `buildReplayPolicy` | Custom replay/compaction policies |
### Step 5: Add extra capabilities (optional)

Beyond text inference, Provider plugins can register speech, realtime transcription, media understanding, image generation, video generation, web fetch, and web search capabilities:
```typescript
register(api) {
  api.registerProvider({ id: "acme-ai", /* ... */ });

  api.registerSpeechProvider({
    id: "acme-ai",
    label: "Acme Speech",
    isConfigured: ({ config }) => Boolean(config.messages?.tts),
    synthesize: async (req) => ({
      audioBuffer: Buffer.from(/* PCM data */),
      outputFormat: "mp3",
      fileExtension: ".mp3",
      voiceCompatible: false,
    }),
  });

  api.registerMediaUnderstandingProvider({
    id: "acme-ai",
    capabilities: ["image", "audio"],
    describeImage: async (req) => ({ text: "A photo of..." }),
    transcribeAudio: async (req) => ({ text: "Transcript..." }),
  });

  api.registerImageGenerationProvider({
    id: "acme-ai",
    label: "Acme Images",
    generate: async (req) => ({ /* image result */ }),
  });

  api.registerWebSearchProvider({
    id: "acme-ai-search",
    label: "Acme Search",
    search: async (req) => ({ content: [] }),
  });
}
```

OpenClaw classifies such plugins as hybrid-capability plugins; this is the recommended "one vendor, one plugin" pattern.
### Step 6: Test
```typescript
// src/provider.test.ts
import { describe, it, expect } from "vitest";
import { acmeProvider } from "./provider.js";

describe("acme-ai provider", () => {
  it("resolves dynamic models", () => {
    const model = acmeProvider.resolveDynamicModel!({
      modelId: "acme-beta-v3",
    } as any);
    expect(model.id).toBe("acme-beta-v3");
    expect(model.provider).toBe("acme-ai");
  });

  it("returns the catalog when a key is present", async () => {
    const result = await acmeProvider.catalog!.run({
      resolveProviderApiKey: () => ({ apiKey: "test-key" }),
    } as any);
    expect(result?.provider?.models).toHaveLength(2);
  });

  it("returns null without a key", async () => {
    const result = await acmeProvider.catalog!.run({
      resolveProviderApiKey: () => ({ apiKey: undefined }),
    } as any);
    expect(result).toBeNull();
  });
});
```

## File structure
```
<bundled-plugin-root>/acme-ai/
├── package.json         # openclaw.providers metadata
├── openclaw.plugin.json # manifest with providerAuthEnvVars
├── index.ts             # definePluginEntry + registerProvider
└── src/
    ├── provider.test.ts # tests
    └── usage.ts         # usage endpoint (optional)
```

## Catalog order reference
`catalog.order` controls when a catalog is merged:
| Order | Timing | Use case |
|---|---|---|
| `simple` | First round | Plain API-key providers |
| `profile` | After `simple` | Providers that depend on auth profiles |
| `paired` | After `profile` | Synthesizing multiple related entries |
| `late` | Final round | Overriding existing providers (wins on collisions) |
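The round ordering in the table above implies that a later round wins when two catalogs register the same provider id. A minimal simulation of that collision rule (hypothetical; not OpenClaw's actual merge code):

```typescript
// Illustrative sketch of round-ordered catalog merging, where later rounds
// win on provider-id collisions. Hypothetical; not OpenClaw's merge code.
type CatalogOrder = "simple" | "profile" | "paired" | "late";
type CatalogEntry = { providerId: string; order: CatalogOrder };

const ROUND_ORDER: CatalogOrder[] = ["simple", "profile", "paired", "late"];

function mergeCatalogs(entries: CatalogEntry[]): Map<string, CatalogEntry> {
  const merged = new Map<string, CatalogEntry>();
  for (const round of ROUND_ORDER) {
    for (const entry of entries.filter((e) => e.order === round)) {
      merged.set(entry.providerId, entry); // a later round overwrites an earlier one
    }
  }
  return merged;
}

// A "late" entry overrides a "simple" entry for the same provider id.
const winner = mergeCatalogs([
  { providerId: "acme-ai", order: "late" },
  { providerId: "acme-ai", order: "simple" },
]).get("acme-ai");
console.log(winner?.order); // "late"
```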
## Publishing to ClawHub
```bash
clawhub package publish your-org/your-plugin --dry-run
clawhub package publish your-org/your-plugin
```

Use `clawhub package publish` when publishing plugin packages; do not use the legacy skill-only alias.
## Related documentation
## FAQ
Q: My provider is an OpenAI-compatible proxy. Which replay family is the best fit?
A: Usually `openai-compatible`: it provides an OpenAI-style replay policy with tool-call-id cleanup and assistant-first ordering fixes, which suits most OpenAI-compatible endpoints. If Gemini models run behind the proxy, add `passthrough-gemini` to handle thought-signature cleanup.
Q: What is the difference between `defineSingleProviderPluginEntry` and `definePluginEntry` + `registerProvider`?
A: `defineSingleProviderPluginEntry` is a narrower helper dedicated to the simple case of "exactly one text provider + API-key auth + catalog-backed runtime", saving a lot of boilerplate. If your plugin also needs to register speech, image generation, or other extra capabilities, use `definePluginEntry`.
Q: When should I use `resolveDynamicModel` versus `prepareDynamicModel`?
A: `resolveDynamicModel` is synchronous and suits cases where model metadata can be returned directly (for example, a proxy passing through arbitrary ids). `prepareDynamicModel` is asynchronous and is for cases where resolution needs a network request first (such as fetching model info from an upstream API); once it completes, `resolveDynamicModel` runs again with the full data available.