ChatCloudflareWorkersAI

Workers AI allows you to run machine learning models on the Cloudflare network from your own code.

Usage

You'll first need to install the LangChain Cloudflare integration package:

npm install @langchain/cloudflare
tip

We're unifying model params across all packages. We now suggest using model instead of modelName, and apiKey for API keys.

import { ChatCloudflareWorkersAI } from "@langchain/cloudflare";

const model = new ChatCloudflareWorkersAI({
model: "@cf/meta/llama-2-7b-chat-int8", // Default value
cloudflareAccountId: process.env.CLOUDFLARE_ACCOUNT_ID,
cloudflareApiToken: process.env.CLOUDFLARE_API_TOKEN,
// Pass a custom base URL to use Cloudflare AI Gateway
// baseUrl: `https://gateway.ai.cloudflare.com/v1/{YOUR_ACCOUNT_ID}/{GATEWAY_NAME}/workers-ai/`,
});

const response = await model.invoke([
["system", "You are a helpful assistant that translates English to German."],
["human", `Translate "I love programming".`],
]);

console.log(response);

/*
AIMessage {
  content: `Sure! Here's the translation of "I love programming" into German:\n` +
    '\n' +
    '"Ich liebe Programmieren."\n' +
    '\n' +
    'In this sentence, "Ich" means "I," "liebe" means "love," and "Programmieren" means "programming."',
  additional_kwargs: {}
}
*/
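Like any LangChain chat model, ChatCloudflareWorkersAI can be composed with other runnables. Here's a minimal sketch of piping it behind a prompt template; the template variables (input_language, output_language, input) are illustrative, not part of the example above:

import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ],
  ["human", "{input}"],
]);

// .pipe() chains the prompt and model into a single runnable sequence.
const chain = prompt.pipe(model);

const translation = await chain.invoke({
  input_language: "English",
  output_language: "German",
  input: `Translate "I love programming".`,
});

console.log(translation.content);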

const stream = await model.stream([
["system", "You are a helpful assistant that translates English to German."],
["human", `Translate "I love programming".`],
]);

for await (const chunk of stream) {
  console.log(chunk);
}

/*
AIMessageChunk {
  content: 'S',
  additional_kwargs: {}
}
AIMessageChunk {
  content: 'ure',
  additional_kwargs: {}
}
AIMessageChunk {
  content: '!',
  additional_kwargs: {}
}
AIMessageChunk {
  content: ' Here',
  additional_kwargs: {}
}
...
*/
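Chunks can also be aggregated as they arrive. A minimal sketch, opening a fresh stream since the one above has already been consumed, and assuming the chunk content is a plain string (as it is for this model):

const jokeStream = await model.stream([
  ["human", "Tell me a short joke."],
]);

let aggregated = "";
for await (const chunk of jokeStream) {
  // The typeof guard keeps this type-safe: content can also be a structured array.
  if (typeof chunk.content === "string") {
    aggregated += chunk.content;
  }
}

console.log(aggregated);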

API Reference: ChatCloudflareWorkersAI from @langchain/cloudflare

Tool calling

note

Tool calling is only available in @langchain/cloudflare version 0.0.7 and above.
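If you installed the package earlier, you can upgrade to the latest version with:

npm install @langchain/cloudflare@latest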

Cloudflare's API now supports tool calling! The example below demonstrates how to invoke and stream tool calls.

import { ChatCloudflareWorkersAI } from "@langchain/cloudflare";
import {
  AIMessageChunk,
  HumanMessage,
  SystemMessage,
} from "@langchain/core/messages";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const model = new ChatCloudflareWorkersAI({
model: "@hf/nousresearch/hermes-2-pro-mistral-7b",
cloudflareAccountId: process.env.CLOUDFLARE_ACCOUNT_ID,
cloudflareApiToken: process.env.CLOUDFLARE_API_TOKEN,
// Pass a custom base URL to use Cloudflare AI Gateway
// baseUrl: `https://gateway.ai.cloudflare.com/v1/{YOUR_ACCOUNT_ID}/{GATEWAY_NAME}/workers-ai/`,
});

const weatherSchema = z.object({
  location: z.string().describe("The location to get the weather for"),
});

const weatherTool = tool<typeof weatherSchema>(
  (input) => {
    return `The weather in ${input.location} is sunny.`;
  },
  {
    name: "get_weather",
    description: "Get the weather",
    // Without the schema, the model has no argument definitions to call with.
    schema: weatherSchema,
  }
);

const modelWithTools = model.bindTools([weatherTool]);

const inputMessages = [
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("What's the weather like in the North Pole?"),
];

const response = await modelWithTools.invoke(inputMessages);

console.log(response.tool_calls);

/*
[ { name: 'get_weather', args: { input: 'North Pole' } } ]
*/
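To complete the loop, you can run the tool with the model's arguments and pass the result back as a ToolMessage. A minimal sketch; note that this model nests the argument under an input key rather than the schema's location key, and returns no tool call id, so the fallback id below is a made-up placeholder:

import { ToolMessage } from "@langchain/core/messages";

const toolCall = response.tool_calls?.[0];
if (toolCall !== undefined) {
  // Execute the tool ourselves with the argument the model produced.
  const result = await weatherTool.invoke({
    location: toolCall.args.input,
  });

  const followup = await modelWithTools.invoke([
    ...inputMessages,
    response,
    new ToolMessage({
      content: result,
      // Placeholder id, since the model did not return one.
      tool_call_id: toolCall.id ?? "get_weather_0",
    }),
  ]);

  console.log(followup.content);
}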

const stream = await modelWithTools.stream(inputMessages);

let finalChunk: AIMessageChunk | undefined;
for await (const chunk of stream) {
console.log("chunk: ", chunk.content);
if (!finalChunk) {
finalChunk = chunk;
} else {
finalChunk = finalChunk.concat(chunk);
}
}

/*
chunk: <
chunk: tool
chunk: _
chunk: call
chunk: >
chunk: \n
chunk: {'
chunk: arguments
chunk: ':
chunk: {'
chunk: input
chunk: ':
chunk: '
chunk: N
chunk: orth
chunk: P
chunk: ole
chunk: '},
chunk: '
chunk: name
chunk: ':
chunk: '
chunk: get
chunk: _
chunk: we
chunk: ather
chunk: '}
chunk: \n
chunk: </
chunk: tool
chunk: _
chunk: call
chunk: >
chunk: <|im_end|>
*/

console.log(finalChunk?.tool_calls);

/*
[
  { name: 'get_weather', args: { input: 'North Pole' }, id: undefined }
]
*/

API Reference: ChatCloudflareWorkersAI from @langchain/cloudflare; AIMessageChunk, HumanMessage and SystemMessage from @langchain/core/messages; tool from @langchain/core/tools

