
Commit 01fca9a

fix: updates to the ai sdk provider
1 parent ff77e1f commit 01fca9a

2 files changed: +418 -315 lines changed

README.md

Lines changed: 63 additions & 0 deletions
@@ -21,6 +21,7 @@ We've renamed from "llmclient" to "ax" to highlight our focus on powering agenti

- Build Agents that can call other agents
- Convert docs of any format to text
- RAG, smart chunking, embedding, querying
- Works with Vercel AI SDK
- Output validation while streaming
- Multi-modal DSPy supported
- Automatic prompt tuning using optimizers
@@ -289,6 +290,68 @@ if (tag === "technicalSupport") {

## Vercel AI SDK Integration

Install the ax provider package:

```shell
npm i @ax-llm/ax-ai-sdk-provider
```

Then use it with the AI SDK. You can use either the AI provider or the Agent provider.

```typescript
const ai = new AxAI({
  name: 'openai',
  apiKey: process.env['OPENAI_APIKEY'] ?? "",
});

// Create an AI SDK model using the provider
const model = new AxAIProvider(ai);

export const foodAgent = new AxAgent(ai, {
  name: 'food-search',
  description:
    'Use this agent to find restaurants based on what the customer wants',
  signature,
  functions
})

// Get the mutable AI state from the Vercel AI SDK
const aiState = getMutableAIState()

// Wrap the agent in a provider so the AI SDK can use it as a tool
const foodAgentProvider = new AxAgentProvider({
  agent: foodAgent,
  updateState: (state) => {
    aiState.done({ ...aiState.get(), state })
  },
  generate: async ({ restaurant, priceRange }) => {
    return (
      <BotCard>
        <h1>{restaurant as string} {priceRange as string}</h1>
      </BotCard>
    )
  }
})

// Use with streamUI, a core part of building chat UIs with the AI SDK
const result = await streamUI({
  model,
  initial: <SpinnerMessage />,
  messages: [
    // ...
  ],
  text: ({ content, done, delta }) => {
    // ...
  },
  tools: {
    // @ts-ignore
    'find-food': foodAgentProvider,
  }
})
```
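The `updateState` callback above merges the agent's result into the existing AI state. A minimal, self-contained sketch of that merge pattern, using a hypothetical stand-in for the AI SDK's `getMutableAIState` (not the real API):

```typescript
// Hypothetical minimal stand-in for the AI SDK's mutable AI state,
// only to illustrate what the `updateState` callback does.
type AIState = Record<string, unknown>;

function makeMutableState(initial: AIState) {
  let current = initial;
  return {
    get: () => current,
    done: (next: AIState) => { current = next; },
  };
}

const aiState = makeMutableState({ messages: [] });

// The provider invokes updateState with the agent's result state;
// the callback merges it into the existing AI state without dropping
// what was already there.
const updateState = (state: unknown) => {
  aiState.done({ ...aiState.get(), state });
};

updateState({ restaurant: 'Noma', priceRange: '$$$' });
console.log(aiState.get());
// → { messages: [], state: { restaurant: 'Noma', priceRange: '$$$' } }
```

The spread keeps prior keys (here `messages`) intact while attaching the agent's state, which is why the real callback spreads `aiState.get()` before adding `state`.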
354+
## OpenTelemetry support

The ability to trace and observe your LLM workflow is critical to building production systems. OpenTelemetry is an industry standard, and we support the new `gen_ai` attribute namespace.
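To illustrate the namespace, here is a sketch of the kind of span attributes an instrumented LLM call might record. The attribute names follow the OpenTelemetry GenAI semantic conventions; the values are made up, and this is not ax's actual instrumentation output:

```typescript
// Example attributes from the OpenTelemetry `gen_ai` semantic
// conventions; values are illustrative only.
const spanAttributes: Record<string, string | number> = {
  'gen_ai.system': 'openai',
  'gen_ai.request.model': 'gpt-4o',
  'gen_ai.usage.input_tokens': 128,
  'gen_ai.usage.output_tokens': 256,
};

// Every GenAI attribute lives under the shared `gen_ai.` prefix,
// so tracing backends can group LLM telemetry uniformly.
const allNamespaced = Object.keys(spanAttributes)
  .every((k) => k.startsWith('gen_ai.'));
console.log(allNamespaced); // → true
```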
