This repository was archived by the owner on Jul 22, 2025. It is now read-only.
[Bug] Stream doesn't work with Deep Research model #322
Open
Labels
bug (Something isn't working)
Description
Describe the Bug
We tested the streaming API with the same code across several models. With the Deep Research model, the first streamed token arrives after almost the same delay as the complete non-streaming response.
Expected Behavior
Tokens should be streamed incrementally with the Deep Research model.
Actual Behavior
With the Deep Research model, the time to the first streamed token is almost the same as the total response time without streaming.
Steps to Reproduce
import OpenAI from 'openai';

async function noStream(model: string) {
  const client = new OpenAI({
    apiKey: process.env['PERPLEXITY_API_KEY'],
    baseURL: 'https://api.perplexity.ai',
  });
  const t1 = new Date().getTime();
  await client.chat.completions.create({
    messages: [
      {
        role: 'system',
        content: 'Be precise and concise.',
      },
      {
        role: 'user',
        content: 'How many stars are there in our galaxy?',
      },
    ],
    model,
    stream: false,
  });
  const t2 = new Date().getTime();
  console.log(`Model: ${model}, response time: ${t2 - t1}ms`);
}

async function stream(model: string) {
  const client = new OpenAI({
    apiKey: process.env['PERPLEXITY_API_KEY'],
    baseURL: 'https://api.perplexity.ai',
  });
  const t1 = new Date().getTime();
  const stream = await client.chat.completions.create({
    messages: [
      {
        role: 'system',
        content: 'Be precise and concise.',
      },
      {
        role: 'user',
        content: 'How many stars are there in our galaxy?',
      },
    ],
    model,
    stream: true,
  });
  // Measure time to the first streamed chunk, then stop reading.
  for await (const _chunk of stream) {
    const t2 = new Date().getTime();
    console.log(`Model: ${model}, first token received in: ${t2 - t1}ms`);
    break;
  }
}

const models = [
  'sonar',
  'sonar-pro',
  'sonar-reasoning',
  'sonar-reasoning-pro',
  'sonar-deep-research',
];

console.log('No stream:');
for (const model of models) {
  await noStream(model);
}
console.log('Stream:');
for (const model of models) {
  await stream(model);
}

API Request & Response (if applicable)
Environment
- API Version: sonar-deep-research
- SDK (if applicable): I used the OpenAI Node client for the chat completions API, but it should be the same with a raw HTTP call (see the sketch after this list).
- Operating System: macOS
- Authentication Type: API Key
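
For reference, here is a minimal sketch of the same first-token timing check over a raw HTTP call, with no SDK. It assumes Node 18+ global fetch and the OpenAI-compatible chat completions endpoint at https://api.perplexity.ai/chat/completions with SSE streaming; streamRaw is just an illustrative helper name and is not part of the reproduction script above.

// Hypothetical sketch: time to the first raw SSE chunk, assuming Node 18+ fetch
// and the OpenAI-compatible /chat/completions endpoint with stream: true.
async function streamRaw(model: string) {
  const t1 = Date.now();
  const res = await fetch('https://api.perplexity.ai/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env['PERPLEXITY_API_KEY']}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model,
      stream: true,
      messages: [
        { role: 'system', content: 'Be precise and concise.' },
        { role: 'user', content: 'How many stars are there in our galaxy?' },
      ],
    }),
  });
  // Measure time to the first chunk of the response body, not to the full response.
  const reader = res.body!.getReader();
  await reader.read();
  console.log(`Model: ${model}, first chunk received in: ${Date.now() - t1}ms`);
  await reader.cancel();
}

If streaming worked for the Deep Research model, the first chunk here should arrive well before the total non-streaming response time measured by noStream above.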
Logs or Screenshots (if applicable)
