🔨 chore: New API support switch Responses API mode #9776
Codecov Report: ✅ All modified and coverable lines are covered by tests.

```diff
@@            Coverage Diff             @@
##             main    #9776      +/-   ##
==========================================
- Coverage   83.66%   81.86%    -1.80%
==========================================
  Files         950      807     -143
  Lines       65889    54414   -11475
  Branches     8037     5009    -3028
==========================================
- Hits        55126    44547   -10579
+ Misses      10763     9867     -896
==========================================
```
Reviewer's Guide

This PR overhauls Responses API mode support: it removes the legacy handlePayload, introduces a `payload.apiMode` switch backed by a `useResponseModels` whitelist in the OpenAI-compatible factory, and propagates `supportResponsesApi` flags through the UI and provider configurations, alongside updated tests, routes, and localization.

Entity relationship diagram for custom provider SDK options:

```mermaid
erDiagram
    CUSTOM_PROVIDER_SDK_OPTIONS {
        string label
        string value
    }
    CUSTOM_PROVIDER_SDK_OPTIONS ||--o| ProviderSettings : "sets sdkType"
    ProviderSettings {
        string sdkType
        boolean supportResponsesApi
    }
```
Class diagram for OpenAI-compatible runtime payload processing:

```mermaid
classDiagram
    class OpenAICompatibleRuntime {
        +createOpenAICompatibleRuntime~T~()
        -_options
        +chat(payload)
    }
    class ChatPayload {
        +model: string
        +messages: Message[]
        +temperature: number
        +apiMode: string
    }
    OpenAICompatibleRuntime --> ChatPayload : processes
    class ChatCompletionOptions {
        +useResponse: boolean
        +useResponseModels: Array~string or RegExp~
    }
    OpenAICompatibleRuntime --> ChatCompletionOptions : uses
    ChatPayload <|-- ProcessedPayload
    class ProcessedPayload {
        +apiMode: 'responses' or undefined
    }
```
Class diagram for provider configuration changes:

```mermaid
classDiagram
    class ModelProviderCard {
        +settings: ProviderSettings
        +url: string
    }
    class ProviderSettings {
        +sdkType: string
        +showModelFetcher: boolean
        +supportResponsesApi: boolean
    }
    ModelProviderCard --> ProviderSettings
    class CreateAiProviderParams {
        +settings: ProviderSettings
        +name: string
        +id: string
    }
    CreateAiProviderParams --> ProviderSettings
```
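The switch-plus-whitelist flow summarized in the guide can be sketched as follows. This is a minimal illustration, not the factory's actual code: the names `useResponseModels` and `apiMode` come from the guide's summary, while the helper names and matching rule (substring for strings, `test` for regexes) are assumptions.

```typescript
// Hypothetical sketch of the apiMode resolution described in the guide:
// a request uses the Responses API only when the user switch is ON and
// the model matches an entry in the useResponseModels whitelist.
type ApiMode = 'responses' | undefined;

const matchesWhitelist = (model: string, whitelist: Array<string | RegExp>): boolean =>
  whitelist.some((entry) =>
    typeof entry === 'string' ? model.includes(entry) : entry.test(model),
  );

const resolveApiMode = (
  model: string,
  switchOn: boolean,
  whitelist: Array<string | RegExp>,
): ApiMode => {
  if (!switchOn) return undefined; // switch OFF: always Chat Completions
  return matchesWhitelist(model, whitelist) ? 'responses' : undefined;
};

console.log(resolveApiMode('gpt-4o', true, [/^gpt-/])); // 'responses'
console.log(resolveApiMode('claude-3', true, [/^gpt-/])); // undefined
```

The review feedback below touches exactly these branches: the switch-ON-but-not-whitelisted case is where the `apiMode: undefined` question arises.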
Hey there - I've reviewed your changes - here's some feedback:
- Consider extracting the new payload processing (userApiMode + whitelist logic) into a standalone helper function to simplify createOpenAICompatibleRuntime and improve readability.
- Add targeted unit tests for the new apiMode switch logic (ON/OFF + whitelist behavior) in createOpenAICompatibleRuntime to validate all branching scenarios.
- The CUSTOM_PROVIDER_SDK_OPTIONS entry using value 'router' for New API could be renamed to a more explicit key (e.g. 'newapi') to avoid confusion with other router types.
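For illustration, the kind of entry the last point refers to might look like this. The array shape and surrounding values are hypothetical; only the `value: 'router'` detail comes from the review.

```typescript
// Hypothetical shape of the SDK options list discussed above; the actual
// entry uses value 'router' for New API, which the review suggests renaming.
interface ProviderSdkOption {
  label: string;
  value: string;
}

const CUSTOM_PROVIDER_SDK_OPTIONS: ProviderSdkOption[] = [
  { label: 'OpenAI', value: 'openai' },
  { label: 'New API', value: 'router' }, // suggested rename: 'newapi'
];

const newApi = CUSTOM_PROVIDER_SDK_OPTIONS.find((o) => o.label === 'New API');
console.log(newApi?.value); // 'router'
```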
## Individual Comments
### Comment 1
<location> `packages/model-runtime/src/core/openaiCompatibleFactory/index.ts:238` </location>
<code_context>
+ // Keep apiMode: 'responses'
+ } else {
+ log('using Chat Completions API mode (switch ON but model not in whitelist)');
+ processedPayload = { ...payload, apiMode: undefined } as any;
+ }
} else {
</code_context>
<issue_to_address>
**suggestion:** Setting apiMode to undefined may not fully remove the property.
Assigning apiMode: undefined keeps the key in the object. If consumers require the property to be absent, use a method to remove it from the payload.
Suggested implementation:
```typescript
processedPayload = { ...payload } as any;
delete processedPayload.apiMode;
```
</issue_to_address>
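A quick plain-TypeScript illustration of the point in Comment 1, independent of the PR's code: spreading with an explicit `undefined` keeps the key on the object, while destructuring (or `delete`) removes it.

```typescript
// Spreading with an explicit `undefined` keeps the key on the object,
// which matters for consumers using the `in` operator or Object.keys.
const payload = { model: 'gpt-4o', apiMode: 'responses' as string | undefined };

const spread = { ...payload, apiMode: undefined };
console.log('apiMode' in spread); // true — key still present

const { apiMode: _dropped, ...rest } = payload; // or: delete after spreading
console.log('apiMode' in rest); // false — key removed
```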
### Comment 2
<location> `src/app/(backend)/webapi/stt/openai/route.ts:43` </location>
<code_context>
- const res = await createOpenaiAudioTranscriptions({ openai: openaiOrErrResponse, payload });
+ const res = await createOpenaiAudioTranscriptions({
+ openai: openaiOrErrResponse as any,
+ payload,
+ });
</code_context>
<issue_to_address>
**suggestion:** Using 'as any' for openai type may hide type errors.
Refining the type or validating the object's shape before casting will help maintain type safety and prevent hidden errors.
Suggested implementation:
```typescript
// if resOrOpenAI is a Response, it means there is an error, just return it
if (openaiOrErrResponse instanceof Response) return openaiOrErrResponse;

// Validate the shape of openaiOrErrResponse before passing it on
function isOpenAIClient(obj: unknown): obj is OpenAIClient {
  return (
    typeof obj === 'object' &&
    obj !== null &&
    // Add checks for required properties/methods of OpenAIClient
    typeof (obj as OpenAIClient).createTranscription === 'function'
  );
}

if (!isOpenAIClient(openaiOrErrResponse)) {
  return new Response(JSON.stringify({ error: 'Invalid OpenAI client object' }), {
    status: 400,
  });
}

const res = await createOpenaiAudioTranscriptions({
  openai: openaiOrErrResponse,
  payload,
});
return new Response(JSON.stringify(res), {
  headers: {
```
- You may need to define or import the `OpenAIClient` type/interface if it does not already exist in your codebase. Adjust the property/method checks in `isOpenAIClient` to match the actual shape of your OpenAI client.
- If `createTranscription` is not the correct method/property, update the type guard accordingly.
</issue_to_address>
### Comment 3
<location> `src/app/(backend)/webapi/tts/openai/route.ts:37` </location>
<code_context>
if (openaiOrErrResponse instanceof Response) return openaiOrErrResponse;
- return await createOpenaiAudioSpeech({ openai: openaiOrErrResponse, payload });
+ return await createOpenaiAudioSpeech({ openai: openaiOrErrResponse as any, payload });
};
</code_context>
<issue_to_address>
**suggestion:** Casting openaiOrErrResponse to 'any' may reduce type safety.
Consider validating or asserting the type of openaiOrErrResponse before passing it to createOpenaiAudioSpeech to maintain type safety.
Suggested implementation:
```typescript
// if resOrOpenAI is a Response, it means there is an error, just return it
if (openaiOrErrResponse instanceof Response) return openaiOrErrResponse;

// Type guard to check whether openaiOrErrResponse is a valid OpenAI instance
function isOpenAIInstance(obj: unknown): obj is OpenAI {
  // Adjust this check based on your OpenAI class/interface
  return typeof obj === 'object' && obj !== null && 'audio' in obj;
}

if (!isOpenAIInstance(openaiOrErrResponse)) {
  return new Response('Invalid OpenAI instance', { status: 500 });
}

return await createOpenaiAudioSpeech({ openai: openaiOrErrResponse, payload });
};
```
- You may need to import or define the `OpenAI` type/interface at the top of the file if it's not already present.
- Adjust the `isOpenAIInstance` logic to match the actual shape of your OpenAI instance.
</issue_to_address>
…adjust the provider icon mapping logic in the …r component
🔧 fix: Correct the free-model flag logic so that whether a model is free is determined correctly
💻 Change Type
🔀 Description of Change
📝 Additional Information
Summary by Sourcery
Add switchable Responses API mode support for New API and AiHubMix providers by replacing hardcoded payload handling with configurable whitelist settings, updating factory logic to respect user toggles, extending provider configs and UI, and adjusting tests accordingly.