Migrate to latest openai code #13
base: main
Conversation
lazeratops left a comment:
```python
import openai
from openai import OpenAI

client = OpenAI()
```
This appears to only be used from AzureAIService; if so I'd suggest making it a class variable.
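A minimal sketch of what the reviewer is suggesting: move the module-level client onto the service class so it is created once and scoped to its only consumer. `AzureAIService` is the class named in the review; the client object here is a stand-in so the sketch runs offline (the real code would assign `OpenAI()`).

```python
class AzureAIService:
    # Created once and shared by all instances of the service.
    # Stand-in for: client = OpenAI()
    client = object()

    def __init__(self):
        # Methods reach the shared client through the class,
        # so no new client is built per instance.
        self.client = type(self).client
```

The class attribute keeps the client out of module scope while still constructing it a single time.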
```python
import os
from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
```
This appears to only be used from OpenAIService; if so, I'd suggest making it a class variable. It also looks like our earlier reference to the key was using the variable name OPEN_AI_KEY - was this meant to be that?
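The review flags two spellings of the key variable in play: `OPENAI_API_KEY` here versus `OPEN_AI_KEY` elsewhere in the diff. As an illustration only, a small hypothetical helper that accepts either spelling would make the mismatch fail loudly rather than silently passing `None` as the key:

```python
import os

def resolve_openai_key():
    """Return the API key, accepting either spelling seen in the PR.

    Hypothetical helper, not part of the PR; it just shows how the
    OPENAI_API_KEY / OPEN_AI_KEY discrepancy could be made harmless
    until the naming is settled.
    """
    key = os.getenv("OPENAI_API_KEY") or os.getenv("OPEN_AI_KEY")
    if key is None:
        raise RuntimeError("set OPENAI_API_KEY (or legacy OPEN_AI_KEY)")
    return key
```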
```python
image = client.images.generate(api_type = 'openai',
    api_version = '2020-11-07',
        api_base = "https://api.openai.com/v1",
            api_key = os.getenv("OPEN_AI_KEY"),
```
If we specify the API key when constructing the client, I'd presume this might no longer be needed. (let's clean up the indentation as well!)
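A sketch of the point being made: in the v1-style SDK the connection settings live on the client, so the per-call arguments shrink to request parameters only. The stub below stands in for `client.images` so the example runs offline; the `model` and `prompt` values are illustrative, not taken from the PR.

```python
class _StubImages:
    """Stand-in for client.images; records what a call would send."""

    def generate(self, **request_params):
        # The real SDK would send these as the request body; the client
        # object already carries api_key / base_url, so none of
        # api_type / api_version / api_base / api_key appear here.
        return request_params

images = _StubImages()
params = images.generate(model="dall-e-3", prompt="a watercolor fox")
```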
```python
if chunk["choices"][0]["delta"]["content"] != {}: #streaming a content chunk
    next_chunk = chunk["choices"][0]["delta"]["content"]
if chunk.choices[0].delta.content:
    if chunk.choices[0].delta.content != {}: #streaming a content chunk
```
This is more of a note for us to follow up on, as I realize it is not in scope of this PR (so more for @chadbailey59 and @Moishe, maybe!), but the repeated list and property access here is redundant. I suggest, in a follow-up PR, accessing the required elements once for reuse (here and below), e.g.:
```python
choices = chunk.choices
if not choices:
    continue
delta = choices[0].delta
if delta.content:
    # ... etc.
```

(Hastily written, just to give an idea of what I mean.)
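A runnable version of the refactor the reviewer is sketching: read `chunk.choices` and the delta once each, then branch. `SimpleNamespace` stands in for the SDK's chunk objects so this runs without the `openai` package installed; the function name is hypothetical.

```python
from types import SimpleNamespace

def next_content(chunk):
    """Return the streamed text for this chunk, or None.

    Accesses chunk.choices and the delta exactly once each, instead
    of repeating the full chained lookup on every condition.
    """
    choices = chunk.choices
    if not choices:          # nothing streamed in this chunk
        return None
    delta = choices[0].delta
    return delta.content or None

# Fake chunks shaped like the v1 SDK's streaming objects:
chunk = SimpleNamespace(
    choices=[SimpleNamespace(delta=SimpleNamespace(content="Hello"))]
)
empty = SimpleNamespace(choices=[])
```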
```python
    stream=stream,
        messages=messages
            )
response = client.chat.completions.create(api_type = 'azure',
```
Suggest just cleaning up the indentation here for readability (and below).
It's great to have this updated code. I'm testing locally and the EDIT: @lazeratops already noted this: https://github.com/daily-demos/llm-talk/pull/13/files#r1412789861
Fixed a bunch of errors that were breaking the demo. Almost all of them are around handling the updated data structures in the openai SDK.
(Credits to @jmtame for pairing on this together.)