gpt 5 o gpt 5 mini #12
Conversation
Pull request overview
This PR adds support for selecting between multiple GPT models in the chat interface. It introduces a model selector dropdown in the UI that allows users to choose from different Azure OpenAI deployments (gpt-4o, gpt-5, gpt-5-mini), and propagates the selected model through the entire request pipeline from the frontend to the OpenAI API calls.
Key changes:
- Added a model selection UI component in the chat header with a dropdown selector (see the sketch after this list)
- Extended backend infrastructure to support multiple OpenAI model deployments through environment variables
- Modified OpenAI service to dynamically resolve deployment configurations based on the selected model
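The sketch below illustrates how the dropdown and store wiring described above might fit together. The names (`ChatModel`, `chatStore`, `ModelSelect`) and the plain-React state handling are assumptions for illustration, not the repository's actual implementation.

```tsx
// Hypothetical sketch: a model selector dropdown that records the user's
// choice in a minimal store object. Component and store names are assumed.
import * as React from "react";

export type ChatModel = "gpt-4o" | "gpt-5" | "gpt-5-mini";

// Minimal stand-in for the chat store's selectedModel state.
export const chatStore = {
  selectedModel: "gpt-4o" as ChatModel,
  updateSelectedModel(model: ChatModel) {
    this.selectedModel = model;
  },
};

export const ModelSelect = () => {
  const [model, setModel] = React.useState<ChatModel>(chatStore.selectedModel);

  const onChange = (e: React.ChangeEvent<HTMLSelectElement>) => {
    const next = e.target.value as ChatModel;
    setModel(next);
    chatStore.updateSelectedModel(next);
  };

  return (
    <select value={model} onChange={onChange} aria-label="Model">
      <option value="gpt-4o">gpt-4o</option>
      <option value="gpt-5">gpt-5</option>
      <option value="gpt-5-mini">gpt-5-mini</option>
    </select>
  );
};
```

Whatever store library the repository actually uses, the essential point is the same: the selected model becomes part of the chat request payload so the backend can pick the matching deployment.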
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| src/features/common/services/openai.ts | Added a deployment override parameter and logic to look up alternative model configurations from environment variables (see the sketch after this table) |
| src/features/chat-page/chat-store.tsx | Added selectedModel state property and update method to track user's model selection |
| src/features/chat-page/chat-services/models.ts | Extended UserPrompt interface to include optional model field |
| src/features/chat-page/chat-services/chat-api/chat-api.ts | Propagated model parameter to all chat API implementations |
| src/features/chat-page/chat-services/chat-api/chat-api-rag.ts | Added model parameter and passed it to OpenAIInstance |
| src/features/chat-page/chat-services/chat-api/chat-api-multimodal.tsx | Added model parameter and passed it to OpenAIInstance |
| src/features/chat-page/chat-services/chat-api/chat-api-extension.ts | Added model parameter and passed it to OpenAIInstance |
| src/features/chat-page/chat-header/chat-header.tsx | Added ModelSelect component with dropdown for gpt-4o, gpt-5, and gpt-5-mini options |
| infra/resources.bicep | Added infrastructure to map additional LLM deployments to environment variables |
| infra/main.bicep | Defined default additional deployments for gpt-5 and gpt-5-mini with model versions |
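To make the override mechanism concrete, here is a minimal sketch of how the selected model might be resolved to an Azure OpenAI deployment and threaded in from the `UserPrompt`. The environment variable names (`AZURE_OPENAI_API_DEPLOYMENT_NAME`, `AZURE_OPENAI_API_VERSION`, `AZURE_OPENAI_ADDITIONAL_DEPLOYMENTS`), the JSON map layout, and the `UserPrompt` shape are assumptions, not the exact contract defined by the bicep templates or `models.ts`.

```ts
// Hypothetical sketch of resolving a deployment for the selected model.
// Env-var names and the JSON map layout are assumptions for illustration.

export type ChatModel = "gpt-4o" | "gpt-5" | "gpt-5-mini";

interface DeploymentConfig {
  deployment: string; // Azure OpenAI deployment name to call
  apiVersion: string; // API version to use with that deployment
}

export const resolveDeployment = (model?: ChatModel): DeploymentConfig => {
  // Default deployment, used when no override is supplied.
  const fallback: DeploymentConfig = {
    deployment: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME ?? "gpt-4o",
    apiVersion: process.env.AZURE_OPENAI_API_VERSION ?? "2024-08-01-preview",
  };

  if (!model) return fallback;

  // Additional deployments exposed as a JSON map, e.g.
  // {"gpt-5":{"deployment":"gpt-5","apiVersion":"2024-12-01-preview"}}
  const raw = process.env.AZURE_OPENAI_ADDITIONAL_DEPLOYMENTS;
  if (!raw) return fallback;

  try {
    const map = JSON.parse(raw) as Record<string, DeploymentConfig>;
    return map[model] ?? fallback;
  } catch {
    return fallback; // Malformed JSON: fall back to the default deployment
  }
};

// Simplified stand-in for the extended UserPrompt from models.ts.
interface UserPrompt {
  message: string;
  model?: ChatModel; // Optional model field added by this PR
}

// The chat API implementations would pass prompt.model down to the
// OpenAI instance; this helper shows the resolution step in isolation.
export const deploymentForPrompt = (prompt: UserPrompt): DeploymentConfig =>
  resolveDeployment(prompt.model);
```

In the real code, the resolved deployment name and API version would feed whatever client constructor `OpenAIInstance` wraps, which is why each chat API variant (RAG, multimodal, extension) only needs to forward the model parameter.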
Copilot reviewed 10 out of 10 changed files in this pull request and generated 6 comments.
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
LGTM :-)
Choose between models