
Conversation

@CarolinNygren

Choose between models


Copilot AI left a comment


Pull request overview

This PR adds support for selecting between multiple GPT models in the chat interface. It introduces a model selector dropdown in the UI that allows users to choose from different Azure OpenAI deployments (gpt-4o, gpt-5, gpt-5-mini), and propagates the selected model through the entire request pipeline from the frontend to the OpenAI API calls.
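
To illustrate the contract that carries the selection from the frontend through the request pipeline, the extended UserPrompt in models.ts might look roughly like the sketch below. Only the optional model field is stated in this PR; the other fields and the literal union type are illustrative placeholders.

```typescript
// Hypothetical sketch of the extended prompt type (models.ts).
// Only the optional `model` field is described in the PR summary;
// everything else here is an illustrative placeholder.
export type ChatModel = "gpt-4o" | "gpt-5" | "gpt-5-mini";

export interface UserPrompt {
  message: string;           // placeholder: the user's chat input
  multimodalImage?: string;  // placeholder: optional image attachment
  model?: ChatModel;         // new: the deployment selected in the UI
}
```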

Key changes:

  • Added model selection UI component in the chat header with a dropdown selector
  • Extended backend infrastructure to support multiple OpenAI model deployments through environment variables
  • Modified OpenAI service to dynamically resolve deployment configurations based on the selected model
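
A minimal sketch of how the first two bullets could be wired together, assuming a plain React select element and a simple store setter; the names ModelSelect and updateSelectedModel follow the PR summary, but the actual code in chat-header.tsx and chat-store.tsx may differ.

```tsx
// Minimal sketch of a model selector wired to a store setter.
// Component and function names follow the PR summary; the
// implementation details are assumptions.
import * as React from "react";

export type ChatModel = "gpt-4o" | "gpt-5" | "gpt-5-mini";

// Stand-in for the chat store's selectedModel state.
let selectedModel: ChatModel = "gpt-4o";
export const updateSelectedModel = (model: ChatModel): void => {
  selectedModel = model;
};

export const ModelSelect: React.FC = () => (
  <select
    defaultValue={selectedModel}
    onChange={(e) => updateSelectedModel(e.target.value as ChatModel)}
  >
    <option value="gpt-4o">gpt-4o</option>
    <option value="gpt-5">gpt-5</option>
    <option value="gpt-5-mini">gpt-5-mini</option>
  </select>
);
```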

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 7 comments.

Summary per file:

  • src/features/common/services/openai.ts: Added a deployment override parameter and logic to look up alternative model configurations from environment variables
  • src/features/chat-page/chat-store.tsx: Added a selectedModel state property and an update method to track the user's model selection
  • src/features/chat-page/chat-services/models.ts: Extended the UserPrompt interface to include an optional model field
  • src/features/chat-page/chat-services/chat-api/chat-api.ts: Propagated the model parameter to all chat API implementations
  • src/features/chat-page/chat-services/chat-api/chat-api-rag.ts: Added the model parameter and passed it to OpenAIInstance
  • src/features/chat-page/chat-services/chat-api/chat-api-multimodal.tsx: Added the model parameter and passed it to OpenAIInstance
  • src/features/chat-page/chat-services/chat-api/chat-api-extension.ts: Added the model parameter and passed it to OpenAIInstance
  • src/features/chat-page/chat-header/chat-header.tsx: Added a ModelSelect component with a dropdown for the gpt-4o, gpt-5, and gpt-5-mini options
  • infra/resources.bicep: Added infrastructure to map additional LLM deployments to environment variables
  • infra/main.bicep: Defined default additional deployments for gpt-5 and gpt-5-mini with model versions
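
To illustrate the openai.ts and Bicep rows above: the service presumably maps the selected model to a deployment name exposed via environment variables. Below is a sketch under those assumptions, using the AzureOpenAI client from the openai npm package; the actual environment variable names and client construction in the repo may differ.

```typescript
// Sketch of resolving a deployment name from the selected model.
// The env var names below are assumptions, not necessarily the ones
// defined in infra/main.bicep and infra/resources.bicep.
import { AzureOpenAI } from "openai";

const resolveDeployment = (model?: string): string => {
  switch (model) {
    case "gpt-5":
      return process.env.AZURE_OPENAI_GPT5_DEPLOYMENT_NAME ?? "gpt-5";
    case "gpt-5-mini":
      return process.env.AZURE_OPENAI_GPT5_MINI_DEPLOYMENT_NAME ?? "gpt-5-mini";
    default:
      // Fall back to the existing default deployment (gpt-4o).
      return process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME ?? "gpt-4o";
  }
};

export const OpenAIInstance = (model?: string) =>
  new AzureOpenAI({
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    endpoint: `https://${process.env.AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com`,
    apiVersion: process.env.AZURE_OPENAI_API_VERSION,
    deployment: resolveDeployment(model),
  });
```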



Copilot AI left a comment


Pull request overview

Copilot reviewed 10 out of 10 changed files in this pull request and generated 6 comments.



Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@jakobehn-ica

LGTM :-)

@jakobehn merged commit 3f64680 into main on Dec 17, 2025
2 checks passed