
OpenAI o3-mini support #1830

Open
Boleris opened this issue Feb 3, 2025 · 1 comment

Comments

Boleris commented Feb 3, 2025

Bug Description
There is no support for the o3-mini model in the application. When attempting to use a custom model, the following error is displayed:

Connection to OpenAI API failed. This typically occurs due to incorrect configuration or OpenAI API account issues. Please check your settings and verify your OpenAI API account status.

API Error: Status Code 400, { "error": { "message": "Unsupported parameter: 'temperature' is not supported with this model.", "type": "invalid_request_error", "param": "temperature", "code": "unsupported_parameter" } }

Steps to Reproduce

  1. Configure a custom model using the o3-mini model in the application.
  2. Attempt to interact with the model.
  3. Observe the error message displayed.

Expected Results
The o3-mini model should be supported: the application should either handle it without errors or omit the unsupported parameters from the request.
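
For reference, a minimal sketch of a request that o3-mini accepts (using the official `openai` Python package; the application's actual client code and stack may differ) simply leaves `temperature` out of the request:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# o3-mini rejects the `temperature` parameter, so it is not passed at all
response = client.chat.completions.create(
    model="o3-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```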


Screenshots

(screenshot of the error message)

IGLOU-EU commented Feb 3, 2025

Hi, o3, like o1, does not support parameters such as temperature. You need to implement it like o1-mini, but note that o1-mini does not support the reasoning parameter. See PR #1818.
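
A minimal sketch of how the client code could gate these parameters by model family (a hypothetical `build_chat_kwargs` helper, not this project's actual code, and assuming a recent `openai` SDK that accepts `reasoning_effort`):

```python
from openai import OpenAI

client = OpenAI()

def build_chat_kwargs(model: str, messages: list, temperature: float = 0.7,
                      reasoning_effort: str | None = None) -> dict:
    """Drop parameters that reasoning models (o1/o3 family) reject."""
    kwargs = {"model": model, "messages": messages}
    is_reasoning_model = model.startswith(("o1", "o3"))
    if not is_reasoning_model:
        kwargs["temperature"] = temperature            # classic chat models accept temperature
    elif reasoning_effort and model.startswith("o3"):
        kwargs["reasoning_effort"] = reasoning_effort  # o3-mini accepts this, o1-mini does not
    return kwargs

response = client.chat.completions.create(
    **build_chat_kwargs("o3-mini",
                        [{"role": "user", "content": "Hello"}],
                        reasoning_effort="medium")
)
print(response.choices[0].message.content)
```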
