Bug Description
There is no support for the o3-mini model in the application. When attempting to use a custom model, the following error is displayed:
Connection to OpenAI API failed. This typically occurs due to incorrect configuration or OpenAI API account issues. Please check your settings and verify your OpenAI API account status.
API Error: Status Code 400, { "error": { "message": "Unsupported parameter: 'temperature' is not supported with this model.", "type": "invalid_request_error", "param": "temperature", "code": "unsupported_parameter" } }
Steps to Reproduce
Please provide the steps to reproduce the bug:
Configure a custom model using the o3-mini model in the application.
Attempt to interact with the model.
Observe the error message displayed.
Expected Results
The o3-mini model should be supported, and the application should handle it without errors, or the unsupported parameters should not be sent in the request.
Screenshots
Hi, o3, like o1, does not support parameters such as temperature. You need to handle it the same way as o1-mini, but note that o1-mini does not support the reasoning parameter. See PR #1818.
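A minimal sketch of the approach described above: filter out parameters that reasoning models reject before building the request. The helper name, the prefix-based model check, and the parameter names are illustrative assumptions, not the project's actual implementation.

```python
# Sketch (assumed helper, not the project's real code): build kwargs for
# the OpenAI Chat Completions API, dropping parameters that reasoning
# models (o1/o3 families) reject with a 400 "unsupported_parameter" error.

REASONING_MODEL_PREFIXES = ("o1", "o3")  # assumed heuristic for illustration


def build_chat_params(model, messages, temperature=None, reasoning_effort=None):
    """Return kwargs safe to pass to client.chat.completions.create()."""
    params = {"model": model, "messages": messages}
    is_reasoning = model.startswith(REASONING_MODEL_PREFIXES)

    if temperature is not None and not is_reasoning:
        # Reasoning models reject 'temperature', so only send it otherwise.
        params["temperature"] = temperature

    if reasoning_effort is not None and is_reasoning and model != "o1-mini":
        # Per the comment above, o1-mini does not accept the reasoning
        # parameter either, so it is excluded here.
        params["reasoning_effort"] = reasoning_effort

    return params
```

With such a filter in place, a request for o3-mini would simply omit temperature instead of forwarding it and triggering the 400 error shown in the bug report.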