
missing_required_parameter response_format.json_schema using example code in Unity (C# 9.0) #205

Closed
3 tasks done
Swah opened this issue Sep 1, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@Swah

Swah commented Sep 1, 2024

Confirm this is not an issue with the OpenAI Python Library

  • This is not an issue with the OpenAI Python Library

Confirm this is not an issue with the underlying OpenAI API

  • This is not an issue with the OpenAI API

Confirm this is not an issue with Azure OpenAI

  • This is not an issue with Azure OpenAI

Describe the bug

I always get the same error message as soon as I try to use any variation of ChatResponseFormat.CreateJsonSchemaFormat:

Couldn't get response from ChatGPT: System.ClientModel.ClientResultException: HTTP 400 (invalid_request_error: missing_required_parameter)
Parameter: response_format.json_schema
Missing required parameter: 'response_format.json_schema'.
  at OpenAI.ClientPipelineExtensions.ProcessMessageAsync (System.ClientModel.Primitives.ClientPipeline pipeline, System.ClientModel.Primitives.PipelineMessage message, System.ClientModel.Primitives.RequestOptions options) [0x0015e] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 
  at System.Threading.Tasks.ValueTask`1[TResult].get_Result () [0x0001b] in <27124aa0e30a41659b903b822b959bc7>:0 
  at OpenAI.Chat.ChatClient.CompleteChatAsync (System.ClientModel.BinaryContent content, System.ClientModel.Primitives.RequestOptions options) [0x000ad] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 
  at OpenAI.Chat.ChatClient.CompleteChatAsync (System.Collections.Generic.IEnumerable`1[T] messages, OpenAI.Chat.ChatCompletionOptions options, System.Threading.CancellationToken cancellationToken) [0x00152] in <92cde0f04c3a467caa9bf3e15fe4f0c8>:0 

In the code below, if I use chatOptions.ResponseFormat = ChatResponseFormat.JsonObject instead, I don't get an error message, but then of course I don't get to specify the schema.

Another user was also having issues in Unity, but their error message was unclear, so it's hard to know whether it's the same issue. In any case, whatever fixed it should be documented.

To Reproduce

Run the code below in Unity 2022.3.22f1.
The library was installed using NuGet for Unity.

Code snippets

var client = new OpenAI.Chat.ChatClient(model: model.Or(DEFAULT_CHAT_GPT_MODEL), apiKey);
var chatOptions = new ChatCompletionOptions();
chatOptions.ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
    name: "math_reasoning",
    jsonSchema: BinaryData.FromString(
        "{\n" +
        "    \"type\": \"object\",\n" +
        "    \"properties\": {\n" +
        "        \"steps\": {\n" +
        "            \"type\": \"array\",\n" +
        "            \"items\": {\n" +
        "                \"type\": \"object\",\n" +
        "                \"properties\": {\n" +
        "                    \"explanation\": { \"type\": \"string\" },\n" +
        "                    \"output\": { \"type\": \"string\" }\n" +
        "                },\n" +
        "                \"required\": [\"explanation\", \"output\"],\n" +
        "                \"additionalProperties\": false\n" +
        "            }\n" +
        "        },\n" +
        "        \"final_answer\": { \"type\": \"string\" }\n" +
        "    },\n" +
        "    \"required\": [\"steps\", \"final_answer\"],\n" +
        "    \"additionalProperties\": false\n" +
        "}"),
    strictSchemaEnabled: true);
var chatMessage = new UserChatMessage(query);
return client.CompleteChatAsync(new List<UserChatMessage> { chatMessage }, chatOptions);
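
For readability, the escaped string passed to BinaryData.FromString above corresponds to this JSON schema (shown here only as a reference; it is the same content, unescaped):

```json
{
    "type": "object",
    "properties": {
        "steps": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "explanation": { "type": "string" },
                    "output": { "type": "string" }
                },
                "required": ["explanation", "output"],
                "additionalProperties": false
            }
        },
        "final_answer": { "type": "string" }
    },
    "required": ["steps", "final_answer"],
    "additionalProperties": false
}
```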

OS

Windows

.NET version

.NET Standard 2.1

Library version

2.0.0-beta.10

@Swah Swah added the bug Something isn't working label Sep 1, 2024
@Swah
Author

Swah commented Sep 2, 2024

Just adding that using function calls seems to work (see #193):

private ChatTool GetSchema<T>() => ChatTool.CreateFunctionTool(
    functionName: typeof(T).Name, // nameof(T) would evaluate to the literal string "T", not the actual type name
    functionDescription: "This is a format to create the novel character that the user wants.",
    functionParameters: BinaryData.FromString(CreateSchema<T>()));

ChatCompletionOptions options = new()
{
    Tools = { GetSchema<T>() },
    ToolChoice = ChatToolChoice.Required,
};

client.CompleteChatAsync(new List<UserChatMessage> { chatMessage }, options);

So the issue could be with ChatResponseFormat.CreateJsonSchemaFormat itself, or with how chat response formats are serialized internally compared to function calls. It also seems like Unity / C# 9.0 could be involved.

@joseharriaga
Collaborator

Thank you for reaching out, @Swah ! Based on the information that you and @helloSalmon provided here and in the other thread (#193), we were able to identify the issue and track it back to this issue in the .NET runtime: 🔗 dotnet/runtime#103365.

I implemented a mitigation as part of this PR: 🔗 #206. I confirmed that this fixes the problem in Unity. ChatResponseFormat works as expected now.

We also just pushed a release today, so you can go grab this fix starting with version 2.0.0-beta.11:
🔗 https://www.nuget.org/packages/OpenAI/2.0.0-beta.11
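
For anyone managing the dependency through a .csproj rather than the NuGetForUnity UI, picking up the fix is just a version bump. A sketch (assuming a standard PackageReference-style project file):

```xml
<!-- Bump the OpenAI package to the release containing the mitigation from #206 -->
<ItemGroup>
  <PackageReference Include="OpenAI" Version="2.0.0-beta.11" />
</ItemGroup>
```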

@Swah
Author

Swah commented Sep 4, 2024

@joseharriaga that was one of the quickest resolutions of a ticket I've ever experienced, thanks :)

I can confirm beta.11 seems to fix the issue for me, using chat completion instead of function calls. Thanks again!
