feat(auto-edit): fix the temperature value regression with the auto-edit #6848
Conversation
I think we can consider this approved for backport via this message from @beyang: https://sourcegraph.slack.com/archives/C089TSH61KR/p1738017476230839?thread_ts=1738014461.934859&cid=C089TSH61KR
Thanks @umpox!!
…ession with the auto-edit (#6852)

## Context

1. We received multiple reports describing inconsistent model outputs in the "auto-edit" feature.
2. We initially used a temperature value of 0.2. On December 17, we set it to 0 in the [client-side PR](https://github.com/sourcegraph/cody/pull/6363/files) (released December 22 pre-release, December 29 stable) for more consistent generation.
3. The [issue arises in cody-gateway](https://github.com/sourcegraph/sourcegraph/blob/main/cmd/cody-gateway/internal/httpapi/completions/upstream.go#L318): when the request is JSON-marshaled, any zero-valued fields (like temperature=0) are dropped before being sent to Fireworks.
4. Because no temperature is passed, [Fireworks defaults to a temperature of 1](https://docs.fireworks.ai/api-reference/post-completions#body-temperature), which is significantly higher than intended and leads to inconsistent outputs.
5. This PR changes the value from 0 to 0.1. To choose the value, we ran an offline evaluation on a couple of prompts across ~100 requests and got consistent output for all of them.

## Test plan

1. Fixed the test cases and CI.
2. Local testing on the prompt to ensure the output is consistent.

Backport b585001 from #6848

Co-authored-by: Hitesh Sagtani <sagtanih@gmail.com>
…regression with the auto-edit (#6851): duplicate backport commit message, identical to #6852 above (same backport of b585001 from #6848).