
feat(auto-edit): fix the temperature value regression with the auto-edit #6848

Merged
merged 2 commits into from
Jan 28, 2025

Conversation

hitesh-1997
Contributor

@hitesh-1997 hitesh-1997 commented Jan 28, 2025

Context

  1. We received multiple reports describing inconsistent model outputs in the "auto-edit" feature.
  2. We initially used a temperature value of 0.2. On December 17, we set it to 0 in the [client-side PR](https://github.com/sourcegraph/cody/pull/6363/files) (released December 22 pre-release, December 29 stable) for more consistent generation.
  3. The [issue arises in cody-gateway](https://github.com/sourcegraph/sourcegraph/blob/main/cmd/cody-gateway/internal/httpapi/completions/upstream.go#L318): when the request is JSON-marshaled, any zero-valued fields (such as temperature=0) are dropped before being sent to Fireworks (see the sketch after this list).
  4. Because no temperature is passed, [Fireworks defaults to a temperature of 1](https://docs.fireworks.ai/api-reference/post-completions#body-temperature), which is significantly higher than intended and leads to inconsistent outputs.
  5. This PR changes the value from 0 to 0.1. To decide on the value, we ran an offline evaluation on a couple of prompts across ~100 requests and got consistent output for all of them.
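
For illustration, here is a minimal Go sketch of the marshaling behavior described in point 3. The struct name, fields, and values are hypothetical stand-ins for the real cody-gateway request type, but the `omitempty` behavior of `encoding/json` is what drops the zero-valued temperature:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// fireworksRequest is a hypothetical stand-in for the upstream request
// struct in cody-gateway; the json tags mirror the common pattern of
// marking optional fields with omitempty.
type fireworksRequest struct {
	Model       string  `json:"model,omitempty"`
	Prompt      string  `json:"prompt,omitempty"`
	Temperature float32 `json:"temperature,omitempty"`
}

func main() {
	req := fireworksRequest{
		Model:       "example-model",
		Prompt:      "example prompt",
		Temperature: 0, // the client-requested value
	}

	body, _ := json.Marshal(req)
	// omitempty treats 0 as "empty", so the temperature key is omitted
	// entirely and Fireworks falls back to its own default of 1.
	fmt.Println(string(body))
	// Output: {"model":"example-model","prompt":"example prompt"}
}
```

With the value changed to 0.1, the field survives marshaling and the upstream request carries the intended near-deterministic temperature.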

Test plan

  1. Fixed the test cases and CI
  2. Tested locally on the prompt to ensure the output is consistent.

Contributor

@umpox umpox left a comment

@hitesh-1997
Contributor Author

I think we can consider this approved for backport via this message from @beyang: https://sourcegraph.slack.com/archives/C089TSH61KR/p1738017476230839?thread_ts=1738014461.934859&cid=C089TSH61KR

Thanks @umpox !!

@umpox umpox merged commit b585001 into main Jan 28, 2025
21 checks passed
@umpox umpox deleted the hitesh/auto-edit-fix-inconsistency-regression branch January 28, 2025 16:13
umpox pushed a commit that referenced this pull request Jan 28, 2025
…ession with the auto-edit (#6852)

Backport b585001 from #6848

Co-authored-by: Hitesh Sagtani <sagtanih@gmail.com>
umpox pushed a commit that referenced this pull request Jan 28, 2025
…regression with the auto-edit (#6851)

Backport b585001 from #6848

Co-authored-by: Hitesh Sagtani <sagtanih@gmail.com>