feat(auto-edit): fix the temperature value regression with the auto-edit (#6848)

## Context
1. We received multiple reports describing inconsistent model outputs in
the "auto-edit" feature.
2. We initially used a temperature value of 0.2. On December 17, we set
it to 0 in the [client-side
PR](https://github.com/sourcegraph/cody/pull/6363/files) (released
December 22 pre-release, December 29 stable) for more consistent
generation.
3. The [issue arises in
cody-gateway](https://github.com/sourcegraph/sourcegraph/blob/main/cmd/cody-gateway/internal/httpapi/completions/upstream.go#L318):
when JSON-marshaling the request, any zero-valued field (like
temperature=0) is dropped before the request is sent to Fireworks.
4. Because no temperature is passed, [Fireworks defaults to a
temperature of
1](https://docs.fireworks.ai/api-reference/post-completions#body-temperature),
which is significantly higher than intended and leads to inconsistent
outputs.
5. This PR changes the value from 0 to 0.1. To choose the value, we ran
an offline evaluation on a couple of prompts over ~100 requests and
observed consistent output across the requests.

## Test plan
1. Fixed the test cases and CI.
2. Tested locally on the prompt to ensure the output is consistent.

---------

Co-authored-by: Tom Ross <tom@umpox.com>
hitesh-1997 and umpox authored Jan 28, 2025
1 parent 6e8d30d commit b585001
Showing 8 changed files with 10 additions and 10 deletions.
4 changes: 2 additions & 2 deletions vscode/src/autoedits/adapters/cody-gateway.test.ts
@@ -72,7 +72,7 @@ describe('CodyGatewayAdapter', () => {
expect.objectContaining({
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
response_format: { type: 'text' },
prediction: {
type: 'content',
@@ -100,7 +100,7 @@ describe('CodyGatewayAdapter', () => {
expect.objectContaining({
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
response_format: { type: 'text' },
prediction: {
type: 'content',
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/cody-gateway.ts
@@ -46,7 +46,7 @@ export class CodyGatewayAdapter implements AutoeditsModelAdapter {
const body: FireworksCompatibleRequestParams = {
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
max_tokens: maxTokens,
response_format: {
type: 'text',
4 changes: 2 additions & 2 deletions vscode/src/autoedits/adapters/fireworks.test.ts
@@ -64,7 +64,7 @@ describe('FireworksAdapter', () => {
expect.objectContaining({
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
max_tokens: expect.any(Number),
response_format: { type: 'text' },
prediction: {
@@ -92,7 +92,7 @@ describe('FireworksAdapter', () => {
expect.objectContaining({
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
max_tokens: expect.any(Number),
response_format: { type: 'text' },
prediction: {
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/fireworks.ts
@@ -40,7 +40,7 @@ export class FireworksAdapter implements AutoeditsModelAdapter {
const body: FireworksCompatibleRequestParams = {
stream: false,
model: options.model,
-temperature: 0,
+temperature: 0.1,
max_tokens: maxTokens,
response_format: {
type: 'text',
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/sourcegraph-chat.test.ts
@@ -64,7 +64,7 @@ describe('SourcegraphChatAdapter', () => {
expect(chatOptions).toMatchObject({
model: 'anthropic/claude-2',
maxTokensToSample: getMaxOutputTokensForAutoedits(options.codeToRewrite),
-temperature: 0,
+temperature: 0.1,
prediction: {
type: 'content',
content: 'const x = 1',
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/sourcegraph-chat.ts
@@ -18,7 +18,7 @@ export class SourcegraphChatAdapter implements AutoeditsModelAdapter {
{
model: option.model,
maxTokensToSample: maxTokens,
-temperature: 0,
+temperature: 0.1,
prediction: {
type: 'content',
content: option.codeToRewrite,
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/sourcegraph-completions.test.ts
@@ -57,7 +57,7 @@ describe('SourcegraphCompletionsAdapter', () => {
expect(params).toMatchObject({
model: 'anthropic/claude-2',
maxTokensToSample: getMaxOutputTokensForAutoedits(options.codeToRewrite),
-temperature: 0,
+temperature: 0.1,
messages: [{ speaker: 'human', text: ps`user message` }],
prediction: {
type: 'content',
2 changes: 1 addition & 1 deletion vscode/src/autoedits/adapters/sourcegraph-completions.ts
@@ -28,7 +28,7 @@ export class SourcegraphCompletionsAdapter implements AutoeditsModelAdapter {
model: option.model as ModelRefStr,
messages,
maxTokensToSample: maxTokens,
-temperature: 0,
+temperature: 0.1,
prediction: {
type: 'content',
content: option.codeToRewrite,
