feat(auto-edit): fix the temperature value regression in auto-edit #26133
Annotations
3 errors
Run pnpm run test:unit --run:
vscode/src/autoedits/adapters/cody-gateway.test.ts#L99
AssertionError: expected { stream: false, …(8) } to deeply equal ObjectContaining{…}
- Expected
+ Received
- ObjectContaining {
+ Object {
+ "max_tokens": 515,
"model": "anthropic/claude-2",
"prediction": Object {
"content": "const x = 1",
"type": "content",
},
"prompt": "user message",
"response_format": Object {
"type": "text",
},
"rewrite_speculation": true,
"stream": false,
- "temperature": 0,
+ "temperature": 0.1,
"user": "test-user",
}
❯ src/autoedits/adapters/cody-gateway.test.ts:99:29
Run pnpm run test:unit --run:
vscode/src/autoedits/adapters/fireworks.test.ts#L91
AssertionError: expected { stream: false, …(7) } to deeply equal ObjectContaining{…}
- Expected
+ Received
- ObjectContaining {
- "max_tokens": Any<Number>,
+ Object {
+ "max_tokens": 515,
"model": "accounts/fireworks/models/llama-v2-7b",
"prediction": Object {
"content": "const x = 1",
"type": "content",
},
"prompt": "user message",
"response_format": Object {
"type": "text",
},
"stream": false,
- "temperature": 0,
+ "temperature": 0.1,
"user": "test-user",
}
❯ src/autoedits/adapters/fireworks.test.ts:91:29
Run pnpm run test:unit --run
Process completed with exit code 1.