Decompose ChatController.sendChat into handlers for different request types #6469
Merged
Conversation
This reverts commit 3d0ca34.
beyang force-pushed the bl/refactor-chat-controller branch from c3b44bd to fa86812 on December 27, 2024 04:18
beyang changed the title from "Refactor ChatController" to "Decompose ChatController.sendChat into handlers for different request types (search, chat, DeepCody)" on Dec 27, 2024
beyang changed the title from "Decompose ChatController.sendChat into handlers for different request types (search, chat, DeepCody)" to "Decompose ChatController.sendChat into handlers for different request types" on Dec 27, 2024
sqs approved these changes on Dec 27, 2024
abeatrix reviewed on Dec 28, 2024
this.toolProvider.getTools(),
corpusContext
)
agent.setStatusCallback(model => this.postEmptyMessageInProgress(model))
This was added today but got deleted by accident
abeatrix added a commit that referenced this pull request on Dec 28, 2024
The setStatusCallback code was removed by #6469 by accident. This change adds it back so that the context agent (Deep Cody) can send its processing status back to the webview:
1. Added a `postMessageCallback` parameter to the `DeepCodyHandler` constructor to allow posting empty messages during the context retrieval process.
2. Updated `DeepCodyHandler` to set the status callback on the `DeepCodyAgent` instance, allowing empty messages to be posted during context retrieval.
3. Updated the `AgentTools` interface to include the `postMessageCallback` parameter, ensuring that it is passed to the `DeepCodyHandler` constructor.
4. Updated `ChatController` to pass the `postEmptyMessageInProgress` callback to the `DeepCodyHandler` constructor, enabling empty messages to be posted during context retrieval.
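A minimal sketch of that wiring, assuming simplified shapes for `DeepCodyAgent` and the callback (the real Cody constructor takes more parameters):

```typescript
// Hypothetical, simplified shapes; the real Cody types take more parameters.
interface DeepCodyAgent {
    setStatusCallback(callback: (model: string) => void): void
}

type PostMessageCallback = (model: string) => void

class DeepCodyHandler {
    constructor(
        private readonly agent: DeepCodyAgent,
        private readonly postMessageCallback: PostMessageCallback
    ) {
        // Re-attach the status callback so Deep Cody can report its
        // context-retrieval progress back to the chat webview.
        this.agent.setStatusCallback(model => this.postMessageCallback(model))
    }
}

// In ChatController (hypothetical wiring):
//   new DeepCodyHandler(agent, model => this.postEmptyMessageInProgress(model))
```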
beyang pushed a commit that referenced this pull request on Dec 30, 2024
The statusCallback method was removed by #6469. Add a `postStatuses` method to the `AgentHandlerDelegate` interface to pass back status updates to the chat view.
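A rough sketch of the delegate shape implied by that commit; the `ProcessingStep` type and the exact method signature are assumptions:

```typescript
// Hypothetical shapes for illustration; the real Cody types are not shown in this thread.
interface ProcessingStep {
    id: string
    title: string
    state: 'pending' | 'success' | 'error'
}

// The delegate handlers use to talk back to the chat view. Only the restored
// method is sketched here; the real interface has more members.
interface AgentHandlerDelegate {
    postStatuses(steps: ProcessingStep[]): void
}
```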
arafatkatze added a commit that referenced this pull request on Jan 28, 2025
…span closure of Chat Spans (#6807)

## Changes
1. Fixes a problem with `externalAuthRefreshChanges` not having an initial value, which was blocking the configuration.
2. Makes the caching checks more correct.
3. Makes sure the spans for chat are properly closed so that we can actually see them in Honeycomb.
4. Fixes different edge cases in the re-registration of OtelCollectors with observables.

### Span Closure Issues
<img width="1479" alt="image" src="https://github.com/user-attachments/assets/a995c3bb-909e-4aea-992f-88009caa20a8" />

This PR ensures that the spans for handling chat messages are properly ended, so that they are finished and exported. This way they can be seen in the Trace View (I have verified this behavior by testing locally too), so Honeycomb should now get proper chat traces.

NOTE: This behavior was working perfectly [in the original PR](#6294), but in [this massive refactor](#6469) the span closure got accidentally removed, so I am [adding those span closures again](2df697c) and adding a comment for future safety.

### OpenTelemetry Configuration Issues
This PR resolves configuration update issues in the OpenTelemetry integration by implementing a provider recycling pattern, ensuring stable tracing across dynamic configuration changes while preventing API registration conflicts.

In the past we would re-register `this.tracerProvider` on OpenTelemetry changes, but that led to a lot of OTel errors like:

```
"OTLPExporterError: Unprocessable Entity \tat IncomingMessage.<anonymous> (/Users/pkukielka/Work/sourcegraph/cody2/vscode/dist/extension.node.js:183479:31) \tat IncomingMessage.emit (node:events:530:35) \tat IncomingMessage.emit (node:domain:489:12) \tat endReadableNT (node:internal/streams/readable:1698:12) \tat process.processTicksAndRejections (node:internal/process/task_queues:82:21)"
```

or some other form of re-registration error. This PR therefore registers the TracerProvider only once, but changes its processors so that they can account for different conditions, such as a change in the telemetry-sending URL or tracing being enabled/disabled.

## Test Plan
NOTE: Every operation in the test plan, like turning false to true or changing the account, requires a wait time of at least 5 seconds because observables take time to propagate.

### Test 1: Tracing Flag Toggle Without Restart
**Steps:**
- Set `"cody.experimental.tracing": true` in the VS Code `settings.json` (no restart).
- Ask a chat question.
- Verify: `CodyTraceExporter::export` is executed (check debugger).
- Verify: No `otlp_export_error` in the debug console.
- Set `cody.experimental.tracing` to false (no restart).
- Ask a chat question.
- Verify: `CodyTraceExporter::export` is not executed and is blocked by the trace enablement check.
- Verify: No errors in the console.
- Set `cody.experimental.tracing` to true again (no restart).
- Ask a chat question.
- Verify: `CodyTraceExporter::export` executes and emits data (passes the trace enablement check).
- Verify: No `otlp_export_error`.

### Test 2: Account Switching with Tracing
**Steps:**

Account A setup (sourcegraph.com):
- Set `"cody.experimental.tracing": true` in the config.
- Ask a chat question.
- Verify: `CodyTraceExporter::export` executes.
- Verify: No errors.

Switch to Account B (sourcegraph.sourcegraph.com), no need to restart Cody:
- Ask a chat question.
- Verify: Tracing is enabled by default in the new account; `CodyTraceExporter::export` is called.
- Verify: No errors.
- Ask a chat question.
- Verify: `CodyTraceExporter::export` executes for Account B.
- See the export function for spans being hit.

### Test 3: Error Handling Validation
**Steps:**
- Across all test cases above, explicitly check the debug console for:
  - Absence of `otlp_export_error` or other tracing-related errors.
  - No unexpected authentication or network errors during account switches.
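As a hedged sketch of the "register once, swap processors" idea described in that commit (this is not the actual Cody code; it assumes a recent OpenTelemetry JS SDK where `NodeTracerProvider` accepts `spanProcessors` in its constructor, and the function names are illustrative):

```typescript
import type { Context } from '@opentelemetry/api'
import {
    BatchSpanProcessor,
    type ReadableSpan,
    type Span,
    type SpanProcessor,
} from '@opentelemetry/sdk-trace-base'
import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node'
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'

// A trivial no-op processor used while tracing is disabled.
const noopProcessor: SpanProcessor = {
    onStart: (_span: Span, _context: Context) => {},
    onEnd: (_span: ReadableSpan) => {},
    forceFlush: () => Promise.resolve(),
    shutdown: () => Promise.resolve(),
}

// A processor whose inner delegate can be swapped without re-registering the provider.
class SwappableSpanProcessor implements SpanProcessor {
    private delegate: SpanProcessor = noopProcessor

    async swap(next: SpanProcessor): Promise<void> {
        const previous = this.delegate
        this.delegate = next
        await previous.shutdown() // flush and release the old exporter
    }

    onStart(span: Span, context: Context): void {
        this.delegate.onStart(span, context)
    }
    onEnd(span: ReadableSpan): void {
        this.delegate.onEnd(span)
    }
    forceFlush(): Promise<void> {
        return this.delegate.forceFlush()
    }
    shutdown(): Promise<void> {
        return this.delegate.shutdown()
    }
}

const processor = new SwappableSpanProcessor()
// Register the provider exactly once; the global OpenTelemetry API never sees a second one.
new NodeTracerProvider({ spanProcessors: [processor] }).register()

// Hypothetical hook: call this whenever the tracing config changes
// (endpoint change, enable/disable, account switch).
async function reconfigureTracing(enabled: boolean, endpoint: string): Promise<void> {
    await processor.swap(
        enabled ? new BatchSpanProcessor(new OTLPTraceExporter({ url: endpoint })) : noopProcessor
    )
}
```

The design point is that the global tracer provider is never re-registered, so the OTLP export and re-registration errors above cannot recur; only the processor behind it is recycled.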
Factor out the main user message submission handler (`ChatController.sendChat`) into multiple handlers:
- `ChatHandler`
- `DeepCodyHandler`
- `SearchHandler`

Previously, `ChatController.sendChat` was a long method that implicitly encoded these different types of response handlers through various conditionals and difficult-to-reason-about state. Now, each request type gets its own handler class, which more clearly separates concerns. This will allow us to implement more complex logic for each request type without breaking encapsulation, and it also opens the door to adding new types of agentic handlers more easily in the future (a rough sketch of the resulting dispatch is shown below).

Also remove the `edit` and `insert` message intents, which seem to be defunct/unused.

There should be no behavior change. Telemetry behavior is preserved.
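As a rough illustration of the shape of the decomposition (the handler names come from this PR; the `AgentHandler` interface, request type, and intent values below are assumptions, not the actual Cody code):

```typescript
// Illustrative sketch only: everything except the handler names is assumed.
interface AgentRequest {
    requestID: string
    inputText: string
}

interface AgentHandler {
    handle(request: AgentRequest): Promise<void>
}

class ChatHandler implements AgentHandler {
    async handle(request: AgentRequest): Promise<void> {
        // Ordinary LLM chat turn.
    }
}

class SearchHandler implements AgentHandler {
    async handle(request: AgentRequest): Promise<void> {
        // Code search request.
    }
}

class DeepCodyHandler extends ChatHandler {
    override async handle(request: AgentRequest): Promise<void> {
        // Agentic context retrieval before the chat turn.
        await super.handle(request)
    }
}

// What ChatController.sendChat reduces to: pick a handler by intent and delegate,
// instead of one long method full of per-intent conditionals.
function getAgentHandler(intent: 'chat' | 'search' | 'deep-cody'): AgentHandler {
    switch (intent) {
        case 'search':
            return new SearchHandler()
        case 'deep-cody':
            return new DeepCodyHandler()
        default:
            return new ChatHandler()
    }
}
```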
Test plan
Check out the branch and test the following locally:
Issues found
- "Failed to edit prompt" when editing a message midway through response generation (pre-existing)

Security
The instances of `PromptString.unsafe_fromLLMResponse` were copied from the old `ChatController.sendChat` implementation.