botocore: add basic tracing for Bedrock InvokeModelWithStreamResponse #3206
Conversation
Can you ad-hoc test langchain? This is the more important PR as it is the default mode.
```python
from langchain_aws import ChatBedrock


def main():
    llm = ChatBedrock(
        model_id="amazon.titan-text-lite-v1",
        streaming=True,
    )
    response_stream = llm.stream("Write a short poem on OpenTelemetry.")
    for chunk in response_stream:
        print(chunk.content, end="")


if __name__ == "__main__":
    main()
```
Sorry about the "requested changes"; that was a clicking accident.
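For an ad-hoc run like the script above, one minimal way to make the new spans visible is to instrument botocore and export to the console before calling main(); the console exporter and wiring here are just an illustrative setup, not part of this PR:

```python
# Minimal tracing setup for an ad-hoc test (illustrative, not part of this PR).
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.instrumentation.botocore import BotocoreInstrumentor

# Export spans to stdout so the Bedrock streaming spans are visible immediately.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    SimpleSpanProcessor(ConsoleSpanExporter())
)

# Patch botocore so the ChatBedrock streaming calls above are traced.
BotocoreInstrumentor().instrument()
```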
Force-pushed from 3cff0c8 to 1575884
I did test it with both Claude and Titan.
Great. LangChain users will be happy, as will others!
Description
This adds to the botocore Bedrock extension the ability to trace InvokeModelWithStreamResponse calls for amazon.titan, amazon.nova, and anthropic.claude models: the Completion API is used for amazon.titan, and the Messages API for amazon.nova and anthropic.claude.
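For illustration, a minimal sketch of the kind of streaming call this covers, assuming the standard boto3 bedrock-runtime client; the prompt and the Titan request/response fields below are examples, not part of this change (amazon.nova and anthropic.claude use the Messages API body shape instead):

```python
import json

import boto3

client = boto3.client("bedrock-runtime")

# Streaming invocation; this PR lets the botocore instrumentation trace it.
response = client.invoke_model_with_response_stream(
    modelId="amazon.titan-text-lite-v1",
    body=json.dumps({"inputText": "Write a short poem on OpenTelemetry."}),
)

# The body is an event stream; each event carries a JSON chunk of generated text.
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    print(chunk.get("outputText", ""), end="")
```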
Type of change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
Does This PR Require a Core Repo Change?
Checklist:
See contributing.md for styleguide, changelog guidelines, and more.