feat!: Implement include parameter specifically for adding logprobs in the output message #4261
Conversation
Note to the reviewers: the current behavior when passing include=["message.output_text.logprobs"] along with tools is as follows:
The behavior gets further complicated when passing include=["message.output_text.logprobs"] with multiple tools:
I wanted to get other opinions on how logprobs should be handled with built-in and MCP tools. My preference is to keep it consistent, i.e., either return logprobs for both or return them for neither. Appreciate any feedback! Test script: https://github.com/s-akhtar-baig/llama-stack-examples/blob/main/responses/src/include.py
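The consistency concern above can be sketched as a small helper that flags output messages missing logprobs. This is a hypothetical check, assuming a Responses-style payload where output items of type "message" carry "output_text" content parts; none of it is code from this PR:

```python
def messages_missing_logprobs(response: dict) -> list[int]:
    """Return indices of output message items whose output_text
    content lacks logprobs (hypothetical response shape)."""
    missing = []
    for i, item in enumerate(response.get("output", [])):
        if item.get("type") != "message":
            continue  # skip tool calls and other non-message items
        for part in item.get("content", []):
            if part.get("type") == "output_text" and not part.get("logprobs"):
                missing.append(i)
    return missing

# Example: the first message carries logprobs, the one after the tool call does not.
mock_response = {
    "output": [
        {"type": "message", "content": [
            {"type": "output_text", "text": "hi",
             "logprobs": [{"token": "hi", "logprob": -0.1}]}]},
        {"type": "web_search_call"},
        {"type": "message", "content": [
            {"type": "output_text", "text": "done", "logprobs": []}]},
    ]
}
print(messages_missing_logprobs(mock_response))  # → [2]
```

A consistent implementation would make this helper return either an empty list or every message index, regardless of which tools were invoked.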
This looks good to me. Doing this consistently makes sense!
This pull request has merge conflicts that must be resolved before it can be merged. @s-akhtar-baig please rebase it. https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork
Thanks, @ashwinb! It took a while for me to get the recordings in, sorry about that, but now the pipeline is green. Please let me know if the changes look good. Thanks!
@s-akhtar-baig looks like the docker variant is still red somehow :(
@ashwinb, it was failing while acquiring a lock; hopefully it passes this time.
@ashwinb, the check failed again on:
Not sure why it keeps failing.
Something in the PR is triggering a deadlock in the conversations test case.
Problem
As an Application Developer, I want to use the include parameter with the value message.output_text.logprobs, so that I can receive log probabilities for output tokens to assess the model's confidence in its response.
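A minimal usage sketch of the requested parameter. The model id is a placeholder and the validation helper is illustrative, assuming that "message.output_text.logprobs" is the only include value this change supports; this is not code from the PR:

```python
# Hypothetical: build a Responses API request that asks for
# output-token logprobs via the include parameter.
SUPPORTED_INCLUDE_VALUES = {"message.output_text.logprobs"}  # assumption

def build_request(model: str, prompt: str, include: list[str]) -> dict:
    """Assemble request kwargs, rejecting unknown include values."""
    unsupported = [v for v in include if v not in SUPPORTED_INCLUDE_VALUES]
    if unsupported:
        raise ValueError(f"unsupported include values: {unsupported}")
    return {"model": model, "input": prompt, "include": include}

req = build_request(
    model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model id
    prompt="What is the capital of France?",
    include=["message.output_text.logprobs"],
)
print(req["include"])  # → ['message.output_text.logprobs']
```

With a running server, the same kwargs would be passed to an OpenAI-compatible `client.responses.create(**req)` call, and the logprobs would come back on the output message's text content.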
What does this PR do?
Closes #4260
Test Plan