[ai] Fix collectUiMessages option by accumulating chunks in a separate step call #784
Conversation
Signed-off-by: Peter Wielander <[email protected]>
🦋 Changeset detected. Latest commit: 6d2df7d. The changes in this PR will be included in the next version bump. This PR includes changesets to release 2 packages.
📊 Benchmark Results

Benchmarks cover: workflow with no steps, workflow with 1 step, workflow with 10 sequential steps, Promise.all with 10 and 25 concurrent steps, Promise.race with 10 and 25 concurrent steps, and stream benchmarks (including TTFB metrics). Each scenario was measured in 💻 Local Development and ▲ Production (Vercel), with 🔍 observability links for Express, Nitro, and Next.js (Turbopack). Summary tables (fastest framework by world and fastest world by framework, winner determined by most benchmark wins) omitted.
🧪 E2E Test Results: ❌ Some tests failed

❌ Failed tests in 🌍 Community Worlds (21 failed):
- mongodb (1 failed)
- redis (1 failed)
- starter (18 failed)
- turso (1 failed)

Details by category:
- ✅ ▲ Vercel Production
- ✅ 💻 Local Development
- ✅ 📦 Local Production
- ✅ 🐘 Local Postgres
- ✅ 🪟 Windows
- ❌ 🌍 Community Worlds
readUIMessageStream
When a step function is invoked, only capture `this` if it has a `classId` property (indicating it's a class constructor registered for serialization by the SWC plugin). This prevents serialization errors when step functions are called as methods on arbitrary objects (e.g., `tool.execute()`) where the object has non-serializable properties like Zod schemas.
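The guard described above can be sketched as follows. This is a hypothetical illustration, not the plugin's actual code: the function name and the exact location of the `classId` marker are assumptions based on the description.

```typescript
// Hypothetical sketch of the `this`-capture guard described above.
// Assumption: the SWC plugin attaches a `classId` property to objects
// registered for serialization.
function captureThis(thisArg: unknown): unknown | undefined {
  if (thisArg !== null && typeof thisArg === "object" && "classId" in thisArg) {
    // Registered class instance: safe to serialize across the step boundary.
    return thisArg;
  }
  // Arbitrary object (e.g. a tool holding Zod schemas): skip capture
  // instead of failing serialization.
  return undefined;
}
```

With this guard, calling a step function as `tool.execute()` simply skips `this` capture rather than throwing during serialization.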
…ent-collectUiMessages
I, Peter Wielander <[email protected]>, hereby add my Signed-off-by to this commit: 0e8fe8c Signed-off-by: Peter Wielander <[email protected]>
```ts
type: 'start',
// Note that if useChat is used client-side, useChat will generate a different
// messageId. It's hard to work around this.
messageId: generateId(),
```
Client-side would do sendStart: false in that case, no?
The user can decide to do that, but they would then have to handle the start themselves, which I think is fine. The ID seems unlikely to cause issues unless message persistence is mixed between client and server side.
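As a conceptual sketch (not the AI SDK API), suppressing the start part amounts to filtering it out of the chunk stream, so the client-side `useChat` never sees a server-generated message ID and keeps its own. The chunk shape here is a simplified stand-in:

```typescript
// Simplified stand-in for UI message stream chunks (not the real AI SDK type).
type UIChunk = { type: string; messageId?: string };

// Filter with the same effect as not sending the start part: downstream
// consumers never see the server-generated messageId.
function dropStartChunks(): TransformStream<UIChunk, UIChunk> {
  return new TransformStream<UIChunk, UIChunk>({
    transform(chunk, controller) {
      if (chunk.type !== "start") controller.enqueue(chunk);
    },
  });
}
```

The trade-off noted above applies: the client must then produce the start itself.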
```diff
  }) {
    const shownCount = issues.length;
-   const title = `Top ${shownCount} issues (last ${digestIntervalDays}d) in ${repoFullName}`;
+   const title = `Top issues of last ${digestIntervalDays}d in ${repoFullName}`;
```
Unrelated to this PR?
Yes, it wasn't worth its own PR, but I can split it out if you want.
pranaygp left a comment:
lgtm. surprised that vade didn't leave any comments 🤔
This is a refactor of #768 which moves the accumulation logic to a step. The test harness I was using to test #768 didn't correctly account for the workflow transformation, so `collectUiMessages` didn't actually work as intended before (also reported in issue #739).

This PR moves the final accumulation call to a `"use step"` at the end of the `agent.stream` call. The actual logic still uses AI SDK's `readUIMessageStream` internally, just like before.

I alternatively considered pre-creating a stream in a setup step, then double-piping to that stream before passing it to the final step that calls `readUIMessageStream` on it. In a non-workflow world, this would result in a performance improvement (native stream use), but since we are awaiting step boundaries anyway, it would not help in this case, and it would make the content less observable.

I tested this in `flight-booking-app`. See the final step it makes here that gets the chunks and returns the `uiMessages`:


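The shape of that final step can be sketched as a function that drains the chunk stream inside a single step and returns the accumulated result. This is a generic illustration only: the real implementation builds `uiMessages` by feeding the chunks through AI SDK's `readUIMessageStream`, and the exact placement of the directive is assumed.

```typescript
// Sketch of the final-step shape: drain the stream inside one step so the
// accumulation happens at a single step boundary. The real code feeds the
// chunks through readUIMessageStream to build uiMessages.
async function collectChunks<T>(stream: ReadableStream<T>): Promise<T[]> {
  "use step"; // workflow directive (assumed placement, for illustration)
  const chunks: T[] = [];
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}
```

Because the whole drain happens inside the step, the accumulated result survives the workflow transformation instead of being lost at a step boundary.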