# feat: new chains.ChatChain that uses `Model.GenerateContent(...)` #1315
Introduces a new chain type that allows chaining a `prompts.MessageFormatter` with an `llms.Model`. In contrast to the `LLMChain`, it uses `Model.GenerateContent(...)`, passing separate messages instead of a single message.
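For illustration, usage could look roughly like this. The constructor `chains.NewChatChain` and the overall call shape are assumptions for this draft, not a settled API; the `prompts` constructors and `chains.Call` are the existing ones:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/prompts"
)

func main() {
	model, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	formatter := prompts.NewChatPromptTemplate([]prompts.MessageFormatter{
		prompts.NewSystemMessagePromptTemplate(
			"You translate {{.inputLang}} to {{.outputLang}}.",
			[]string{"inputLang", "outputLang"},
		),
		prompts.NewHumanMessagePromptTemplate("{{.text}}", []string{"text"}),
	})

	// NewChatChain is the assumed constructor for the proposed chain.
	chain := chains.NewChatChain(model, formatter)

	out, err := chains.Call(context.Background(), chain, map[string]any{
		"inputLang":  "English",
		"outputLang": "German",
		"text":       "I love programming.",
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```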
## Remarks

Draft PR, because I'm not sure if this is the direction we should go. Tests and examples are missing and will be added once the direction is clear.
## General objective of this change
My main goal is to have a way to combine a prompt template (`prompts.MessageFormatter`) with an `llms.Model`, so that I can easily re-use the combination. I think the combination of *prompt × model × model config* is quite common, but there is currently no way to achieve this.

There is already a similar chain (`chains.LLMChain`), but it uses `llms.GenerateFromSinglePrompt` and `prompts.FormatPrompter.FormatPrompt(values map[string]any).String()`, which does not create separate messages with the correct message type but renders all messages into a single message. That's not really compatible with most chat APIs.

## Current workaround
`prompts.FormatMessages(values map[string]any)` returns `[]llms.ChatMessage` (an interface type), which is not directly compatible with `llms.Model.GenerateContent(ctx context.Context, messages []MessageContent, options ...CallOption) (*ContentResponse, error)`. A simple for-loop is required, e.g.:
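A minimal sketch of that loop, assuming text-only messages (the wrapper function `generate` is just for illustration):

```go
package main

import (
	"context"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/prompts"
)

// generate sketches the workaround: each llms.ChatMessage from the
// formatter is rebuilt as an llms.MessageContent via llms.TextParts
// before calling Model.GenerateContent.
func generate(ctx context.Context, model llms.Model, formatter prompts.MessageFormatter, values map[string]any) (*llms.ContentResponse, error) {
	msgs, err := formatter.FormatMessages(values)
	if err != nil {
		return nil, err
	}
	content := make([]llms.MessageContent, 0, len(msgs))
	for _, m := range msgs {
		content = append(content, llms.TextParts(m.GetType(), m.GetContent()))
	}
	return model.GenerateContent(ctx, content)
}
```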
## State of the `chains` package

The `chains` package seems to be a bit behind the `llms` package. `chains.Call` accepts `chains.ChainCallOption` options and has a private mapping function, `getLLMCallOptions`, that converts them into `llms.CallOption`. This function currently doesn't support all options, e.g. `llms.CallOptions.JSONMode`. To me, this feels like a design flaw, because changes in the `llms` package require changes in the `chains` package to keep it usable.
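To make the coupling concrete, here is a simplified illustration of that mapping pattern; this is not the actual `getLLMCallOptions` implementation, and the option struct is an abridged stand-in:

```go
package chains

import "github.com/tmc/langchaingo/llms"

// chainCallOptions stands in for the package's private option struct;
// the fields shown are illustrative, not the real set.
type chainCallOptions struct {
	Temperature *float64
}

// Every llms.CallOption needs an explicit branch in a function like
// this one, so options added to llms (JSONMode, for example) remain
// unavailable through chains.Call until a branch is added.
func getLLMCallOptions(o chainCallOptions) []llms.CallOption {
	var opts []llms.CallOption
	if o.Temperature != nil {
		opts = append(opts, llms.WithTemperature(*o.Temperature))
	}
	// No branch maps a JSON-mode flag to llms.WithJSONMode(), so the
	// option can never reach the model from here.
	return opts
}
```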
## Alternative A

It might also be helpful to add another function to the concrete types in the `prompts` package that returns `[]llms.MessageContent`, which can be passed to `llms.GenerateContent(...)` directly. This would avoid the conversion step.
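For instance, a hypothetical method on `prompts.ChatPromptTemplate` could look like this (the name `FormatMessageContents` is an assumption, not an existing API):

```go
package prompts

import "github.com/tmc/langchaingo/llms"

// FormatMessageContents is a hypothetical addition (name and signature
// are assumptions): it formats straight into the slice type that
// llms.Model.GenerateContent expects, so callers skip the manual loop.
func (p ChatPromptTemplate) FormatMessageContents(values map[string]any) ([]llms.MessageContent, error) {
	msgs, err := p.FormatMessages(values)
	if err != nil {
		return nil, err
	}
	out := make([]llms.MessageContent, 0, len(msgs))
	for _, m := range msgs {
		out = append(out, llms.TextParts(m.GetType(), m.GetContent()))
	}
	return out, nil
}
```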
## Alternative B

A more disruptive change would be to have `llms.GenerateContent(...)` accept an interface type instead of a struct type, similar to how `http.NewRequest` accepts an `io.Reader` as body instead of `[]byte` (docs). This would probably be the most versatile approach, but it would break a lot of existing code. We could introduce a helper function that converts `[]llms.MessageContent` into the new interface type to make the breaking change easy to fix.
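A rough sketch of that shape; all names here (`ContentProvider`, `Messages`) are assumptions:

```go
package llms

import "context"

// ContentProvider is a hypothetical interface that GenerateContent
// could accept in place of the concrete []MessageContent, roughly:
//
//	GenerateContent(ctx context.Context, messages ContentProvider, options ...CallOption) (*ContentResponse, error)
type ContentProvider interface {
	MessageContents(ctx context.Context) ([]MessageContent, error)
}

// Messages is the helper mentioned above: it adapts an existing
// []MessageContent to the new interface, so fixing the breaking change
// at a call site is a one-word wrap, e.g.
// model.GenerateContent(ctx, llms.Messages(content)).
type Messages []MessageContent

func (m Messages) MessageContents(context.Context) ([]MessageContent, error) {
	return m, nil
}
```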