Introduces a new chain type that chains a prompts.MessageFormatter with an llms.Model. In contrast to the LLMChain, it calls Model.GenerateContent(...), passing separate messages instead of a single rendered message.
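
For illustration, usage could look roughly like the sketch below. chains.NewChatChain is a placeholder name and signature, not the final API of this PR:

// Illustrative only: chains.NewChatChain is a placeholder for the new
// chain type; the final constructor and signature may differ.
tmpl := prompts.NewChatPromptTemplate([]prompts.MessageFormatter{
	prompts.NewSystemMessagePromptTemplate("You are a helpful assistant.", nil),
	prompts.NewHumanMessagePromptTemplate("Translate {{.text}} to {{.language}}.", []string{"text", "language"}),
})
chain := chains.NewChatChain(model, tmpl) // model is any llms.Model
out, err := chains.Call(ctx, chain, map[string]any{"text": "Hello", "language": "German"})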

Remarks

Draft PR, because I'm not sure whether this is the direction we should take. Tests and examples are missing and will be added once the direction is clear.

General objective of this change

My main goal is a reusable way to combine a prompt template (prompts.MessageFormatter) with an llms.Model. I think the combination of prompt x model x model config is quite common, but there is currently no way to express it as a single reusable unit.

There is already a similar chain (chains.LLMChain), but it uses llms.GenerateFromSinglePrompt together with prompts.FormatPrompter.FormatPrompt(values map[string]any).String(), which does not create separate messages with the correct message types but instead renders all messages into a single message. That is not really compatible with most chat APIs. A sketch of the difference follows below.
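
To illustrate (the exact rendered string depends on the PromptValue implementation):

// What chains.LLMChain effectively does: render everything into one string.
pv, _ := tmpl.FormatPrompt(data)
flat := pv.String() // e.g. "system: You are ...\nHuman: Translate ..." as one message
// What a chat API wants instead: separate, typed messages.
msgs, _ := tmpl.FormatMessages(data) // []llms.ChatMessage with roles preserved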

Current workaround

prompts.FormatMessages(values map[string]any) returns []llms.ChatMessage (an interface slice), which is not directly compatible with llms.Model.GenerateContent(ctx context.Context, messages []MessageContent, options ...CallOption) (*ContentResponse, error). A simple conversion loop is required, e.g.:

msgs, err := tmpl.FormatMessages(data)
if err != nil {
	return nil, fmt.Errorf("failed to format messages: %w", err)
}
// Convert each llms.ChatMessage into an llms.MessageContent,
// preserving the message type (role) and text content.
contents := make([]llms.MessageContent, len(msgs))
for i, msg := range msgs {
	contents[i] = llms.TextParts(msg.GetType(), msg.GetContent())
}
// use contents in llms.Model.GenerateContent

State of the chains package

The chains package seems to lag a bit behind the llms package. chains.Call accepts chains.ChainCallOption values and uses a private mapping function, getLLMCallOptions, to convert them into llms.CallOption values. This function currently doesn't cover all options (e.g. llms.CallOptions.JSONMode has no counterpart). To me, this feels like a design flaw, because every change in the llms package requires a matching change in the chains package to keep it usable; one possible fix is sketched below.
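
A purely illustrative sketch of one possible fix (WithLLMOptions is an invented name, and the real option plumbing in the chains package differs): forward llms.CallOption values untouched, so new llms options never need chains-side mapping code:

// Illustrative design sketch, not the current chains API: a chain option
// that carries llms.CallOption values through verbatim.
type chainCallOptions struct {
	llmOptions []llms.CallOption // handed to Model.GenerateContent unchanged
}

type ChainCallOption func(*chainCallOptions)

func WithLLMOptions(opts ...llms.CallOption) ChainCallOption {
	return func(o *chainCallOptions) {
		o.llmOptions = append(o.llmOptions, opts...)
	}
}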

Alternative A

It might also be helpful to add a method to the concrete types in the prompts package that returns []llms.MessageContent, which can be passed directly to llms.GenerateContent(...). This would avoid the conversion step.
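
A sketch of what such a method could look like on e.g. ChatPromptTemplate (FormatMessageContents is an invented name):

// Invented method name; a thin wrapper around the existing FormatMessages.
func (t ChatPromptTemplate) FormatMessageContents(values map[string]any) ([]llms.MessageContent, error) {
	msgs, err := t.FormatMessages(values)
	if err != nil {
		return nil, err
	}
	contents := make([]llms.MessageContent, len(msgs))
	for i, msg := range msgs {
		contents[i] = llms.TextParts(msg.GetType(), msg.GetContent())
	}
	return contents, nil
}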

Alternative B

A more disruptive change would be to have llms.GenerateContent(...) accept an interface type instead of a struct slice, similar to how http.NewRequest accepts an io.Reader as its body instead of []byte. This would probably be the most versatile approach, but it would break a lot of existing code. We could introduce a helper function that converts []llms.MessageContent into the new interface type to make the breaking change easy to fix.
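
A sketch of what the interface and the migration helper could look like (MessageContentReader and MessagesFromContent are invented names):

// Invented names; illustrative only.
type MessageContentReader interface {
	MessageContents() ([]llms.MessageContent, error)
}

// messageContents adapts a plain slice to the interface, so existing
// call sites can migrate by wrapping their []llms.MessageContent once.
type messageContents []llms.MessageContent

func (m messageContents) MessageContents() ([]llms.MessageContent, error) {
	return m, nil
}

func MessagesFromContent(contents []llms.MessageContent) MessageContentReader {
	return messageContents(contents)
}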
