@diego-s commented Nov 18, 2025

This update makes the code compatible with OpenRouter, or any other case where the OpenAI API endpoint URL needs to be changed, in a simple way:

  • A configurable `openai_api_base` parameter is added. The default is still the original OpenAI API URL (`https://api.openai.com/v1`), but it can be changed to `https://openrouter.ai/api/v1` for OpenRouter, for example.
  • Configurable `research_model_provider`, `summary_model_provider`, `compression_model_provider`, and `final_report_model_provider` parameters, all defaulting to `openai`. These don't need to be changed for OpenRouter; they are needed because `init_chat_model` otherwise can't infer the provider for models that are not from OpenAI.
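A minimal sketch of what such a configuration could look like. The field names `openai_api_base` and the four `*_model_provider` parameters come from this PR; the dataclass structure and defaults shown here are illustrative, not the actual project code:

```python
from dataclasses import dataclass


@dataclass
class Configuration:
    # OpenAI-compatible API base URL; point it at OpenRouter
    # (https://openrouter.ai/api/v1) to route requests there instead.
    openai_api_base: str = "https://api.openai.com/v1"
    # Explicit providers so init_chat_model doesn't have to infer the
    # provider from the model name (inference fails for non-OpenAI models).
    research_model_provider: str = "openai"
    summary_model_provider: str = "openai"
    compression_model_provider: str = "openai"
    final_report_model_provider: str = "openai"


# Defaults keep the original OpenAI behavior:
default = Configuration()

# Switching to OpenRouter only requires overriding the base URL
# (and, optionally, the model names):
openrouter = Configuration(openai_api_base="https://openrouter.ai/api/v1")
```

The providers stay `openai` even when using OpenRouter, since OpenRouter exposes an OpenAI-compatible API.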

When using OpenRouter, the models can also be changed to any model available on OpenRouter (e.g. meta-llama/llama-4-maverick). Be aware that model names may differ slightly between OpenAI and OpenRouter: e.g. openai:gpt-4.1-mini becomes openai/gpt-4.1-mini.

All of these parameters can now also be changed via the LangSmith UI.
