
Feature/OpenAI providers #324

Merged
dwash96 merged 14 commits into cecli-dev:v0.91.3 from chrisnestrud:feature/openai-providers
Dec 30, 2025

Conversation

@chrisnestrud

This PR adds support for the OpenAI-like providers known to LiteLLM, so their models are included in lists, autocomplete, etc. It also adds scripts/generate_openai_providers.py to create a JSON file based on LiteLLM data. For models, token limits and cost data are retrieved from the /models endpoint if available, cached similarly to OpenRouter, and made available to cecli.
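A minimal sketch of what a script like scripts/generate_openai_providers.py could do: group LiteLLM's model-cost metadata by provider and emit a JSON registry. This is a hypothetical illustration, not the actual script; `SAMPLE_MODEL_COST` below only mimics the shape of entries in LiteLLM's model-cost data, where the real script would read the full dataset.

```python
import json

# SAMPLE_MODEL_COST mimics the shape of LiteLLM model-cost entries
# (keys like "litellm_provider" and "input_cost_per_token").
SAMPLE_MODEL_COST = {
    "gpt-4o": {"litellm_provider": "openai", "input_cost_per_token": 2.5e-06},
    "deepseek-chat": {"litellm_provider": "deepseek", "input_cost_per_token": 2.7e-07},
}

def group_by_provider(model_cost):
    """Map provider name -> list of model ids found in the metadata."""
    providers = {}
    for model, meta in model_cost.items():
        provider = meta.get("litellm_provider")
        if provider:
            providers.setdefault(provider, []).append(model)
    return providers

print(json.dumps(group_by_provider(SAMPLE_MODEL_COST), indent=2))
```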

@dwash96
Collaborator

dwash96 commented Dec 28, 2025

So this seems to be a very similar intention as:
#179

where I commented:

What I would recommend for this kind of PR is to essentially be able to do this automatically for openrouter models based on what is returned in the openrouter API and the format present in the model-metadata.json file. Basically, instead of storing every possible model in an incompatible format like this PR does currently, use the openrouter API to actually configure models that are not present in the metadata

in:

#179 (comment)

I feel like with this level of complexity, you might as well collapse both openrouter.py and your openai_providers.py file into a single helpers/model_providers.py, and OpenAIProviderManager just becomes ModelProviderManager, so it's easier to see what's going on with all of this. Instead of dumping a large list of models into memory, it makes more sense to me to look up models that aren't specified, resolve the provider, and configure them in MODEL_SETTINGS appropriately in models.py. I generally want to get away from large static lists of models, with the litellm provider file as the base (itself shrunk to a reasonable maybe top 100) and a dynamic lookup system on top of stuff like this for anything outside of that list. That way people can still specify whatever model and whatever provider, but we aren't on the hook for knowing about every model in existence, just the model providers.
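The dynamic-lookup idea described above could be sketched roughly as follows. This is a hypothetical illustration, not the cecli implementation: the function name `lookup_models`, the module-level cache, and the injected `fetch` callable (which returns the parsed JSON body of GET {base_url}/models) are all assumptions made for the sake of the example.

```python
# Per-base-URL cache so each provider's /models endpoint is hit once.
_MODELS_CACHE = {}

def lookup_models(base_url, fetch):
    """Return {model_id: metadata} for an OpenAI-compatible provider.

    `fetch` is a callable that performs the HTTP GET and returns the
    parsed JSON payload; injecting it keeps this logic testable.
    """
    if base_url not in _MODELS_CACHE:
        payload = fetch(base_url.rstrip("/") + "/models")
        _MODELS_CACHE[base_url] = {
            m["id"]: m for m in payload.get("data", []) if "id" in m
        }
    return _MODELS_CACHE[base_url]
```

Resolving unknown models on demand this way avoids shipping a static list, while the cache keeps repeated completions from re-querying the provider.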

Introduce OpenAIProviderManager plus JSON-backed metadata to hydrate /models payloads for OpenAI-like providers such as Synthetic. Hook ModelInfoManager, Model, and CLI completions/listings into that registry, expose configuration data in aider/resources/openai_providers.json, and ensure LiteLLM is initialized with the custom handler so cecli can call these endpoints reliably.
Some OpenAI-compatible providers emit costs like '$0.00000055', which our float parser treated as invalid and left the UI without per-token pricing (e.g., synthetic MiniMax-M2). Strip currency symbols/commas before parsing and add a regression test that proves static model caches with dollar-prefixed pricing still populate ModelInfoManager.
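The cost-parsing fix described in that commit could look something like this. A minimal sketch only, the function name `parse_cost` and exact cleanup rules are assumptions, not the actual cecli code:

```python
import re

def parse_cost(value):
    """Parse a per-token cost that may be a float, or a string carrying
    currency symbols and thousands separators (e.g. "$0.00000055")."""
    if value is None:
        return None
    if isinstance(value, (int, float)):
        return float(value)
    # Keep digits, decimal point, sign, and exponent markers; drop the rest.
    cleaned = re.sub(r"[^\d.eE+-]", "", str(value))
    try:
        return float(cleaned)
    except ValueError:
        return None  # still unparseable after cleanup
```

With this in place, a dollar-prefixed string survives the round trip (`parse_cost("$0.00000055")` yields 5.5e-07) instead of being discarded as invalid.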
Synthetic/other OpenAI-like providers returned both reasoning_content and content, but our consolidation skipped storing the final content whenever reasoning existed, so cecli printed only the THINKING section. Always capture the message.content (including list-style OpenAI blocks) and add a regression test that feeds a recorded MiniMax completion via a heredoc JSON snippet to assert both THINKING and ANSWER text render.
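The consolidation bug described above, dropping the final answer whenever reasoning was present, can be illustrated with a small sketch. The helper names `extract_text` and `consolidate` are hypothetical; the point is that both fields are captured, and that content may arrive either as a string or as a list of OpenAI-style text blocks:

```python
def extract_text(content):
    """Flatten OpenAI-style message content: either a plain string or a
    list of {"type": "text", "text": ...} blocks."""
    if content is None:
        return ""
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, dict) and block.get("type") == "text":
            parts.append(block.get("text", ""))
    return "".join(parts)

def consolidate(message):
    """Return (reasoning, answer). The buggy version returned early when
    reasoning_content was set, so the answer text was never stored."""
    reasoning = message.get("reasoning_content") or ""
    answer = extract_text(message.get("content"))
    return reasoning, answer
```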
Reinstalled the dev toolchain, ran the documented pre-commit hooks, and applied the resulting isort/black fixes across provider modules plus removed an unused import/variable in tests so the branch now passes the project’s formatting gate.
Co-authored-by: aider-ce (synthetic/hf:deepseek-ai/DeepSeek-V3.2)
@chrisnestrud chrisnestrud force-pushed the feature/openai-providers branch from b0817e8 to e365eaa Compare December 28, 2025 19:43
@chrisnestrud
Author

Working for me in testing, please review when you have the time.

@dwash96
Collaborator

dwash96 commented Dec 29, 2025

Very nice, I'll make a v0.91.3 and make sure this is all included in that

@dwash96 dwash96 changed the base branch from main to v0.91.3 December 30, 2025 05:37
@dwash96 dwash96 merged commit 0c43224 into cecli-dev:v0.91.3 Dec 30, 2025
8 checks passed
@dwash96 dwash96 mentioned this pull request Dec 30, 2025
@chrisnestrud chrisnestrud deleted the feature/openai-providers branch January 5, 2026 12:52
