Conversation

@deanlorenz (Collaborator)

When benchmarking a vLLM directly, the first containerPort might not be the vLLM API port. In the default standup the first port is the NIXL port. Instead, this fix tries to find the correct port from the probe configuration; if that fails, it falls back to the metrics port, which uses the same base URL.
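As a rough illustration of the lookup described above (not the actual PR diff), a minimal sketch using the Kubernetes Python client is shown below. The function name `resolve_api_port` and the `"metrics"` port name are assumptions for illustration only.

```python
# Sketch: pick the vLLM API port from a container spec by preferring the
# probe's target port over the first containerPort (often the NIXL port),
# and falling back to the metrics port, which shares the same base URL.
from typing import Optional

from kubernetes import client


def resolve_api_port(container: client.V1Container) -> Optional[int]:
    ports = container.ports or []

    # The readiness/liveness probe normally targets the serving endpoint,
    # so its port is a better guess than the first declared containerPort.
    for probe in (container.readiness_probe, container.liveness_probe):
        if probe and probe.http_get and probe.http_get.port is not None:
            target = probe.http_get.port
            if isinstance(target, int):
                return target
            # Named port: resolve it against the container's declared ports.
            for p in ports:
                if p.name == target:
                    return p.container_port

    # Fallback: assume a port named "metrics" serves on the same base URL.
    for p in ports:
        if p.name == "metrics":
            return p.container_port
    return None
```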

@maugustosilva maugustosilva self-requested a review December 2, 2025 00:10
@maugustosilva maugustosilva merged commit 1abeb83 into llm-d:main Dec 2, 2025
8 checks passed