[Help]: Model 'command-xlarge-beta' not recognized #2517
Replies: 4 comments 4 replies
- @GillesMoyse that looks like Cohere is raising that error. Are you able to make a direct curl request to
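One way to do that check from Python rather than curl: a minimal sketch using the standard library's urllib, assuming Cohere's v1 generate endpoint and a `COHERE_API_KEY` environment variable (the endpoint path, payload fields, and helper name are assumptions for illustration, not from this thread):

```python
import json
import os
import urllib.error
import urllib.request

COHERE_GENERATE_URL = "https://api.cohere.ai/v1/generate"  # assumed endpoint

def build_generate_request(model, prompt, api_key):
    """Assemble a direct request to Cohere's generate endpoint
    (hypothetical helper; field names are assumptions)."""
    payload = json.dumps({"model": model, "prompt": prompt,
                          "max_tokens": 20}).encode("utf-8")
    return urllib.request.Request(
        COHERE_GENERATE_URL,
        data=payload,
        headers={
            "Authorization": "Bearer " + api_key,
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__" and os.environ.get("COHERE_API_KEY"):
    req = build_generate_request("command-xlarge-beta", "Say hello",
                                 os.environ["COHERE_API_KEY"])
    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode())
    except urllib.error.HTTPError as err:
        # A body like {"message": "model ... not found"} here confirms the
        # error comes from Cohere itself, not from litellm.
        print(err.code, err.read().decode())
```

If the same "model not found" message comes back from this direct call, the problem is the model name on Cohere's side rather than anything litellm does.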
- Converting to a discussion, as this isn't caused by litellm.
- OK, found the answer: the model name to use is "command-xlarge-nightly".
- Adding "cohere" before the model name works when calling from LiteLLM (the previous answer worked with cURL). So the model name is
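A minimal sketch of that fix, assuming litellm is installed and `COHERE_API_KEY` is set (the `with_provider` helper is a hypothetical convenience, not part of litellm):

```python
import os

def with_provider(model: str, provider: str = "cohere") -> str:
    """Prefix a bare model name with its provider route, as litellm expects,
    e.g. "command-xlarge-nightly" -> "cohere/command-xlarge-nightly".
    (Hypothetical helper for illustration.)"""
    return model if model.startswith(provider + "/") else f"{provider}/{model}"

if __name__ == "__main__" and os.environ.get("COHERE_API_KEY"):
    from litellm import completion  # requires `pip install litellm`
    resp = completion(
        model=with_provider("command-xlarge-nightly"),
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(resp.choices[0].message.content)
```

The "provider/model" form tells litellm which backend to route the call to, so the bare Cohere model name is passed through unchanged.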

What happened?
I get an exception when calling Cohere's model 'command-xlarge-beta'. The exact same code works perfectly with 'command-nightly'.
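A minimal repro sketch of the two calls, assuming litellm is installed and `COHERE_API_KEY` is set (the `build_messages` helper and the English placeholder prompt are illustrations, not the exact code from the log):

```python
import os

FAILING_MODEL = "command-xlarge-beta"  # raises "model not found" per the log
WORKING_MODEL = "command-nightly"      # the same code succeeds with this name

def build_messages(system: str, user: str):
    """Chat-style messages list in the shape litellm.completion() accepts
    (hypothetical helper for illustration)."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

if __name__ == "__main__" and os.environ.get("COHERE_API_KEY"):
    from litellm import completion
    msgs = build_messages("Answer only with the letter.",
                          "Who invented the electric telegraph? A) ... B) ...")
    completion(model=WORKING_MODEL, messages=msgs)  # succeeds
    completion(model=FAILING_MODEL, messages=msgs)  # raises APIConnectionError
```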
Relevant log output
The following exception occurred with prompt meta={} user='Qui a inventé le télégraphe électrique ? A) Samuel Morse B) Charles Wheatstone C) William Fothergill Cooke' system="Répond uniquement avec la lettre. Ne donne qu'une seule réponse."

{"message":"model 'command-xlarge-beta' not found, make sure the correct model ID was used and that you have access to the model."}

Traceback (most recent call last):
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 1197, in completion
    model_response = cohere.completion(
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\cohere.py", line 173, in completion
    raise CohereError(message=response.text, status_code=response.status_code)
litellm.llms.cohere.CohereError: {"message":"model 'command-xlarge-beta' not found, make sure the correct model ID was used and that you have access to the model."}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2628, in wrapper
    result = original_function(*args, **kwargs)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 1941, in completion
    raise exception_type(
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8100, in exception_type
    raise e
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8068, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: {"message":"model 'command-xlarge-beta' not found, make sure the correct model ID was used and that you have access to the model."}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "{$home}\source\repos\ragtime-package\src\ragtime\generators.py", line 398, in complete
    ans:dict = completion(messages=messages, model=self.name,
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2698, in wrapper
    return litellm.completion_with_retries(*args, **kwargs)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 1973, in completion_with_retries
    return retryer(original_function, *args, **kwargs)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\tenacity\__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\tenacity\__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2727, in wrapper
    raise e
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2628, in wrapper
    result = original_function(*args, **kwargs)
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 1941, in completion
    raise exception_type(
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8100, in exception_type
    raise e
  File "{$home}\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8068, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: {"message":"model 'command-xlarge-beta' not found, make sure the correct model ID was used and that you have access to the model."}

Twitter / LinkedIn details
No response