What happened?
Problem
When using Azure AI Foundry's v1 preview API version (api_version: preview), calls to /chat/completions fail with the error below, while calls to /responses succeed, when using OpenAI models.
Steps to Recreate
- Create a model config like the following:

```yaml
- model_name: gpt-4.1
  litellm_params:
    model: "azure/gpt-4-1"
    api_base: <base_url>
    api_key: <api_key>
    api_version: preview # IMPORTANT
    timeout: 60
    stream_timeout: 30
    rpm: 2000
    tpm: 2000000
```
- Start the LiteLLM proxy with that config.
- Send a request to /chat/completions, for example the sketch below.
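A minimal sketch of the kind of request that triggers the failure, assuming the proxy is running locally on port 4000 with a placeholder key (neither value is from the original report):

```python
# Hypothetical client call against a locally running LiteLLM proxy.
# base_url and api_key are placeholders, not values from the report.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # assumed proxy address
    api_key="sk-1234",                 # assumed proxy key
)

# This goes through /chat/completions and fails with the IndexError below
# when the backing deployment is configured with api_version: preview.
response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```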
Relevant log output
```json
{
  "error": {
    "message": "litellm.APIConnectionError: AzureException APIConnectionError - list index out of range\nTraceback (most recent call last):\n File \"/usr/local/lib/python3.13/site-packages/litellm/main.py\", line 1194, in completion\n optional_params = get_optional_params(\n **optional_param_args, **non_default_params\n )\n File \"/usr/local/lib/python3.13/site-packages/litellm/utils.py\", line 3819, in get_optional_params\n optional_params = litellm.AzureOpenAIConfig().map_openai_params(\n non_default_params=non_default_params,\n ...<7 lines>...\n ),\n )\n File \"/usr/local/lib/python3.13/site-packages/litellm/llms/azure/chat/gpt_transformation.py\", line 165, in map_openai_params\n api_version_month = api_version_times[1]\n ~~~~~~~~~~~~~~~~~^^^\nIndexError: list index out of range\n. Received Model Group=gpt-4.1\nAvailable Model Group Fallbacks=None",
    "type": null,
    "param": null,
    "code": "500"
  }
}
```
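The traceback points at map_openai_params in litellm/llms/azure/chat/gpt_transformation.py, which appears to split the api_version string into date parts and read the month at index 1. A rough sketch of that failure mode (not the actual LiteLLM source, and assuming the split is on "-"):

```python
# Rough sketch of the failing pattern inferred from the traceback,
# not the actual LiteLLM source. Date-style versions like "2024-06-01"
# split into ["2024", "06", "01"], but "preview" splits into ["preview"],
# so reading index 1 raises IndexError.
def parse_api_version(api_version: str) -> tuple[str, str]:
    api_version_times = api_version.split("-")
    api_version_year = api_version_times[0]
    api_version_month = api_version_times[1]  # IndexError for "preview"
    return api_version_year, api_version_month

parse_api_version("2024-06-01")  # ("2024", "06")
parse_api_version("preview")     # raises IndexError: list index out of range
```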
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.73.6
Twitter / LinkedIn details
No response