Description
Bug: CreateStreamedResponse fails with custom models that return sources without choices
Issue Description
When a custom model served through Open WebUI returns sources in its first response chunk without a 'choices' array, CreateStreamedResponse::from() fails because the expected 'choices' key is missing from the response attributes.
Steps To Reproduce
Current Behavior
The first streamed response chunk from a custom model that uses sources looks something like:
{
    "sources": [
        {
            "source": {
                "id": "b3d24ed7-a1f1-4054-bb37-55c9cf96ad70",
                "name": "Example Document",
                "type": "collection"
            },
            "document": ["Content example..."],
            "metadata": [
                {
                    "name": "Example File.txt",
                    "source": "Example File.txt"
                }
            ],
            "distances": [0.8338083668559383]
        }
    ]
}
This causes the following code in CreateStreamedResponse::from() to fail:
public static function from(array $attributes): self
{
    $choices = array_map(fn (array $result): CreateStreamedResponseChoice => CreateStreamedResponseChoice::from(
        $result
    ), $attributes['choices']);

    return new self(
        $attributes['id'],
        $attributes['object'],
        $attributes['created'],
        $attributes['model'],
        $choices,
        isset($attributes['usage']) ? CreateResponseUsage::from($attributes['usage']) : null,
    );
}
The error occurs because $attributes['choices'] does not exist in the response, so the stream fails instead of gracefully handling or skipping the chunk.
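The failure mode can be reproduced outside the library with a few lines (a simplified sketch, not the actual client code): in PHP 8, a missing 'choices' key resolves to null, and array_map() then throws a TypeError.

```php
<?php
// Simplified sketch of the failure (assumed shape; not the library's code).
// A sources-only chunk carries no 'choices' key:
$attributes = ['sources' => [['source' => ['id' => 'abc', 'type' => 'collection']]]];

try {
    // Mirrors the array_map() call in CreateStreamedResponse::from().
    // The '?? null' suppresses the "Undefined array key" warning but
    // preserves the TypeError that array_map() raises on a null array.
    $choices = array_map(fn (array $r): array => $r, $attributes['choices'] ?? null);
} catch (TypeError $e) {
    // array_map(): Argument #2 ($array) must be of type array, null given
    echo "TypeError caught\n";
}
```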
Expected Behavior
The client should be able to handle response chunks that don't contain the 'choices' field by either:
- Skipping chunks that don't have a 'choices' field, or
- Adding support for the 'sources' field in the response format
This would allow the client to work with custom models that return sources information, particularly in the first chunk of a streamed response.
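One possible guard is sketched below as a standalone function (a hypothetical illustration of the first option, not a proposed patch to the library; extractChoices and the identity closure stand in for the real mapping to CreateStreamedResponseChoice):

```php
<?php
// Hypothetical sketch: tolerate chunks without a 'choices' key by
// defaulting to an empty list, so sources-only chunks map to no choices
// instead of raising a TypeError.
function extractChoices(array $attributes): array
{
    return array_map(
        fn (array $result): array => $result, // stand-in for CreateStreamedResponseChoice::from()
        $attributes['choices'] ?? [],         // '?? []' makes this a no-op for sources-only chunks
    );
}

echo count(extractChoices(['sources' => [['source' => ['id' => 'abc']]]])), "\n"; // prints 0
```

Inside CreateStreamedResponse::from(), the same `?? []` default on $attributes['choices'] would let the stream continue past the first sources-only chunk.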
Environment
- Package version: v0.10.3
- PHP version: 8.3.16
- Using with: Open WebUI custom model with knowledge bases
Notes
This issue specifically affects custom models in Open WebUI that return sources information. The standard OpenAI API doesn't have this issue, but as more people use the client with alternative backends, handling non-standard response formats becomes important for compatibility.