Requests to Gemini models that do not support thinking should not include includeThoughts
#7985
Comments
👀 @sxjeru Thank you for raising an issue. We will look into the matter and get back to you as soon as possible.
The issue you reported is valid: some Gemini models (such as those listed below) do not support thinking. According to the model descriptions, Gemini 2.5 Flash and related models come in "thinking" and "non-thinking" variants, and only the thinking variants accept the includeThoughts parameter. Suggestion: do not include includeThoughts in requests to models that do not support thinking.
No detailed documentation or prior discussion of this problem was found in the lobe-chat repository, but the model descriptions and error messages already make the parameter compatibility clear. To reply, just mention my name @dosu.
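As a rough illustration of that suggestion, here is a TypeScript sketch that gates the parameter by model name. The supportsThinking helper, the model-name patterns, and the GenerationConfig shape are all hypothetical and are not lobe-chat's actual provider code; the point is only that thinkingConfig.includeThoughts should be attached solely for thinking-capable models.

```ts
// Hypothetical guard: only thinking-capable Gemini models get thinkingConfig.
// The model list and payload shape below are illustrative, not lobe-chat's code.
interface ThinkingConfig {
  includeThoughts?: boolean;
  thinkingBudget?: number;
}

interface GenerationConfig {
  temperature?: number;
  thinkingConfig?: ThinkingConfig;
}

// Assumed capability check; in practice this would come from model metadata.
const THINKING_MODELS = [/^gemini-2\.5-flash/, /^gemini-2\.5-pro/];

const supportsThinking = (model: string): boolean =>
  THINKING_MODELS.some((pattern) => pattern.test(model));

const buildGenerationConfig = (
  model: string,
  base: GenerationConfig,
  includeThoughts: boolean,
): GenerationConfig => {
  // Strip thinkingConfig entirely for models such as
  // gemini-2.0-flash-preview-image-generation or Vertex gemini-2.0-flash,
  // which reject the parameter with an error.
  if (!supportsThinking(model)) {
    const { thinkingConfig: _ignored, ...rest } = base;
    return rest;
  }
  return { ...base, thinkingConfig: { ...base.thinkingConfig, includeThoughts } };
};

// Example: no thinkingConfig is attached for a non-thinking model.
console.log(buildGenerationConfig('gemini-2.0-flash', { temperature: 0.7 }, true));
```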
✅ @sxjeru This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 1.88.18 🎉
The release is available on:
Your semantic-release bot 📦🚀
Some models that do not support thinking return an error once includeThoughts is added, for example the Gemini API's gemini-2.0-flash-preview-image-generation and Vertex's gemini-2.0-flash and gemini-2.0-flash-lite.
Vertex's Gemini 2.0 Flash:
Originally posted by @KoellM in #7686 (comment)
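For context on where the parameter lives, the sketch below shows the two request shapes, assuming the @google/genai JavaScript SDK; lobe-chat's provider layer and SDK version may differ. The failing case is a non-thinking model receiving thinkingConfig, and the fix reported here is simply to omit that block for such models.

```ts
// Sketch of where includeThoughts sits in a Gemini request, assuming the
// @google/genai JS SDK. Model names and prompts are illustrative.
import { GoogleGenAI } from '@google/genai';

const ai = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });

async function main() {
  // Thinking-capable model: thinkingConfig.includeThoughts is accepted.
  const withThoughts = await ai.models.generateContent({
    model: 'gemini-2.5-flash',
    contents: 'Summarize the issue in one sentence.',
    config: { thinkingConfig: { includeThoughts: true } },
  });
  console.log(withThoughts.text);

  // Non-thinking model: sending the same thinkingConfig triggers an API error,
  // so the request is built without it.
  const withoutThoughts = await ai.models.generateContent({
    model: 'gemini-2.0-flash',
    contents: 'Summarize the issue in one sentence.',
    // no thinkingConfig here
  });
  console.log(withoutThoughts.text);
}

main().catch(console.error);
```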