Open AI compatible LLM "Method not allowed" – why? #44007
benfrain started this conversation in Feature Requests
We use a proxy at our workplace which is OpenAI-compatible. The proxy works fine via curl and in VSCode.
I am trying to get it working in Zed: I add an OpenAI-compatible provider in the UI and enter my details. However, when I send a message I get an HTTP response error: 405 Method not allowed – "method not allowed\n". The proxy expects a POST, and the only thing we have in the proxy regarding this is:
Can anyone point out what I am doing wrong here? It feels like there is some piece of config that Zed needs that I am not providing.
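For reference, the curl check the post mentions boils down to a plain POST like the sketch below. This is a minimal sanity check, not anything Zed-specific: the base URL, API key, and model name are placeholders, and the `/v1/chat/completions` path is an assumption based on the usual OpenAI-compatible layout. If this POST succeeds while Zed gets a 405, the difference is in the method or path Zed is using.

```python
# Minimal sanity check that the proxy accepts a POST at the usual
# OpenAI-compatible path. All values below are placeholders.
import json
import urllib.error
import urllib.request

BASE_URL = "https://proxy.example.com/v1"  # placeholder: your proxy's base URL
API_KEY = "sk-..."                         # placeholder

payload = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [{"role": "user", "content": "ping"}],
}

request = urllib.request.Request(
    BASE_URL + "/chat/completions",  # assumed OpenAI-compatible endpoint path
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

try:
    with urllib.request.urlopen(request) as resp:
        print(resp.status, resp.read().decode("utf-8")[:200])
except urllib.error.HTTPError as err:
    # A 405 here would implicate the proxy routing rather than Zed.
    print(err.code, err.read().decode("utf-8", errors="replace"))
```

One common cause of a 405 with OpenAI-compatible proxies is a base-URL mismatch: if the configured URL already ends in `/v1` (or needs to and doesn't), the client can end up POSTing to a path the proxy routes differently. Comparing the exact path curl hits with the one Zed hits usually settles it.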
I get the a very similar error with an Open AI compatible model as well: More specifically, it does not work for a GPT 5.2 codex model (I had to turn off chat completions), but it does work for other models, so I assume it is linked to the request that Zed is making. Is there a way to inspect the request made by an LLM? It would make debugging a lot easier! @benfrain, did you manage to resolve this issue? |
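On the question of inspecting the requests: since Zed speaks plain HTTP to an OpenAI-compatible endpoint, one option is to point the provider's API URL at a small local relay and read off exactly what arrives. The sketch below is one way to do that; it assumes nothing about Zed internals, and the port and upstream URL are placeholders. A 405 will usually show up here as an unexpected method or path.

```python
# A tiny logging relay: point the editor's OpenAI-compatible base URL at
# http://127.0.0.1:8899 and this prints exactly what the client sends
# (method, path, headers, body) before forwarding to the real proxy.
# UPSTREAM is a placeholder for your actual proxy.
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.error
import urllib.request

UPSTREAM = "https://proxy.example.com"  # placeholder: the real proxy

class LoggingRelay(BaseHTTPRequestHandler):
    def _relay(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else None

        # Log the request exactly as the client sent it.
        print(f"\n--> {self.command} {self.path}")
        for name, value in self.headers.items():
            print(f"    {name}: {value}")
        if body:
            print(body.decode("utf-8", errors="replace"))

        # Forward with the same method, path, and body; drop the Host header
        # so urllib sets it for the upstream server instead.
        headers = {k: v for k, v in self.headers.items() if k.lower() != "host"}
        req = urllib.request.Request(
            UPSTREAM + self.path, data=body, headers=headers, method=self.command
        )
        try:
            with urllib.request.urlopen(req) as resp:
                payload = resp.read()
                self.send_response(resp.status)
                for name, value in resp.headers.items():
                    if name.lower() not in ("transfer-encoding", "connection"):
                        self.send_header(name, value)
                self.end_headers()
                self.wfile.write(payload)
        except urllib.error.HTTPError as err:
            # Pass upstream errors (like a 405) straight through, visibly.
            print(f"<-- {err.code} {err.reason}")
            self.send_response(err.code)
            self.end_headers()
            self.wfile.write(err.read())

    do_GET = do_POST = do_PUT = do_DELETE = _relay

HTTPServer(("127.0.0.1", 8899), LoggingRelay).serve_forever()
```

Point the provider's API URL at http://127.0.0.1:8899, send one message, and compare the logged method and path against what the proxy routes. Note the relay buffers whole responses, so it suits one-off inspection rather than streamed everyday use.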
- Sadly not. Ended up going back to VSCode and RooCode. 😮