Parse error when using remote Ollama #5900

Open
@woots29

Before submitting your bug report

Relevant environment info

- OS: Windows 10 
- Continue version: 1.0.10
- IDE version: 1.100.2
- Model: Llama 3.1
- config:

  OR link to assistant in Continue hub:
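
The config block above was left empty. For context only (not taken from this report), a Continue `config.json` entry that points the Ollama provider at a locally forwarded tunnel port might look roughly like the sketch below; the title, model tag, and port are assumptions:

```json
{
  "models": [
    {
      "title": "Llama 3.1 (remote Ollama via tunnel)",
      "provider": "ollama",
      "model": "llama3.1",
      "apiBase": "http://127.0.0.1:11434"
    }
  ]
}
```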

Description

The exact error I'm encountering is:

`request to http://127.0.0.1:11434/api/chat failed, reason: Parse Error: Expected HTTP/`

This happens whenever code is sent to the chat, but when I just type "test", the chat responds normally.

I'm using a Rustdesk TCP tunnel because my Ollama server is remote.
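
The symptom (a small "test" prompt works, larger code prompts fail with an HTTP parse error) can be probed outside the IDE. Below is a minimal sketch, assuming Node 18+ (for global `fetch`) and Ollama's documented `/api/chat` endpoint on the forwarded tunnel port; the model tag `llama3.1` and the payload sizes are assumptions, not values taken from this report.

```ts
// probe.ts: send a small and a large prompt through the tunneled port
// to see whether only large requests trigger the parse error.

const OLLAMA_URL = "http://127.0.0.1:11434/api/chat";

async function probe(content: string): Promise<void> {
  try {
    const res = await fetch(OLLAMA_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "llama3.1",   // assumed model tag
        stream: false,       // single JSON response instead of NDJSON chunks
        messages: [{ role: "user", content }],
      }),
    });
    console.log(`payload ${content.length} chars -> HTTP ${res.status}`);
  } catch (err) {
    // An HTTP parse error would surface here if the tunnel hands back
    // bytes that are not a valid HTTP response (or truncates it).
    console.error(`payload ${content.length} chars -> ${(err as Error).message}`);
  }
}

async function main(): Promise<void> {
  await probe("test");                 // small prompt: reported to work
  await probe("x".repeat(64 * 1024));  // large prompt: stands in for a pasted code block
}

main().catch(console.error);
```

If the large request only fails when it goes through the Rustdesk tunnel but succeeds against the Ollama host directly, that would point at the tunnel (mangling or truncating the HTTP response) rather than at Continue itself.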

Labels

area:chat (Relates to chat interface), kind:bug (Indicates an unexpected problem or unintended behavior)
