Before submitting your bug report
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 11
- Continue version: vscode 0.8.61
- IDE version:
Description
It looks like the `signal` parameter is missing from the `streamChat` and `streamCompletion` definitions in `core/config/types.ts`.
As a result, models are not loaded properly and `options` (`CompletionOptions`) arrives empty.
I suspect the `signal` value is being bound to the `options` parameter, so `options` falls back to an empty default.
```typescript
// core/config/types.ts
export interface CustomLLMWithOptionals {
  options: LLMOptions;
  streamCompletion?: (
    prompt: string,
    options: CompletionOptions,
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => AsyncGenerator<string>;
  streamChat?: (
    messages: ChatMessage[],
    options: CompletionOptions,
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => AsyncGenerator<string>;
  listModels?: (
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => Promise<string[]>;
}
```
```typescript
// core/index.d.ts
streamChat(
  messages: ChatMessage[],
  signal: AbortSignal,
  options?: LLMFullCompletionOptions,
): AsyncGenerator<ChatMessage, PromptLog>;
```
The `signal` parameter is present in `core/index.d.ts`, but not in `core/config/types.ts`.
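For reference, a possible fix (my assumption, mirroring `core/index.d.ts` — not a committed change) would be to add `signal` to both definitions in `core/config/types.ts`. Type stubs below are for illustration only:

```typescript
// ChatMessage, CompletionOptions, LLMOptions stubbed for illustration.
type ChatMessage = { role: string; content: string };
type CompletionOptions = Record<string, unknown>;
type LLMOptions = Record<string, unknown>;
type FetchFn = (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>;

export interface CustomLLMWithOptionals {
  options: LLMOptions;
  streamCompletion?: (
    prompt: string,
    signal: AbortSignal, // added, matching core/index.d.ts
    options: CompletionOptions,
    fetch: FetchFn,
  ) => AsyncGenerator<string>;
  streamChat?: (
    messages: ChatMessage[],
    signal: AbortSignal, // added, matching core/index.d.ts
    options: CompletionOptions,
    fetch: FetchFn,
  ) => AsyncGenerator<string>;
  listModels?: (fetch: FetchFn) => Promise<string[]>;
}
```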
To reproduce
No response
Log output
No response