
CustomLLMWithOptionals interface definition is not up-to-date #3316

Closed
@70m4

Description


Relevant environment info

- OS: Windows 11
- Continue version: vscode 0.8.61
- IDE version:

Description

It looks like the signal parameter is missing from the streamChat and streamCompletion definitions. As a result, models are not loaded properly and options (CompletionOptions) arrives empty. I suspect the signal value is being passed into the options slot, and options falls back to a default empty value.

```ts
// core/config/types.ts
export interface CustomLLMWithOptionals {
  options: LLMOptions;
  streamCompletion?: (
    prompt: string,
    options: CompletionOptions,
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => AsyncGenerator<string>;
  streamChat?: (
    messages: ChatMessage[],
    options: CompletionOptions,
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => AsyncGenerator<string>;
  listModels?: (
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => Promise<string[]>;
}
```

```ts
// core/index.d.ts
streamChat(
  messages: ChatMessage[],
  signal: AbortSignal,
  options?: LLMFullCompletionOptions,
): AsyncGenerator<ChatMessage, PromptLog>;
```
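The suspected argument shift can be sketched as follows. This is an illustration only, with simplified stand-in types (not Continue's real definitions): if the core calls with (messages, signal, options) but the custom implementation is declared against the types.ts signature without a signal, the positional arguments shift and the AbortSignal lands in the options slot.

```typescript
// Illustrative sketch only: simplified stand-ins for Continue's types.
type ChatMessage = { role: string; content: string };
type CompletionOptions = { model?: string };

// A custom streamChat written against the core/config/types.ts signature,
// i.e. without a signal parameter.
async function* customStreamChat(
  messages: ChatMessage[],
  options: CompletionOptions,
): AsyncGenerator<string> {
  // If the caller inserts an AbortSignal before options, `options`
  // here is actually the signal, so every real option reads as undefined.
  yield `model=${options.model}`;
}

async function demo(): Promise<string> {
  const signal = new AbortController().signal;
  // The core side (per core/index.d.ts) passes the signal second:
  const gen = (customStreamChat as any)(
    [{ role: "user", content: "hi" }],
    signal,
    { model: "gpt-4" }, // the real options never reach the implementation
  );
  const { value } = await gen.next();
  return value;
}

demo().then((v) => console.log(v)); // prints "model=undefined"
```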

The signal parameter is present in core/index.d.ts but missing from core/config/types.ts.
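A minimal sketch of how the two files could be brought in line, assuming the fix is to add the AbortSignal parameter to the types.ts interface (hypothetical, not the actual patch; the stub types are stand-ins so the snippet is self-contained):

```typescript
// Stub types standing in for Continue's real definitions (assumptions
// made so this sketch is self-contained).
type LLMOptions = { model: string };
type CompletionOptions = { temperature?: number };
type ChatMessage = { role: string; content: string };

// Hypothetical alignment of core/config/types.ts with core/index.d.ts:
// an AbortSignal parameter is inserted before options, so positional
// arguments from the core no longer shift.
interface CustomLLMWithOptionals {
  options: LLMOptions;
  streamChat?: (
    messages: ChatMessage[],
    signal: AbortSignal,
    options: CompletionOptions,
    fetch: (input: RequestInfo | URL, init?: RequestInit) => Promise<Response>,
  ) => AsyncGenerator<string>;
}

const llm: CustomLLMWithOptionals = {
  options: { model: "demo" },
  streamChat: async function* (messages, signal, options) {
    // options now receives the real CompletionOptions object.
    yield `temperature=${options.temperature}`;
  },
};

async function demoFixed(): Promise<string> {
  const gen = llm.streamChat!(
    [{ role: "user", content: "hi" }],
    new AbortController().signal,
    { temperature: 0.5 },
    fetch,
  );
  const next = await gen.next();
  return next.value as string;
}

demoFixed().then((v) => console.log(v)); // prints "temperature=0.5"
```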

To reproduce

No response

Log output

No response

Metadata


Assignees

Labels

kind:bug (indicates an unexpected problem or unintended behavior), needs-triage

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
