How to distinguish the reasoning part and the answer part when calling the reasoning model #377

Description

@kaliwin

Can anyone provide an example of calling a reasoning model?
openai.ChatCompletionChunkChoiceDelta has no field for the reasoning output; it only shows up in the raw JSON. Is there any way to extract the reasoning content from the stream?

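A minimal sketch of one way to separate the two parts in Go, assuming a DeepSeek-style endpoint that streams the reasoning tokens under a `reasoning_content` key in each delta (the base URL, model name, and key name below are assumptions, not part of the SDK types), and assuming the pre-v1 `openai.F(...)` request style. Since the typed delta struct has no reasoning field, the sketch re-parses each delta's raw JSON and reads the extra key from there:

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"

	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func main() {
	// Assumption: a DeepSeek-style endpoint that streams `reasoning_content`
	// alongside `content`. Base URL and model name are illustrative only.
	client := openai.NewClient(
		option.WithBaseURL("https://api.deepseek.com/v1"),
		option.WithAPIKey("YOUR_API_KEY"),
	)

	stream := client.Chat.Completions.NewStreaming(context.Background(), openai.ChatCompletionNewParams{
		Model: openai.F(openai.ChatModel("deepseek-reasoner")),
		Messages: openai.F([]openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("What is 17 * 24?"),
		}),
	})

	for stream.Next() {
		chunk := stream.Current()
		if len(chunk.Choices) == 0 {
			continue
		}
		delta := chunk.Choices[0].Delta

		// The typed delta has no reasoning field, so re-parse its raw JSON
		// into a struct that declares the extra key we expect the provider
		// to send (assumption: the key is `reasoning_content`).
		var extra struct {
			ReasoningContent string `json:"reasoning_content"`
		}
		_ = json.Unmarshal([]byte(delta.JSON.RawJSON()), &extra)

		if extra.ReasoningContent != "" {
			fmt.Print(extra.ReasoningContent) // reasoning tokens
		}
		if delta.Content != "" {
			fmt.Print(delta.Content) // answer tokens
		}
	}
	if err := stream.Err(); err != nil {
		panic(err)
	}
}
```

If re-unmarshalling feels heavy, the unrecognized key should also be reachable via `delta.JSON.ExtraFields`, which the generated structs populate with any fields they don't model.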

Labels: documentation (Improvements or additions to documentation), question (Further information is requested)
