
[Bug]: litellm Connection Error with ADK #12931

@kuri-leo

Description


What happened?

Hi folks,

I'm running into an issue using litellm (v1.74.7 and v1.74.8) with google-adk (v1.8.0) when connecting to a self-hosted SGLang instance.

I'm initializing LiteLLM like this:

from google.adk.models.lite_llm import LiteLlm

LiteLlm(
    model="hosted_vllm/Qwen2.5-7B-Instruct",  # also tried "openai/Qwen2.5-7B-Instruct"
    api_base="http://192.168.125.4:18008/v1",
    api_key="sk-123456",
)

Everything runs smoothly at first, but after ~20 minutes of parallel ADK calls, I get the following error:

litellm.InternalServerError: InternalServerError: Hosted_vllmException - Connection error

I’ve ruled out network issues, server instability, and client-side bugs; everything works fine when using the official openai client instead of litellm.
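The sanity check with the official client looked roughly like this (the prompt is just a placeholder; the point is that the same base URL and key never produce connection errors):

from openai import OpenAI

client = OpenAI(
    base_url="http://192.168.125.4:18008/v1",
    api_key="sk-123456",
)

resp = client.chat.completions.create(
    model="Qwen2.5-7B-Instruct",
    messages=[{"role": "user", "content": "ping"}],  # placeholder prompt
)
print(resp.choices[0].message.content)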

I also tested with different concurrency levels and timeouts, but the error still shows up after some runtime. Unfortunately, enabling debug mode didn’t yield any useful logs.
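For completeness, this is roughly how I enabled debug output (standard LiteLLM switches; the exact log detail may vary by version):

import os
import litellm

os.environ["LITELLM_LOG"] = "DEBUG"  # environment-variable debug switch
litellm.set_verbose = True           # legacy programmatic switch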

Hope this helps track down the issue. Let me know if I can provide more info.

Potential related issues:

Relevant log output

litellm.InternalServerError: InternalServerError: Hosted_vllmException - Connection error

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.74.7, v1.74.8

Twitter / LinkedIn details

No response
