New AI gateway does not support BYOK passed via request, OR this is not documented #5421

@gavento

Description

  • The older, provider-specific LLM proxies allowed you to specify both a Helicone key and an LLM provider key; see e.g. here for what I mean.
  • The new AI gateway does not seem to have this option: it only accepts a Helicone key. Or perhaps this feature is simply not documented (I could not find it).

Why is this a problem: This is required for transitioning to the new API, e.g. when you have several provider API keys used with different clients or projects, or when you simply don't want to (or can't) store your API keys in the Helicone UI.

Proposed solution: Make this feature visible if it already exists :-)
Otherwise, add a BYOK option to the AI gateway API, e.g. as an extra header passed with LLM calls, such as client = OpenAI(default_headers={"Helicone-BYOK-anthropic": os.environ["ANTHROPIC_API_KEY"]}, ...) (set globally or per completion call). BTW, this would already be helpful even without failover to other providers. (See the OpenAI client source.)
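To illustrate the proposal, here is a minimal sketch of what the client side could look like. Note that the "Helicone-BYOK-<provider>" header name and the byok_headers helper are hypothetical (they are this issue's proposal, not a documented Helicone API):

```python
import os

def byok_headers(provider_keys: dict[str, str]) -> dict[str, str]:
    """Build the proposed per-provider BYOK headers for gateway requests.

    Hypothetical header scheme from this issue: one "Helicone-BYOK-<provider>"
    header per provider, carrying that provider's API key.
    """
    return {f"Helicone-BYOK-{provider}": key for provider, key in provider_keys.items()}

# Example: pass the Anthropic key alongside the Helicone key.
headers = byok_headers({"anthropic": os.environ.get("ANTHROPIC_API_KEY", "sk-ant-demo")})

# These would then be merged into the OpenAI client's default headers, e.g.:
# client = OpenAI(base_url=<gateway URL>, default_headers={"Helicone-Auth": ..., **headers})
```

This keeps provider keys in the caller's environment rather than in the Helicone UI, which is exactly the multi-project use case described above.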

To be clear, the new API is otherwise awesome, and I would like to switch to it as soon as possible. Kudos to the Helicone dev team!
