Hi,
I'm currently using LiteLLM to serve various models hosted on vLLM, Ollama, Infinity (for embedding/reranking), and Speeches.ai (Whisper for STT, Piper for TTS), and it's working really well.
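For context, here's a trimmed-down sketch of the kind of proxy config we're running today. Hostnames, ports, and model names are placeholders, not our real values; Infinity and the STT server both expose OpenAI-compatible endpoints, so they go through LiteLLM's generic `openai/` provider:

```yaml
# config.yaml for the LiteLLM proxy -- a minimal sketch, not our full config.
# All api_base hosts/ports and model names below are illustrative placeholders.
model_list:
  # Chat model served by a vLLM OpenAI-compatible server
  - model_name: chat-default
    litellm_params:
      model: hosted_vllm/Qwen/Qwen2.5-7B-Instruct
      api_base: http://vllm:8000/v1

  # Smaller model served by Ollama
  - model_name: chat-small
    litellm_params:
      model: ollama/llama3
      api_base: http://ollama:11434

  # Embeddings served by Infinity (OpenAI-compatible /embeddings endpoint)
  - model_name: embeddings
    litellm_params:
      model: openai/BAAI/bge-m3
      api_base: http://infinity:7997
      api_key: none

  # STT via the Speeches.ai server (OpenAI-compatible /audio/transcriptions)
  - model_name: stt
    litellm_params:
      model: openai/whisper-1
      api_base: http://speeches:8000/v1
      api_key: none
```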
We're considering migrating to OpenShift, and I've been exploring tools like KServe, Open Data Hub, and OpenShift AI. However, I'm having trouble understanding how all these tools fit together.
Does LiteLLM integrate with any of these components, or would it be simpler to just adapt our current Docker-based stack to Kubernetes directly, without using the broader OpenShift AI ecosystem?
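If we went the plain-Kubernetes route, the proxy itself looks straightforward to port. I'd imagine roughly something like this (a sketch, not a tested manifest; the image tag, resource names, and the `litellm-config` ConfigMap are illustrative):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: litellm
spec:
  replicas: 1
  selector:
    matchLabels:
      app: litellm
  template:
    metadata:
      labels:
        app: litellm
    spec:
      containers:
        - name: litellm
          image: ghcr.io/berriai/litellm:main-latest
          # Load the same config.yaml we use today, mounted from a ConfigMap
          args: ["--config", "/app/config.yaml", "--port", "4000"]
          ports:
            - containerPort: 4000
          volumeMounts:
            - name: config
              mountPath: /app/config.yaml
              subPath: config.yaml
      volumes:
        - name: config
          configMap:
            name: litellm-config
---
apiVersion: v1
kind: Service
metadata:
  name: litellm
spec:
  selector:
    app: litellm
  ports:
    - port: 4000
      targetPort: 4000
```

Each backend (vLLM, Ollama, Infinity, the STT/TTS server) would get its own Deployment and Service in the same way, so the question is really whether KServe/OpenShift AI buys us enough over that to be worth adopting.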
Thanks in advance,
Morgan