🦍 The API and AI Gateway (Lua, updated Feb 26, 2026)
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.
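"OpenAI-compatible" means the server accepts the same request shape as OpenAI's `/v1/chat/completions` API, so existing clients work by swapping only the base URL. A minimal sketch of that request shape (the localhost URL and model name are placeholders, not any specific deployment):

```python
# An OpenAI-compatible server accepts the same JSON body as
# OpenAI's /v1/chat/completions endpoint; only the base URL changes.
BASE_URL = "http://localhost:8000/v1"  # placeholder, not a real deployment

def chat_completion_request(model: str, prompt: str) -> dict:
    """Build the JSON body an OpenAI-compatible server expects."""
    return {
        "model": model,  # e.g. an open-source model name
        "messages": [
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.7,
    }

body = chat_completion_request("deepseek-r1", "Hello!")
```

Posting `body` to `{BASE_URL}/chat/completions` is all an off-the-shelf OpenAI client does under the hood, which is why pointing such a client at a different base URL is enough to switch providers.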
Deploy serverless AI workflows at scale. Firebase for AI agents
AutoRAG: An Open-Source Framework for Retrieval-Augmented Generation (RAG) Evaluation & Optimization with AutoML-Style Automation
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
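The RAG pattern the two entries above refer to is: retrieve documents relevant to the user's question, then prepend them to the prompt. A toy sketch with naive word-overlap scoring standing in for vector search (all names are hypothetical, not any listed framework's API):

```python
def tokens(text: str) -> set[str]:
    """Lowercased words with surrounding punctuation stripped."""
    return {w.strip(".,!?").lower() for w in text.split()}

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Top-k docs by word overlap (a stand-in for embedding similarity)."""
    return sorted(docs, key=lambda d: len(tokens(query) & tokens(d)),
                  reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user question with retrieved context."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Kong is an API gateway written in Lua.",
    "RAG augments prompts with retrieved documents.",
    "Llama is an open-source LLM family.",
]
prompt = build_prompt("What is an API gateway?", docs)
```

Production frameworks replace the overlap score with embedding similarity over a vector store, but the retrieve-then-augment control flow is the same.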
The platform for LLM evaluations and AI agent testing
Your autonomous engineering team in a CLI. Point Zeroshot at an issue, walk away, and return to production-grade code. Supports Claude Code, OpenAI Codex, OpenCode, and Gemini CLI.
The collaborative spreadsheet for AI. Chain cells into powerful pipelines, experiment with prompts and models, and evaluate LLM responses in real-time. Work together seamlessly to build and iterate on AI applications.
AIConfig is a config-based framework to build generative AI applications.
Python SDK for running evaluations on LLM generated responses
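Evaluation SDKs like this typically score a model's response against expectations and aggregate over a test set. A generic illustration of that loop using keyword coverage as the metric (hypothetical names, not the SDK's actual API):

```python
def keyword_coverage(response: str, expected_keywords: list[str]) -> float:
    """Fraction of expected keywords present in the response (case-insensitive)."""
    text = response.lower()
    hits = sum(1 for kw in expected_keywords if kw.lower() in text)
    return hits / len(expected_keywords) if expected_keywords else 0.0

def evaluate(cases: list[dict]) -> float:
    """Mean keyword-coverage score over a list of test cases."""
    return sum(keyword_coverage(c["response"], c["keywords"])
               for c in cases) / len(cases)

cases = [
    {"response": "Paris is the capital of France.", "keywords": ["Paris", "France"]},
    {"response": "I don't know.", "keywords": ["Paris"]},
]
mean_score = evaluate(cases)  # (1.0 + 0.0) / 2 = 0.5
```

Real evaluation suites swap in richer metrics (semantic similarity, LLM-as-judge), but the per-case score plus aggregate structure is the common shape.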
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
The Context Hub for AI agents
One command to a full local AI stack — LLM inference, chat UI, voice agents, workflows, RAG, and privacy tools. Includes operations toolkit for persistent AI agents. No cloud, no subscriptions.
🛡️ Kernel-level governance for AI agents — policy enforcement, action interception, OWASP Agentic Top 10 coverage. Works with LangChain, CrewAI, AutoGen, OpenAI, Google ADK, PydanticAI, smolagents.
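"Action interception" here means every tool call an agent proposes passes through a policy check before it executes. A toy illustration of that gate (allowlist plus payload screening; names are hypothetical and not tied to any listed framework):

```python
class PolicyViolation(Exception):
    """Raised when a proposed agent action breaks policy."""

ALLOWED_TOOLS = {"search", "calculator"}       # policy: tool allowlist
BLOCKED_SUBSTRINGS = ("rm -rf", "DROP TABLE")  # policy: dangerous payloads

def enforce(tool: str, argument: str) -> None:
    """Check a proposed action against policy; raise on violation."""
    if tool not in ALLOWED_TOOLS:
        raise PolicyViolation(f"tool '{tool}' is not allowlisted")
    if any(bad in argument for bad in BLOCKED_SUBSTRINGS):
        raise PolicyViolation("argument contains a blocked pattern")

def run_tool(tool: str, argument: str) -> str:
    """Intercept the call, enforce policy, then (pretend to) execute."""
    enforce(tool, argument)
    return f"{tool}({argument!r}) executed"

ok = run_tool("search", "kernel-level governance")
try:
    run_tool("shell", "rm -rf /")
    blocked = False
except PolicyViolation:
    blocked = True
```

A real enforcement layer hooks this check into the agent framework's tool-dispatch path so no action can bypass it, rather than relying on callers to invoke the wrapper.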
[⛔️ DEPRECATED] Friendli: the fastest serving engine for generative AI
Quality Control for AI Artifact Management
🔍 AI observability skill for Claude Code. Debug LangChain/LangGraph agents by fetching execution traces from LangSmith Studio directly in your terminal.
Production operations framework for AI-powered SaaS. The architectural patterns, failure modes, and operational playbooks that determine whether your AI systems scale profitably or fail expensively.
Miscellaneous codes and writings for MLOps