LiteLLM
Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
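The "OpenAI format" mentioned above can be sketched with a plain payload. The provider and model names below are illustrative placeholders, and the commented-out call shows how the same shape would be dispatched through the SDK; this is a sketch of the request shape, not a complete client:

```python
# A chat request in OpenAI format. LiteLLM accepts this one shape for
# every backend and routes on the provider prefix in the model name.
def build_request(provider: str, model: str, prompt: str) -> dict:
    return {
        "model": f"{provider}/{model}",
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_request("bedrock", "anthropic.claude-3-haiku", "Say hello")
# With the SDK installed and credentials configured, this payload
# would be dispatched as: litellm.completion(**req)
print(req["model"])  # bedrock/anthropic.claude-3-haiku
```

The same payload works unchanged against the Proxy Server, since it exposes OpenAI-compatible endpoints.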
Release History
v1.80.15.rc.1 (34 fixes, 18 features): This release introduces several new provider integrations, enhanced Prometheus metrics for monitoring, and numerous bug fixes across proxying, routing, and provider configurations. Performance improvements were also made to provider configuration lookups.
v1.80.15-nightly (32 fixes, 18 features): This release introduces new features, including provider additions (abliteration.ai, Bedrock for token counting), enhanced Prometheus metrics, and UI improvements for the Playground. Numerous bug fixes address issues across proxy streaming, provider configurations (Gemini, Azure), security, and internal workflows.
v1.80.11-stable (20 fixes, 14 features): This release introduces significant new features like Guardrails Load Balancing, a Unified Skills API, and a new RAG Search API. It also includes numerous bug fixes across pricing, integrations (like Langfuse and Datadog), and UI elements, alongside performance improvements via lazy loading.
v1.80.13.rc.1 / v1.80.13-nightly / v1.80.12-nightly (17 fixes, 31 features): This release introduces significant feature enhancements, including support for image tokens, AWS Polly TTS, Minimax integration, and major UI improvements like error code filtering and key management updates. It also includes substantial internal refactoring to lazy-load configuration classes for better performance.
v1.80.8-stable.1-patch01: No changelog details were provided for this patch release.
v1.80.11.rc.1 (25 fixes, 21 features): This release adds several new capabilities, such as Guardrails load balancing, a unified Skills API, RAG Search and Query APIs with rerankers, and expanded model support, while also delivering numerous bug fixes and performance improvements across CI/CD, UI, and observability components.
v1.80.11-nightly (28 fixes, 20 features): This release adds numerous new features such as Guardrails load balancing, a unified Skills API, and expanded model support, while delivering a large set of bug fixes and performance improvements across CI/CD, UI, and observability components.
v1.80.10.dev.1 (Breaking; 24 fixes, 22 features): This release adds extensive new features, including Gemini 3 Flash preview support, Azure Sentinel logging, Guardrails enhancements, and a new LinkUp Search provider, while delivering numerous bug fixes and a breaking change that renames `extra_headers` to `additional_headers`.
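The v1.80.10.dev.1 entry flags a breaking rename of `extra_headers` to `additional_headers`. The notes do not say exactly which call sites are affected, but a caller straddling both versions could bridge the two spellings with a small shim; this is a sketch of one defensive approach, not LiteLLM's own migration path:

```python
def normalize_headers_kwarg(kwargs: dict) -> dict:
    """Map the pre-rename `extra_headers` kwarg onto `additional_headers`.

    Leaves the dict untouched if the caller already uses the new name.
    """
    out = dict(kwargs)  # copy so the caller's dict is not mutated
    if "extra_headers" in out and "additional_headers" not in out:
        out["additional_headers"] = out.pop("extra_headers")
    return out

legacy = {"model": "gpt-4o", "extra_headers": {"x-request-id": "abc123"}}
print(normalize_headers_kwarg(legacy))
# {'model': 'gpt-4o', 'additional_headers': {'x-request-id': 'abc123'}}
```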
v1.80.10.rc.5: No specific changes were listed in the release notes; refer to the full changelog for details.
v1.80.10.rc.4: No changelog details were provided beyond the link to the full comparison.
v1.80.8-stable.1: No changes were documented for this release.
v1.80.10.rc.3 (17 fixes, 21 features): This release adds numerous new providers and features, including Stability AI, Azure Cohere reranking, and VertexAI Agent Engine, while fixing a wide range of bugs and refactoring lazy imports for better performance.
v1.80.10.rc.2: No detailed changelog items were provided; see the full changelog link for specifics.
v1.80.10.rc.1 (25 fixes, 22 features): LiteLLM v1.80.10 introduces Azure GPT-5.2 models, new security guardrail evidence, expanded UI features, and numerous bug fixes and documentation updates.
v1.80.10-nightly (28 fixes, 26 features): LiteLLM v1.80.10 introduces Azure GPT-5.2 model support, new security guardrail evidence, JWT team-id selection, OTEL latency metrics, UI enhancements, and a large set of bug fixes and documentation updates.
v1.80.8-stable: Release v1.80.8 stable was published; specific changes are not listed in the provided notes. See the full changelog for details.
v1.80.9.dev6 (29 fixes, 22 features): This release adds extensive model support, new UI and agent gateway features, and numerous bug fixes and performance improvements across routing, authentication, and API handling.
v1.80.9.dev5: Release v1.80.9.dev5 contains no listed changes; refer to the full changelog for details.
v1.80.9.dev1 (5 features): This release adds Helicone and SAP Gen AI providers, a new NVIDIA model, and a dynamic rate limiter with TTL support, along with documentation and minor routing updates.
v1.80.9-nightly (9 fixes, 9 features): This release adds several new features such as cache polling, Bedrock OSS models, Sumologic webhook integration, and new provider support, while also fixing numerous bugs including Docker directory handling, Anthropic streaming issues, and response API errors.
v1.80.8.dev.1 (5 fixes, 2 features): This release adds a User Info Delete modal and spend logging enhancements, and includes several UI and backend bug fixes.
v1.80.8.rc.1 (34 fixes, 23 features): This release adds extensive new features, including Guardrail API enhancements, OTEL integration, new model support, and UI improvements, while delivering numerous bug fixes and minor deprecations.
v1.80.8-nightly (30 fixes, 23 features): This release adds extensive new features such as Guardrail API tool-call checks, OTEL integration, new model support, and enhanced UI capabilities, while also delivering numerous bug fixes and minor deprecations.
v1.80.5-stable.1: No changelog details were provided beyond a link to the full comparison.
v1.80.7.dev.4 (Breaking; 25 fixes, 18 features): This release adds extensive new features, including Guardrail tool-call checks, OTEL integration, new model support, and UI enhancements, while fixing numerous bugs and introducing a breaking change that renames `output_tokens_details` to `completion_tokens_details`.
v1.80.7.dev.3 (30 fixes, 23 features): This release focuses heavily on UI improvements, infrastructure updates, and expanded provider support, including new features for guardrails and cost tracking. A key change is the deprecation of the legacy `/spend/logs` endpoint in favor of `/spend/logs/v2`.
v1.80.5-stable / v1.80.7.dev.2 (Breaking; 28 fixes, 27 features): This release introduces a rebuilt UI, numerous new providers and model support, extensive guardrail enhancements, and many bug fixes, while deprecating the old `/spend/logs` endpoint and removing feature flags.
v1.80.7.dev.1 (30 fixes, 23 features): This release focuses heavily on UI improvements, infrastructure updates, and expanded provider support, including new features for guardrails and cost tracking. Key updates include the addition of the publicai.co and Z.AI providers and the deprecation of the old `/spend/logs` endpoint.
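Several of the entries above deprecate the proxy's legacy `/spend/logs` endpoint in favor of `/spend/logs/v2`. For callers that build the request URL by hand, the switch is a one-line path change; a minimal sketch, in which the base URL and query parameter names are illustrative rather than taken from the release notes:

```python
from urllib.parse import urlencode

def spend_logs_url(base_url: str, **params) -> str:
    # Target the v2 endpoint; the bare /spend/logs path is deprecated.
    query = f"?{urlencode(params)}" if params else ""
    return f"{base_url.rstrip('/')}/spend/logs/v2{query}"

print(spend_logs_url("http://localhost:4000", start_date="2025-01-01"))
# http://localhost:4000/spend/logs/v2?start_date=2025-01-01
```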
v1.80.7-nightly (3 fixes, 3 features): This release adds guardrail support for pass-through endpoints, a new publicai.co provider, UI enhancements, and several bug fixes, while upgrading websockets to version 15.
v1.80.6-nightly (4 fixes, 13 features): This release introduces numerous new features such as an organization usage UI, WatsonX audio transcription, Vertex RAG support, and several API enhancements, along with bug fixes and a version bump to 0.4.9.
v1.80.5.dev32 (24 fixes, 17 features): This release focuses heavily on stability, bug fixes across various providers (Vertex AI, OCI, Bedrock), and significant feature additions, including support for Claude Opus 4.5 and enhanced UI capabilities for permission and OAuth2 management. Several UI noise reductions and fixes for streaming/logging were also implemented.
v1.80.5.dev3 (8 fixes, 5 features): This release adds Azure GPT-5.1 and Vertex AI Image Edit support, introduces MCP Hub and UI model comparison, and includes several bug fixes and security improvements.
v1.80.5.dev2 (25 fixes, 14 features): This release focuses heavily on bug fixes across various providers (Vertex AI, OCI, Azure, Bedrock) and significant UI/UX improvements, alongside adding support for new models like Claude Opus 4.5 and new features like OAuth2 registration and tool permission guardrails.
v1.80.5.dev1 (23 fixes, 17 features): This release focuses heavily on stability, fixing numerous bugs across various providers (Vertex AI, OCI, Azure, Gemini) and enhancing permission management and UI consistency. New features include support for Claude Opus 4.5, the Claude Skills API, and expanded OAuth2/tool permission configurations.
v1.80.0-stable.1: No specific changes were listed in the release notes; the release appears to contain no documented breaking changes, new features, deprecations, or bug fixes.
v1.80.5.rc.2 / v1.80.5.rc.1 (1 fix): This release fixes UI noise by reverting to console outputs.
v1.80.5-nightly (Breaking; 31 fixes, 25 features): This release adds numerous new models and UI features, upgrades pydantic to 2.11.0, and includes a large set of bug fixes and performance improvements.
v1.78.5-stable-patch-1: No specific changes were listed in the release notes; refer to the full changelog for details.
v1.80.0.dev6 (Breaking; 13 fixes, 12 features): This release adds several UI enhancements, expands Prompt Management capabilities, updates cost maps, and introduces support for new models, while fixing numerous bugs and introducing a breaking change that makes gpt-5 models default to response mode.
v1.77.3-stable-patch-2: No specific changes were listed; refer to the full changelog for details.
v1.77.3-stable-patch-1: Patch release with no documented changes.
v1.80.0.dev2 (1 fix, 2 features): This release adds /delete support for files and /cancel support for batches, updates SSO documentation, and fixes AI Gateway JWT auth handling for Team Tags.
v1.79.3-stable.gemini3: No changes were listed in the release notes; the update only references the full changelog.
v1.80.0.rc.2: Release notes contain only a link to the full changelog; no specific changes are listed.
v1.80.0.dev1 (Breaking; 11 fixes, 20 features): This release introduces significant UI enhancements, new model support (including Day 0 Gemini and Deepseek), and crucial backend refactoring for the MCP client lifecycle. It also includes various bug fixes across providers and UI components.
v1.80.0.rc.1: The release notes only provide a link to the full changelog and do not list specific changes.
v1.80.0-nightly (Breaking; 13 fixes, 15 features): This release introduces support for the new gpt-5.1 family models and adds significant features to the UI, Model Management API, and various provider integrations like RunwayML and fal-ai. It also removes generic exception handling, improving error visibility.
v1.79.3-stable: The release notes only reference the full changelog link and do not list specific changes.
v1.79.1-stable-patch-1: The release notes only reference a link to the full changelog and do not list specific changes.
v1.79.3.dev7 (7 fixes, 9 features): This release adds several new features such as RunwayML image generation, expanded Gemini image support, and bearer token auth for Bedrock AgentCore, while also fixing a range of UI and authentication bugs.
v1.79.3.dev5 (10 fixes, 10 features): This release adds several UI enhancements, new provider support, and numerous bug fixes, including a Zscaler AI Guard hook and a RunwayML video generation provider.
v1.79.3.dev2 (1 feature): This release updates the CI/CD Docker version for end-to-end UI tests and introduces a new Zscaler AI Guard hook.
v1.79.dev.1 (13 fixes, 9 features): This release adds several UI enhancements, new provider and routing features, and numerous bug fixes across management endpoints, streaming, and budget handling.
v1.79.3.rc.1 (3 fixes, 3 features): This release adds Vertex and Gemini Videos APIs with cost tracking, Azure Content Policy error support, and gpt-4o-transcribe cost tracking, along with several UI and Prisma runtime fixes.
v1.79.3-nightly (3 fixes, 4 features): LiteLLM v1.79.3-nightly adds Vertex and Gemini Videos APIs with cost tracking, Azure Content Policy error support, and gpt-4o-transcribe cost tracking, while delivering several bug fixes and documentation updates.
v1.79.2-nightly (20 fixes, 35 features): This release introduces significant feature enhancements across vector stores (Milvus, Azure AI), OCR providers (VertexAI, Azure AI Doc Intelligence), and new integrations like Bedrock Agentcore and CyberArk Secrets Manager. Numerous bug fixes address issues related to Pydantic warnings, API parameter handling, and cost calculation across various providers.
v1.79.1-stable / v1.79.1.dev6 (20 fixes, 18 features): This release introduces significant feature enhancements, including expanded vector store support (Milvus, Azure AI), new OCR providers (VertexAI, Azure AI Doc Intelligence), and various UI improvements. Numerous bug fixes address issues across providers like AWS Bedrock, Azure, Gemini, and Anthropic, alongside memory leak resolutions related to Pydantic warnings.
1.78.5-stable-patch-1 / v1.79.1.rc.2: No changelog items were provided for this release; there are no documented breaking changes, deprecations, new features, or bug fixes.
v1.79.1.dev5 (12 fixes, 11 features): This release introduces significant feature enhancements, including Milvus vector store support and expanded Passthrough API capabilities for Azure AI Vector Stores. It also addresses several critical bugs related to memory usage, AWS Bedrock, and model configuration.
v1.79.1.dev3 (12 fixes, 10 features): This release introduces significant feature enhancements, including Milvus vector store support and Azure AI Vector Store improvements, alongside numerous bug fixes across various integrations and the UI.
v1.79.1.dev2 (13 fixes, 11 features): This release introduces significant enhancements to vector store support (Milvus, Azure AI), adds custom provider support for video endpoints, and resolves several critical bugs related to Pydantic warnings, AWS Bedrock, and Azure API handling. It also includes various UI improvements and fixes.
v1.79.1.rc.1 (21 fixes, 20 features): This release introduces significant feature enhancements, including support for Lasso API v3, OpenTelemetry context propagation, and various UI improvements. Numerous bug fixes address issues across Azure OpenAI, logging, guardrails, and provider integrations.
v1.79.1-nightly (25 fixes, 22 features): This release introduces significant feature enhancements, including support for Lasso API v3, OpenTelemetry context propagation, and various UI improvements for credential and guardrail management. Numerous bug fixes address issues across Azure OpenAI, logging, and provider configurations.
v1.79.0-stable: No specific changelog details were provided; refer to the full changelog link for information.
v1.78.5.rc.4 / v1.79.1.dev1 (22 fixes, 21 features): This release introduces significant enhancements across observability (OpenTelemetry), guardrails, and the UI, alongside numerous bug fixes for providers like Azure OpenAI and improved performance.
v1.77.7.dev3 (24 fixes, 14 features): This release introduces several new features, including support for dynamic client registration, OpenTelemetry context propagation, and FAL AI Image Generations. Numerous bug fixes address issues across Azure OpenAI, Guardrails, logging, and UI components.
v1.77.7.dev2 (1 fix): This development release primarily addresses a bug where the completion endpoint failed for Azure OpenAI when using gpt-4-turbo.
v1.79.0.rc.1 (19 fixes, 18 features): This release introduces significant feature enhancements, particularly around Search APIs, Vector Stores (Vertex AI), and Guardrail integration across various endpoints. Numerous bug fixes address issues related to error handling, configuration mapping, and security for the Responses API.
v1.79.0-nightly (16 fixes, 20 features): This release introduces significant feature enhancements, including support for new Search APIs (DataforSEO, Google PSE), expanded Vector Store capabilities (Vertex AI Search), and major updates to Guardrail functionality across various endpoints. Numerous bug fixes address issues related to error handling, configuration mapping, and cost tracking.
v1.78.5-stable (4 fixes, 3 features): This release introduces team-level rate limit controls and enhances guardrail capabilities with content masking and streaming support for PANW Prisma AIRS. Several bug fixes address memory leaks and configuration handling.
v1.78.0-stable (1 fix): This release focuses on stability with fixes for the Passthrough Endpoint.
v1.78.7-nightly (8 fixes, 12 features): This release introduces extensive support for various search APIs (Perplexity, Tavily, Parallel AI, EXA AI) and adds Guardrails support. Several bug fixes address issues related to Ollama parsing, budget enforcement, and configuration renaming.
v1.78.6-nightly (11 fixes, 3 features): This release focuses heavily on bug fixes across various providers, including Anthropic, the OpenAI Realtime API, and Gemini, alongside introducing new Azure AVA TTS and cost tracking features.
v1.78.5.rc.1 (3 fixes, 4 features): This release introduces team-level rate limiting features and enhances guardrail capabilities with content masking and streaming support. It also includes several bug fixes addressing memory leaks and configuration handling.
v1.78.5-nightly (4 fixes, 3 features): This release introduces team-level rate limiting controls and enhances guardrail capabilities with content masking and streaming support. Several bug fixes address memory leaks and configuration handling in the proxy server.
v1.78.4.dev1 (3 fixes, 1 feature): This release introduces Global Cross-Region Inference for Bedrock and includes several bug fixes related to token counting, budget settings, and Bedrock/MCP integration.
v1.78.4-nightly (9 fixes, 2 features): This release focuses on numerous bug fixes across UI, pricing, and model integrations, including enabling streaming for GPT-OSS on Bedrock and adding new API guardrails.
v1.78.3-nightly (7 fixes, 13 features): This release introduces new features like global vendor discounts, UI discount settings, and support for Gemini 2.5 Flash and new models, alongside substantial performance improvements in the router and various bug fixes.
v1.78.2-nightly (8 fixes, 10 features): This release introduces significant feature enhancements, including GPT-5 reasoning support, new Anthropic model versions (Haiku 4.5), and expanded admin capabilities like spending reports. Several bug fixes address pricing inaccuracies and endpoint response formats.
v1.78.0.rc.3 (1 feature): This release introduces an update to the LiteLLM Proxy Docker image, enabling the use of STORE_MODEL_IN_DB=True directly within the containerized environment.
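The STORE_MODEL_IN_DB flag mentioned above is an environment variable, so it can be passed straight to the container. A sketch of the invocation, in which the image tag, port mapping, and DATABASE_URL are illustrative placeholders rather than values from the release notes:

```shell
# Run the LiteLLM Proxy with model definitions persisted to the database.
# Image tag, port, and DATABASE_URL below are placeholders for your setup.
docker run -d \
  -e STORE_MODEL_IN_DB=True \
  -e DATABASE_URL="postgresql://user:password@db-host:5432/litellm" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest
```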
v1.78.0.rc.2 (1 feature): This release updates the Docker image tag for the LiteLLM Proxy to v1.78.0.rc.2 and promotes the Hosted Proxy Alpha service.
v1.78.0.dev1 (2 features): This release introduces support for GPT-5 reasoning content and GPT-5-Codex on Claude Code, alongside documentation updates and new Docker image availability.
v1.78.0.rc.1: This release primarily focuses on documentation updates and cleanup, including building the UI for the documentation.
v1.77.7-stable: This release primarily updates the stable Docker image tag for the LiteLLM Proxy to v1.77.7-stable and provides information about the hosted proxy offering. Load test results for the proxy are also included.
v1.78.0-nightly (15 fixes, 13 features): This release introduces support for new Azure AI, Vertex AI Gemma, and OCI Cohere models, alongside EnkryptAI Guardrails integration. Numerous bug fixes focus on session management, rate limiting, and UI stability.
v1.77.7.dev16 (1 feature): This release updates the LiteLLM Proxy Docker image tag to v1.77.7.dev16 and provides information about the Hosted Proxy Alpha service.
v1.77.7.dev15 (1 feature): This release updates the LiteLLM Proxy Docker image tag to v1.77.7.dev15 and promotes the Hosted Proxy Alpha service.
v1.77.7.dev14 (1 feature): This release updates the Docker image tag for the LiteLLM Proxy to v1.77.7.dev14 and promotes the Hosted Proxy Alpha service.
v1.77.7.gemma2 (2 features): This release updates the LiteLLM Proxy Docker image to v1.77.7.gemma2 and announces the availability of a Hosted Proxy Alpha service.
v1.77.7.dev13 (1 feature): This release updates the LiteLLM Proxy Docker image tag to v1.77.7.dev13 and provides information about the Hosted Proxy Alpha offering.
v1.77.7.dev-gemma (9 fixes, 2 features): This release focuses on stability and feature enhancements, including fixes for costing, rate limiting, and Anthropic tool calls, alongside the addition of EnkryptAI Guardrails.
v1.77.7.dev12 (1 fix, 2 features): This release updates the LiteLLM Proxy Docker image tag to v1.77.7.dev12 and announces the availability of a Hosted Proxy Alpha program.
v1.77.7.dev11 (3 fixes, 2 features): This release focuses on stability and security by fixing costing issues, improving Anthropic adapter reliability, and enhancing credential redaction. It also introduces support for EnkryptAI Guardrails and new Azure AI models.