LlamaIndex
LlamaIndex is the leading framework for building LLM-powered agents over your data.
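As a quick orientation before the release list, here is a minimal sketch of typical usage: load local documents, build an in-memory vector index, and query it. It assumes the llama-index package is installed and an OPENAI_API_KEY is configured; the "data" directory and the question are illustrative and not tied to any specific release below.

    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    # Load files from a local "data" directory (illustrative path).
    documents = SimpleDirectoryReader("data").load_data()

    # Build an in-memory vector index and expose it as a query engine.
    index = VectorStoreIndex.from_documents(documents)
    query_engine = index.as_query_engine()

    # Ask a natural-language question over the indexed documents.
    print(query_engine.query("What does this project do?"))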
Release History
v0.14.12 (9 fixes, 11 features): This release introduces async tool spec support across several integrations, adds new LLM providers like AI Badgr and Typecast, and provides significant updates to MongoDB and Vertex AI vector stores. It also includes various bug fixes for metadata filtering and Pydantic schema handling.
v0.14.10 (1 fix, 2 features): This release introduces a mock function calling LLM for core testing and a new Airweave tool integration, alongside minor typo fixes in the Qianfan provider modules.
v0.14.9 (12 fixes, 8 features): This release introduces support for Claude 4.5 and GPT-5.1-chat, adds the OVHcloud provider, and improves MultiModal index engines. It also includes critical bug fixes for async utilities, memory management, and various vector store connectors.
v0.14.8 (Breaking; 7 fixes, 7 features): This release introduces OpenAI v2 SDK support, enhances tool calling for Google GenAI and Bedrock, and adds a new Scrapy-based web reader. It also includes critical fixes for ReAct agents and LanceDB indexing performance.
v0.14.7 (5 fixes, 9 features): This release introduces SerpEx tool integration and expands tool call block support across Anthropic, MistralAI, and Ollama. It also adds significant updates to Bedrock, Couchbase vector stores, and GitHub authentication.
v0.14.6 (6 fixes, 7 features): This release introduces new integrations for Isaacus and Helicone, adds async support for Bedrock retrievers, and includes critical security fixes for SQL parameterization in PostgresKVStore.
v0.14.5 (8 fixes, 11 features): This release introduces support for next-generation models including GPT-5 and Claude 4.5/Haiku 4.5, adds Sglang and SignNow integrations, and provides significant bug fixes for document processing and streaming.
v0.14.4 (10 fixes, 7 features): This release introduces support for Claude 4.5 and structured outputs for OpenAI-like models, while primarily addressing dependency issues and bug fixes across the ecosystem.
v0.14.3 (Breaking; 4 fixes, 7 features): This release introduces ThinkingBlock support across major LLM providers, adds new PaddleOCR and Azure PostgreSQL integrations, and migrates MongoDB components to the PyMongo Asynchronous API. It also includes a critical security fix for SingleStoreDB and updates the Firecrawl reader to the v2 SDK.
v0.14.2: No release notes provided.
v0.14.1.post1: No release notes provided.
v0.14.1: No release notes provided.
v0.14.0 (Breaking; 1 feature): This release updates llama-index-core to 0.14.0, introducing breaking changes in workflows by removing deprecated features and methods, while adding document block support for OpenAI LLMs.
v0.13.6: No release notes provided.
v0.13.5 (5 fixes, 3 features): This release introduces YugabyteDB chat storage support and Bedrock prompt caching, alongside critical bug fixes for path handling, LiteLLM context calculations, and Milvus node creation.
v0.13.4 (3 fixes, 10 features): This release introduces Baseten and ZenRows integrations, adds MMR search to Chroma, and updates IBM integrations with external URL support while deprecating instance_id. It also includes critical fixes for ContextChatEngine and Postgres vector store deletion logic.
v0.13.3.post1: No release notes provided.
v0.13.3 (10 fixes, 4 features): This release introduces HerokuEmbeddings support, Qdrant sharding, and GPT-5 model support, alongside critical stability fixes for streaming responses, empty vector stores, and concurrent reader usage.
v0.13.2.post1 (1 fix): This release focuses on documentation fixes.
v0.13.2 (9 fixes, 6 features): This release introduces the Superlinked retriever integration, enhances PowerPoint data extraction, and updates ClickHouse for new vector search capabilities, alongside various bug fixes for Anthropic, AzureOpenAI, and Ollama integrations.
v0.13.1 (Breaking; 8 fixes, 12 features): This release introduces several new integrations including Heroku LLMs and Bedrock AgentCore tools, adds support for GPT-5 and Anthropic Opus 4.1, and deprecates the managed Llama Cloud index in favor of a new services package.
v0.13.0.post3: No release notes provided.
v0.13.0.post2: No release notes provided.
v0.13.0.post1: No release notes provided.
v0.13.0 (Breaking; 5 fixes, 10 features): This major update to llama-index-core (v0.13.0) removes legacy agent classes and QueryPipeline in favor of a new Workflow-based agent architecture (a minimal sketch of the new agent style follows the release list), while updating various provider integrations for LLMs and vector stores.
v0.12.52.post1: No release notes provided.
v0.12.52 (5 fixes, 3 features): This release includes bug fixes for agent memory and async multimodal calls, introduces a new Jira tool integration, and optimizes memory for BGE-M3 indices.
v0.12.51 (3 fixes, 1 feature): This release introduces automatic type conversion for FunctionTool and fixes critical issues regarding vector store retrievers and metadata serialization in Azure Blob storage.
v0.12.50 (Breaking; 10 fixes, 8 features): This release introduces several new integrations including Cloudflare AI Gateway, Service-Now, and S3 vector stores, alongside Llama 4 support in Bedrock. It also includes security improvements to cache directory handling and various bug fixes for agents and retrievers.
v0.12.49 (8 fixes, 7 features): This release introduces DuckDB-based storage and Moorcheh vector store support, while implementing structured output for agents and resolving critical thread-safety issues in DuckDB integrations.
v0.12.48 (Breaking; 6 fixes, 4 features): This release introduces breaking changes to the Context API, adds Cached Content support for Google GenAI, and includes several bug fixes across core, OCI, and LlamaCloud integrations.
v0.12.47 (Breaking; 3 fixes, 6 features): This release unifies Multi Modal LLMs into the base LLM class, introduces a security fix for the RAG CLI, and adds several integrations including LanceDB MultiModal and Anthropic citations.
v0.12.46.post1: No release notes provided.
v0.12.46 (Breaking; 6 fixes, 1 feature): This release introduces async operations for VectorStoreIndex and provides several bug fixes across core, MCP tools, and NVIDIA embeddings. It also includes a significant migration for ElevenLabs voice agents and updated dependency constraints for Google GenAI.
v0.12.45 (Breaking; 5 fixes, 7 features): This release introduces content block support for tools, adds new AWS Claude models, and updates the core memory and UI event standards while fixing several parser and vector store bugs.
v0.12.44.post1: No release notes provided.
v0.12.44 (11 fixes, 10 features): This release introduces several new integrations including IBM Db2 and OpenAI Realtime Conversation, while deprecating older agent architectures and query pipelines. It also includes significant bug fixes for ReAct agents and expanded batching support for embedding providers.
v0.12.43 (8 fixes, 8 features): This release modularizes the core library by moving Workflows and Instrumentation into dedicated packages while adding Mermaid diagram support for workflows and new integrations for openGauss and Hive tools.
v0.12.42 (8 fixes, 8 features): This release introduces support for OpenAI's O3-pro and MistralAI reasoning models, adds a new multi-modal OpenAI-like LLM, and provides various fixes for async memory operations and figure retrieval.
v0.12.41 (Breaking; 10 fixes, 10 features): This release introduces several new integrations including ElevenLabs and ApertureDB, adds support for OpenAI JSON Schema and Ollama's 'think' feature, and includes a breaking rename of the JsonPickleSerializer to PickleSerializer.
v0.12.40 (Breaking; 6 fixes, 4 features): This release introduces workflow validation for StopEvents, adds Measure Space tools, and provides several bug fixes for OpenAI, Anthropic, and Google GenAI integrations.
v0.12.39 (2 fixes, 5 features): This release introduces dependency injection for Workflows via the Resource class, adds tool enforcement for LLMs, and updates provider support for Ollama and Milvus.
v0.12.38 (8 fixes, 10 features): This release introduces OpenTelemetry observability, Claude 4 support across multiple providers, and a new embeddings cache. It also includes critical security fixes for JSONReader and significant enhancements to the Model Context Protocol (MCP) integration.
v0.12.37 (8 fixes, 6 features): This release introduces the Vectorize retriever and DeSearch tool, while providing critical bug fixes for tool call parsing in Bedrock and OpenAI, and improving multi-agent workflow stability.
v0.12.35 (Breaking; 13 fixes, 11 features): This release introduces a major memory system revamp, adds several new integrations including Gel, Oxylabs, and Meta Llama-api, and provides critical bug fixes for ReAct agents and tool call parsing across multiple LLM providers.
v0.12.34: No release notes provided.
v0.12.33: No release notes provided.
v0.12.32: No release notes provided.
v0.12.31: No release notes provided.
v0.12.30: No release notes provided.
v0.12.29: No release notes provided.
v0.12.28: No release notes provided.
v0.12.27: No release notes provided.
v0.12.26: No release notes provided.
v0.12.25: No release notes provided.
v0.12.24.post2: No release notes provided.
v0.12.24.post1: No release notes provided.
v0.12.24: No release notes provided.
v0.12.23: No release notes provided.
v0.12.22.post1: No release notes provided.
v0.12.22: No release notes provided.
v0.12.21: No release notes provided.
v0.12.20: No release notes provided.
v0.12.19: No release notes provided.
v0.12.18: No release notes provided.
v0.12.17.post2: No release notes provided.
v0.12.17.post1: No release notes provided.
v0.12.16: No release notes provided.
v0.12.15: No release notes provided.
v0.12.14: No release notes provided.
v0.12.13.post1: No release notes provided.
v0.12.13: No release notes provided.
v0.12.12: No release notes provided.
v0.12.11: No release notes provided.
v0.12.10: No release notes provided.
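As referenced in the v0.13.0 entry above, legacy agent classes and QueryPipeline were removed in favor of a Workflow-based agent architecture. Below is a minimal sketch of the newer agent style using FunctionAgent; it assumes the llama-index-llms-openai integration is installed and OPENAI_API_KEY is set, and the model name, tool, and prompt are illustrative rather than taken from the release notes.

    import asyncio
    from llama_index.core.agent.workflow import FunctionAgent
    from llama_index.llms.openai import OpenAI

    def multiply(a: float, b: float) -> float:
        """Multiply two numbers and return the product."""
        return a * b

    # Plain Python functions are wrapped as tools; model and prompt are illustrative.
    agent = FunctionAgent(
        tools=[multiply],
        llm=OpenAI(model="gpt-4o-mini"),
        system_prompt="You are a helpful assistant that can do arithmetic.",
    )

    async def main() -> None:
        # Workflow-based agents are driven through an async run() call.
        response = await agent.run("What is 1234 * 4567?")
        print(response)

    asyncio.run(main())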