OpenSearch 3.0 accelerates vector database performance and adds Model Context Protocol (MCP) support for AI agent interactions.
OpenSearch 3.0 is claimed to deliver a 9.5x performance improvement over OpenSearch 2.17.1, which itself “was 1.6x faster than its closest industry competitor,” namely Elasticsearch 8.15.4. If both vendor claims are accurate, OpenSearch 3.0 would theoretically be 15.2x (9.5 × 1.6) faster than Elasticsearch 8.15.4. But the current Elasticsearch version, 9.0.0, was released on April 15 this year and, with Better Binary Quantization (BBQ), is claimed to be up to 5x faster than OpenSearch with its FAISS (Facebook AI Similarity Search) library. Absent formal benchmark tests, we would assume Elasticsearch 9.0 and OpenSearch 3.0 perform roughly on par.
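As a sanity check on the compounded claim, the two vendor figures can simply be multiplied, on the assumption that the speedups compose:

```python
# Compounding the two vendor claims (assumption: the speedups multiply).
opensearch_3_vs_2_17 = 9.5   # claimed: OpenSearch 3.0 vs. OpenSearch 2.17.1
opensearch_2_17_vs_es = 1.6  # claimed: OpenSearch 2.17.1 vs. Elasticsearch 8.15.4

# Theoretical combined speedup of OpenSearch 3.0 over Elasticsearch 8.15.4.
combined = opensearch_3_vs_2_17 * opensearch_2_17_vs_es
print(round(combined, 1))  # 15.2
```

Whether independent speedup factors actually multiply across workloads is itself an assumption; real-world results depend on the specific benchmark.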

Carl Meadows, Governing Board Chair at the OpenSearch Software Foundation and Director of Product Management at AWS, stated: “The enterprise search market is skyrocketing in tandem with the acceleration of AI, and it is projected to reach $8.9 billion by 2030. OpenSearch 3.0 is a powerful step forward in our mission to support the community with an open, scalable platform built for the future of search and analytics, and it reflects our commitment to open collaboration and innovation that drives real-world impact.”
For context, Elasticsearch is an open source, distributed analytics engine that appeared in 2010, based on the Apache Lucene search software. It is the world’s most-used vector database, according to Elastic. In January 2021, Elastic changed its Apache 2.0 source license to a dual-license structure based on the restrictive SSPL (Server Side Public License) and an Elastic License, to discourage major cloud service providers from using its software without contributing to the community or buying support. Consequently, AWS forked Elasticsearch 7.10.2 to create its own OpenSearch software, along with OpenSearch Dashboards, a fork of the Kibana open source data visualization and exploration software.
Kibana is part of the ELK (Elasticsearch, Logstash, Kibana) stack, a set of Elastic software for collecting, processing, and visualizing data. It was developed at Elastic by Rashid Khan.
OpenSearch, supported by the OpenSearch Software Foundation, has an Apache v2.0 license and its code-contributing community includes Logz.io, Red Hat, SAP, and Uber. v3.0 features include:
- GPU-based acceleration for its Vector Engine, leveraging Nvidia cuVS for indexing workflows; this experimental feature speeds index builds by up to 9.3x and accelerates data-intensive workloads.
- Native MCP support to enable AI agents to integrate with OpenSearch.
- Derived Source capability, which reduces storage consumption by one-third by removing redundant vector data sources and using primary data to recreate source documents as needed for reindexing or source callback.
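For illustration, vector search in OpenSearch is exposed through its k-NN plugin, where a `knn_vector` field can be backed by the FAISS engine mentioned above. A minimal sketch of the index mapping and query request bodies might look like the following (the index and field names are hypothetical, and the bodies are shown as plain dictionaries rather than sent to a live cluster):

```python
# Sketch of OpenSearch k-NN request bodies using the FAISS engine.
# The field name "item_vector" and the 3-dimensional vectors are
# hypothetical illustration values.

# Index mapping: a knn_vector field backed by a FAISS HNSW graph,
# using Euclidean (l2) distance.
index_body = {
    "settings": {"index": {"knn": True}},
    "mappings": {
        "properties": {
            "item_vector": {
                "type": "knn_vector",
                "dimension": 3,
                "method": {
                    "name": "hnsw",
                    "engine": "faiss",
                    "space_type": "l2",
                },
            }
        }
    },
}

# Approximate nearest-neighbor query: the top-2 documents whose
# vectors lie closest to [1.0, 2.0, 3.0].
query_body = {
    "size": 2,
    "query": {
        "knn": {
            "item_vector": {
                "vector": [1.0, 2.0, 3.0],
                "k": 2,
            }
        }
    },
}
```

With the opensearch-py client, bodies like these would be passed to `client.indices.create(...)` and `client.search(...)` against a running cluster; the GPU-accelerated index-build path described above applies on the server side, not in the client request.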
AI inferencing performance will depend heavily on vector search times, and all the AI search and vector database suppliers and open source coders are racing to provide the fastest vector search they can.
OpenSearch data management additions include gRPC support, pull-based ingestion, reader and writer separation, index type detection to speed up log analysis, and Apache Calcite integration. Calcite is an open source framework for building databases and data management systems. Google Remote Procedure Call (gRPC) is an open source, cross-platform, high-performance remote procedure call framework, developed to connect microservices in Google’s data centers.
OpenSearch 3.0 uses Apache Lucene 10.0, has updated its minimum supported runtime to Java 21, added support for the Java Platform Module System, and is now available.
Bootnote
Elasticsearch and OpenSearch competitors include Algolia, Meilisearch, OpenObserve, Apache Solr, Typesense, and many others.