[RELEASE] Release version 3.3.0
This is a component issue for 3.3.0.
Coming from https://github.com/opensearch-project/opensearch-build/issues/5693. Please complete the checklist below.
Please refer to the DATES in that post.
How to use this issue
This Component Release Issue
This issue captures the state of the OpenSearch release at the component/plugin level; its assignee is responsible for driving the release. Please contact them or @mention them on this issue for help. Any release-related work can be linked to this issue or added as comments to create visibility into the release status.
Release Steps
There are several steps to the release process; these steps are completed as part of the whole component release, and components that fall behind present a risk to the release. The component owner resolves the tasks in this issue and communicates with the overall release owner to make sure each component is moving along as expected.
Steps have completion dates for coordinating efforts between the components of a release; components can start as soon as they are ready far in advance of a future release. The most current set of dates is on the overall release issue linked at the top of this issue.
The Overall Release Issue
Linked at the top of this issue, the overall release issue captures the state of the entire OpenSearch release, including references to this issue. The release owner (its assignee) is responsible for communicating the release status broadly. Please contact them or @mention them on that issue for help.
What should I do if my plugin isn't making any changes?
If including changes in this release, increment the version on 3.x branch to 3.3.0 for Min/Core, and 3.3.0.0 for components. Otherwise, keep the version number unchanged for both.
Preparation
- [x] Assign this issue to a release owner.
- [x] Finalize scope and feature set and update the Public Roadmap.
- [x] All the tasks in this issue have been reviewed by the release owner.
- [x] Create, update, triage and label all features and issues targeted for this release with `v3.3.0`.
- [x] Finalize the code and create the release branch `3.3` from the `3.x` branch.
CI/CD
- [x] All code changes for `3.3.0` are complete.
- [x] Ensure working and passing CI.
- [x] Check that this repo is included in the distribution manifest.
Pre-Release
- [x] Increment the version on the parent branch to the next development iteration.
- [x] Gather, review and publish release notes following the rules and backport them to the release branch. `git-release-notes` may be used to generate release notes from your commit history.
- [x] Confirm that all changes for `3.3.0` have been merged.
- [x] Add this repo to the manifest for the next developer iteration.
Release Testing
- [ ] Find/fix bugs using latest tarball and docker image provided in parent release issue and update the release notes if necessary.
- [x] Code Complete: Test within the distribution, ensuring integration, backwards compatibility, and performance tests pass.
- [ ] Sanity Testing: Sanity testing and fixing of critical issues found.
- [x] File issues for all intermittent test failures.
Release
- [ ] Complete documentation.
- [ ] Verify all issues labeled for this release are closed or labeled for the next release.
- [ ] Verify the release date mentioned in release notes is correct and matches actual release date.
Post Release
- [ ] Prepare for an eventual security fix development iteration by incrementing the version on the release branch to the next eventual patch version.
- [ ] Add this repo to the manifest of the next eventual security patch version.
- [ ] Suggest improvements to this template.
- [ ] Conduct a retrospective, and publish its results.
Release Blog Content
Piped Processing Language
[Featured] Calcite Engine Now Default for Enhanced Performance
OpenSearch 3.3 enables Apache Calcite as the default query engine for PPL. The transition brings comprehensive query optimization capabilities including rule-based and cost-based optimizers. Apache Calcite's mature framework provides robust query planning and execution, leveraging an industry-standard foundation that powers major data management systems. Calcite's comprehensive optimization capabilities and extensive function library expand PPL's analytical power and query processing efficiency. Moreover, this release significantly expands PPL's capabilities with new commands, evaluation functions, and statistical functions for advanced data analysis.

[Featured] Enhanced Performance Benchmarking Infrastructure
OpenSearch 3.3 introduces comprehensive benchmarking infrastructure to validate PPL's performance capabilities. New benchmark workloads include ClickBench and Big5 datasets, providing standardized performance testing across different analytical scenarios. These datasets enable evaluation of PPL queries under various conditions and workload patterns. Automated nightly benchmark runs provide continuous performance monitoring, ensuring consistent quality and identifying any performance regressions. Public dashboards offer transparency into PPL query performance, giving users visibility into how the query engine performs across different benchmark scenarios and query types. Nightly benchmarks for the Big5 PPL workload are published to the OpenSearch Benchmarks site: https://opensearch.org/benchmarks/
Process unstructured log data at query time
This release adds new text processing capabilities to PPL with the addition of regex, rex, and spath commands. These features enable users to filter, extract, and parse unstructured text directly at query time without requiring data preprocessing. The regex command provides pattern-based filtering to isolate relevant log entries, while rex extracts structured fields from raw text using regular expressions. The spath command extracts fields from JSON data, enabling access to nested objects and arrays. Together, these commands enable instant adaptation to new log formats without requiring re-indexing operations, allowing users to analyze previously unstructured data immediately.
rex example:
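A minimal sketch of `rex` field extraction; the index name (`web_logs`), the `message` field, and the extracted field names are hypothetical placeholders:

```
source = web_logs
| rex field=message "(?<clientip>\d+\.\d+\.\d+\.\d+) .* (?<status>\d{3})"
| fields clientip, status
```

Named capture groups in the pattern become new fields on each result row, so the raw log text never needs to be re-indexed.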

Analyze time-series data and distributions with simple commands
OpenSearch 3.3 introduces streamlined temporal and distribution analysis with new timechart and bin commands. The timechart command aggregates data over time intervals with flexible span controls, automatically handling time gap filling and result ordering for time-series analysis. It provides visualization-ready formatting with time as the primary axis and supports grouping by additional fields. The bin command automatically groups numeric data into ranges or buckets, facilitating distribution analysis for understanding data spread and frequency patterns. These commands make temporal pattern analysis and data distribution modeling more accessible within PPL queries, allowing users to identify trends and outliers directly through query operations.
timechart example:
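A sketch of a `timechart` query that counts events per hour, grouped by log level; the index name (`app_events`) and the `level` field are hypothetical:

```
source = app_events
| timechart span=1h count() by level
```

The result is already shaped for time-series visualization, with the time bucket as the primary axis and one series per `level` value; `bin` can be used analogously to bucket a numeric field into fixed-width ranges.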
Combine and reshape datasets within single queries
This release enhances PPL's data manipulation capabilities with new commands for flexible data combination and field operations. The append command combines results from multiple queries into a unified dataset, enabling users to merge data from different sources or time ranges within a single operation. The multisearch command executes multiple independent searches simultaneously and presents the combined results with timestamp-based interleaving when available, eliminating the need for manual composition of complex multi-query operations that previously required append plus explicit sorting. Additionally, wildcard support in the fields, table, and rename commands allows bulk operations on similarly named fields. These capabilities enable comprehensive data analysis workflows within single PPL queries.
append example:
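A sketch of `append` merging aggregations from two sources into one result set; the index names (`logs_current`, `logs_archive`) are hypothetical, and the bracketed subsearch syntax is shown as an illustrative assumption:

```
source = logs_current
| stats count() as cnt by level
| append [
    source = logs_archive
    | stats count() as cnt by level
  ]
```

Rows from the appended query are concatenated after the main query's rows, so both time ranges can be compared in a single result.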
Perform sophisticated statistical analysis with new functions
OpenSearch 3.3 expands PPL's analytical power with new statistical functions for deeper data analysis. The earliest and latest functions retrieve timestamp-based values, enabling time-series analysis by finding the earliest or latest occurrence of values within groups based on their timestamps. New multivalue statistical functions list and values collect multiple values into structured arrays during aggregation operations, with list preserving duplicates while values returns unique values. These functions preserve relationships between grouped data points, enabling advanced analysis workflows that require maintaining collections of related values. Additionally, max and min aggregate functions now support non-numeric data types for broader comparison operations.
earliest example:
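A sketch of the `earliest` and `latest` aggregate functions, which pick the value from the first and last timestamped document in each group; the index name (`host_metrics`) and field names are hypothetical:

```
source = host_metrics
| stats earliest(status) as first_status, latest(status) as last_status by host
```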
New evaluation functions for data transformation
This release introduces new evaluation functions for data transformations. The enhanced coalesce function handles null values across mixed data types, providing flexible fallback logic for data cleaning operations. The mvjoin function combines multi-value fields into single strings using specified delimiters, enabling array manipulation within queries. New mathematical functions sum, avg, max, and min enable row-level calculations and comparisons across multiple values or fields. Text processing capabilities are expanded with regex_match for pattern matching operations and strftime for timestamp formatting. These functions enable complex data transformations directly within PPL expressions without requiring external processing steps.
mvjoin example:
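A sketch of `mvjoin` flattening a multi-value field into a delimited string inside an `eval` expression; the index name (`user_profiles`) and field names are hypothetical:

```
source = user_profiles
| eval tag_list = mvjoin(tags, ", ")
| fields user_id, tag_list
```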
Hi @Yury-Fridlyand, @YANG-DB, @ps48, @acarbonetto, @vamsimanohar, @swiddis, @ykmr1224, @joshuali925, @qianheng-aws, @derek-ho, @seankao-az, @RyanL1997, @dai-chen, @LantaoJin, @penghuo, @noCharger, @mengweieric, @yuancu, @kavithacm, @MaxKsyunz,
Could someone kindly volunteer to take on the role of release owner for this component in order to meet the entrance criteria? If no one is able to take it on, we may need to assign someone randomly before the release window opens.
Thank you in advance for your help!
Hi @derek-ho,
Since this component currently does not have a release owner, we will assign you to this role for the time being! If you feel this should be reassigned, please feel free to delegate it to the appropriate maintainer.
Thank you!
Ummm, we need to triage this, and apparently we should not just randomly assign people from the maintainer list.
Hi @cwperks, @heemin32
The opensearch-sql plugin depends on the geospatial and job-scheduler plugins. When bumping to OS 3.3, we are facing an integration test failure. Is it related to any change introduced in job-scheduler for OS 3.3? Is there any clue how to fix it?
./gradlew ':integ-test:integTest' --tests "org.opensearch.sql.ppl.GeoIpFunctionsIT"
=== Standard error of node `node{:integ-test:integTest-0}` ===
» ↓ last 40 non error or warning messages from /Users/penghuo/oss/os-sql/integ-test/build/testclusters/integTest-0/logs/opensearch.stderr.log ↓
» WARNING: Using incubator modules: jdk.incubator.vector
» WARNING: Unknown module: org.apache.arrow.memory.core specified to --add-opens
» fatal error in thread [Thread-3], exiting
» java.lang.IncompatibleClassChangeError: Found interface org.opensearch.jobscheduler.spi.utils.LockService, but class was expected
» at org.opensearch.geospatial.ip2geo.common.Ip2GeoLockService.acquireLock(Ip2GeoLockService.java:61)
» at org.opensearch.geospatial.ip2geo.action.PutDatasourceTransportAction.doExecute(PutDatasourceTransportAction.java:79)
» at org.opensearch.geospatial.ip2geo.action.PutDatasourceTransportAction.doExecute(PutDatasourceTransportAction.java:44)
» at org.opensearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:220)
» at org.opensearch.action.support.TransportAction.execute(TransportAction.java:190)
» at org.opensearch.action.support.TransportAction.execute(TransportAction.java:109)
» at org.opensearch.transport.client.node.NodeClient.executeLocally(NodeClient.java:113)
» at org.opensearch.geospatial.ip2geo.action.RestPutDatasourceHandler.lambda$prepareRequest$0(RestPutDatasourceHandler.java:71)
» at org.opensearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:132)
» at org.opensearch.rest.RestController.dispatchRequest(RestController.java:381)
» at org.opensearch.rest.RestController.tryAllHandlers(RestController.java:467)
» at org.opensearch.rest.RestController.dispatchRequest(RestController.java:287)
» at org.opensearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:374)
» at org.opensearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:482)
» at org.opensearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:357)
» at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:56)
» at org.opensearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:42)
» at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at org.opensearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:72)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:107)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:107)
» at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:120)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:107)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.http.HttpContentDecoder.decode(HttpContentDecoder.java:166)
» at io.netty.handler.codec.http.HttpContentDecoder.decode(HttpContentDecoder.java:48)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:91)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:107)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
» at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
» at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
» at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:289)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:107)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
» at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1357)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
» at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
» at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:868)
» at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
» at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:796)
» at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:697)
» at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:660)
» at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
» at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:998)
» at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
» at java.base/java.lang.Thread.run(Thread.java:1583)