Add session and statement state for all query types
Description
- Create session and statement state for all query types (see the sketch below).
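A minimal sketch of the idea, assuming a simple storage abstraction (the `RequestIndexClient` interface, document types, and field names below are illustrative, not the plugin's actual API): a non-interactive handler also persists a session and a statement entry at submit time, so every query type can be tracked the same way as interactive queries.

```java
import java.util.Map;
import java.util.UUID;

public class BatchQueryHandlerSketch {

  /** Minimal storage abstraction assumed for this sketch. */
  public interface RequestIndexClient {
    void index(String docType, String id, Map<String, String> source);
  }

  /** Record session and statement state for a non-interactive (batch) query. */
  public String submit(String datasourceName, String query, RequestIndexClient client) {
    String sessionId = UUID.randomUUID().toString();
    String statementId = UUID.randomUUID().toString();

    // Persist session state so the async-query status API can resolve it later.
    client.index("session", sessionId,
        Map.of("sessionType", "batch", "dataSourceName", datasourceName));

    // Persist the statement in WAITING state alongside the session.
    client.index("statement", statementId,
        Map.of("sessionId", sessionId, "query", query, "state", "waiting"));

    return statementId;
  }
}
```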
Issues Resolved
https://github.com/opensearch-project/sql/issues/2401
Check List
- [x] New functionality includes testing.
- [ ] All tests pass, including unit test, integration test and doctest
- [ ] New functionality has been documented.
- [ ] New functionality has javadoc added
- [ ] New functionality has user manual doc added
- [x] Commits are signed per the DCO using --signoff
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license. For more information on following Developer Certificate of Origin and signing off your commits, please check here.
Codecov Report
Merging #2413 (d45376c) into main (2f2ecd2) will increase coverage by 0.01%. The diff coverage is 100.00%.
```diff
@@             Coverage Diff              @@
##               main    #2413      +/-   ##
============================================
+ Coverage     95.54%   95.56%    +0.01%
- Complexity     4985     4987        +2
============================================
  Files           478      478
  Lines         13883    13919       +36
  Branches        931      931
============================================
+ Hits          13265    13301       +36
  Misses          598      598
  Partials         20       20
```
| Flag | Coverage Δ | |
|---|---|---|
| sql-engine | 95.56% <100.00%> (+0.01%) | :arrow_up: |

Flags with carried forward coverage won't be shown.

| Files | Coverage Δ | |
|---|---|---|
| ...search/sql/spark/dispatcher/AsyncQueryHandler.java | 100.00% <100.00%> (ø) | |
| ...search/sql/spark/dispatcher/BatchQueryHandler.java | 100.00% <100.00%> (ø) | |
| ...ensearch/sql/spark/dispatcher/IndexDMLHandler.java | 100.00% <100.00%> (ø) | |
| .../sql/spark/dispatcher/InteractiveQueryHandler.java | 100.00% <100.00%> (ø) | |
| ...rch/sql/spark/dispatcher/SparkQueryDispatcher.java | 100.00% <100.00%> (ø) | |
| ...ch/sql/spark/dispatcher/StreamingQueryHandler.java | 100.00% <100.00%> (ø) | |
| ...ql/spark/execution/session/InteractiveSession.java | 100.00% <100.00%> (ø) | |
| ...arch/sql/spark/execution/session/SessionModel.java | 98.82% <100.00%> (+0.01%) | :arrow_up: |
| ...earch/sql/spark/execution/session/SessionType.java | 100.00% <100.00%> (ø) | |
| ...earch/sql/spark/execution/statement/Statement.java | 100.00% <100.00%> (ø) | |
| ... and 1 more | | |
@kaituo, two questions:
1. Should we add a new StatementState `cancelling` and have the Spark job update it to `cancelled`?
2. What is the required configuration for the REPL job? We have the following configurations now; do we need more? (See the sketch below.)
   * `config.put(FLINT_JOB_REQUEST_INDEX, DATASOURCE_TO_REQUEST_INDEX.apply(datasourceName));`
   * `config.put(FLINT_JOB_SESSION_ID, sessionId);`
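For reference, a sketch of how that configuration might be assembled on the plugin side. Only `FLINT_JOB_REQUEST_INDEX`, `FLINT_JOB_SESSION_ID`, and the `DATASOURCE_TO_REQUEST_INDEX` mapping come from the comment above; the key strings and the index-name pattern are assumptions.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ReplJobConfigSketch {

  // Assumed key strings; the real constants live in the plugin's config classes.
  static final String FLINT_JOB_REQUEST_INDEX = "spark.flint.job.requestIndex";
  static final String FLINT_JOB_SESSION_ID = "spark.flint.job.sessionId";

  // Assumed mapping from datasource name to its request index.
  static final Function<String, String> DATASOURCE_TO_REQUEST_INDEX =
      datasourceName -> ".query_execution_request_" + datasourceName;

  static Map<String, String> replJobConfig(String datasourceName, String sessionId) {
    Map<String, String> config = new HashMap<>();
    config.put(FLINT_JOB_REQUEST_INDEX, DATASOURCE_TO_REQUEST_INDEX.apply(datasourceName));
    config.put(FLINT_JOB_SESSION_ID, sessionId);
    return config;
  }
}
```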
-
Can you just update it to `cancelled`? If you cancel before the REPL picks the statement up, or after the REPL finishes it, the REPL doesn't need to do anything else. If you cancel after the REPL picks it up and before it finishes, the REPL may change your state, which I think is fine.
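In other words, skip a separate `cancelling` state and write `cancelled` directly, accepting the race with the REPL. A minimal sketch of that flow (class and method names are illustrative):

```java
public class StatementCancelSketch {

  enum StatementState { WAITING, RUNNING, SUCCESS, FAILED, CANCELLED }

  static class Statement {
    volatile StatementState state = StatementState.WAITING;
  }

  // Cancel without an intermediate CANCELLING state, as suggested above.
  static void cancel(Statement statement) {
    // If the REPL has not picked the statement up yet, it will simply skip it.
    // If the REPL finishes it concurrently, the later writer wins; the comment
    // above treats that race as acceptable.
    statement.state = StatementState.CANCELLED;
  }
}
```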
-
I also need these two (see the sketch below for the submit side):
* `val dataSource = conf.get("spark.flint.datasource.name", "unknown")`
* `val wait = conf.get("spark.flint.job.type", "continue")`
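A sketch of how the plugin side might pass those two settings when launching the job; the builder class and the `--conf` flag assembly are hypothetical, and only the two conf keys and the `continue` default come from the snippet above.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class SparkSubmitParamsSketch {

  // Build "--conf key=value" flags for the two settings the REPL job reads.
  static String buildConfFlags(String datasourceName) {
    Map<String, String> conf = new LinkedHashMap<>();
    conf.put("spark.flint.datasource.name", datasourceName);
    conf.put("spark.flint.job.type", "continue"); // default value taken from the snippet above
    return conf.entrySet().stream()
        .map(e -> "--conf " + e.getKey() + "=" + e.getValue())
        .collect(Collectors.joining(" "));
  }
}
```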