Apply some quirks to work around issues with sanitizers
Since the image will likely not be updated from CI (in other words, it doesn't look like something I can run myself), I've changed a couple of other places in this PR to test whether this helps or not.
Changelog category (leave one):
- Not for changelog (changelog entry is not required)
Fixes: https://github.com/ClickHouse/ClickHouse/issues/64086 Cc: @Felixoid (if this works, can you please update the image?)
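For reference, TSan and MSan reserve fixed shadow-memory address ranges and are known to fail to start when the kernel hands out more mmap ASLR bits than that layout tolerates (vm.mmap_rnd_bits above 28 on x86_64). A minimal sketch of the kind of quirk meant here, assuming a small wrapper in the test image; the script name and placement are illustrative and not the exact change from this PR:

```bash
#!/usr/bin/env bash
# sanitizer-quirks.sh (hypothetical name): run the given command with ASLR
# entropy the sanitizers can tolerate. Not the exact change from this PR.
set -e

# TSan/MSan shadow mappings are known to collide when vm.mmap_rnd_bits > 28.
if [ "$(sysctl -n vm.mmap_rnd_bits 2>/dev/null || echo 28)" -gt 28 ]; then
    # Preferred: lower the entropy host-wide (needs a privileged runner/container);
    # otherwise disable ASLR only for the command we are about to run.
    sysctl -w vm.mmap_rnd_bits=28 2>/dev/null || exec setarch "$(uname -m)" -R "$@"
fi

exec "$@"
```

A wrapper like this would be invoked as `./sanitizer-quirks.sh ./clickhouse-test ...` (hypothetical), so only the sanitizer runs pay the cost.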
This is an automated comment for commit 64a308013f6d0075fcf9d7c90d7e50cd9a3ae19e with a description of existing statuses. It is updated for the latest CI run.
❌ Click here to open a full report in a separate page
| Check name | Description | Status | 
|---|---|---|
| CI running | A meta-check that indicates the running CI. Normally, it's in success or pending state. The failed status indicates some problems with the PR | ⏳ pending | 
| Mergeable Check | Checks if all other necessary checks are successful | ❌ failure | 
| Stateless tests | Runs stateless functional tests for ClickHouse binaries built in various configurations -- release, debug, with sanitizers, etc | ❌ failure | 
Successful checks
| Check name | Description | Status | 
|---|---|---|
| A Sync | There's no description for the check yet, please add it to tests/ci/ci_config.py:CHECK_DESCRIPTIONS | ✅ success | 
| AST fuzzer | Runs randomly generated queries to catch program errors. The build type is optionally given in parentheses. If it fails, ask a maintainer for help | ✅ success | 
| ClickBench | Runs [ClickBench](https://github.com/ClickHouse/ClickBench/) with instant-attach table | ✅ success | 
| ClickHouse build check | Builds ClickHouse in various configurations for use in further steps. You have to fix the builds that fail. Build logs often have enough information to fix the error, but you might have to reproduce the failure locally. The cmake options can be found in the build log, grepping for cmake. Use these options and follow the general build process | ✅ success | 
| Compatibility check | Checks that clickhouse binary runs on distributions with old libc versions. If it fails, ask a maintainer for help | ✅ success | 
| Docker keeper image | The check to build and optionally push the mentioned image to docker hub | ✅ success | 
| Docker server image | The check to build and optionally push the mentioned image to docker hub | ✅ success | 
| Docs check | Builds and tests the documentation | ✅ success | 
| Fast test | Normally this is the first check that is run for a PR. It builds ClickHouse and runs most of the stateless functional tests, omitting some. If it fails, further checks are not started until it is fixed. Look at the report to see which tests fail, then reproduce the failure locally as described here | ✅ success | 
| Flaky tests | Checks if newly added or modified tests are flaky by running them repeatedly, in parallel, with more randomization. Functional tests are run 100 times with address sanitizer, and additional randomization of thread scheduling. Integration tests are run up to 10 times. If a new test has failed at least once, or ran for too long, this check will be red. We don't allow flaky tests, read the doc | ✅ success | 
| Install packages | Checks that the built packages are installable in a clean environment | ✅ success | 
| Integration tests | The integration tests report. The package type is given in parentheses, and in square brackets are the optional part/total tests | ✅ success | 
| PR Check | There's no description for the check yet, please add it to tests/ci/ci_config.py:CHECK_DESCRIPTIONS | ✅ success | 
| Performance Comparison | Measure changes in query performance. The performance test report is described in detail here. In square brackets are the optional part/total tests | ✅ success | 
| Stateful tests | Runs stateful functional tests for ClickHouse binaries built in various configurations -- release, debug, with sanitizers, etc | ✅ success | 
| Stress test | Runs stateless functional tests concurrently from several clients to detect concurrency-related errors | ✅ success | 
| Style check | Runs a set of checks to keep the code style clean. If some of tests failed, see the related log from the report | ✅ success | 
| Unit tests | Runs the unit tests for different release types | ✅ success | 
| Upgrade check | Runs stress tests on the server version from the last release and then tries to upgrade it to the version from the PR. It checks if the new server can successfully start up without any errors, crashes or sanitizer asserts | ✅ success | 
A different attempt:
- https://github.com/ClickHouse/ClickHouse/pull/64091
Interesting: builds passed, but stateless tests failed.
Actually, I don't see that it helps, at least not 100%, so I'm still looking for a solution.
Interesting, it looks like it helped: TSan and MSan built, and the stateless tests passed.
It's strange that some tests fail with a weird error:
Here are all the VM changes made to the AMI on Friday:

```
> git diff --word-diff /tmp/sysctl.{old,new} | grep '+}' | grep vm.
vm.min_free_kbytes = [-45056-]{+67584+}
vm.mmap_rnd_bits = [-28-]{+32+}
vm.mmap_rnd_compat_bits = [-8-]{+16+}
vm.user_reserve_kbytes = [-52114-]{+131072+}
```
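The two `vm.mmap_rnd*` entries are the ones that matter for the sanitizers. A quick way to confirm and revert them on an affected runner (a sketch assuming root access on the host; 28 and 8 are simply the previous values from the diff above):

```bash
# Show the current ASLR entropy settings.
sysctl vm.mmap_rnd_bits vm.mmap_rnd_compat_bits

# Put back the values the sanitizer builds were happy with (lost on reboot
# unless also written to a file under /etc/sysctl.d/).
sudo sysctl -w vm.mmap_rnd_bits=28 vm.mmap_rnd_compat_bits=8
```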
I'm sorry about that. The update was made based on the hard deprecation of the actions runner. This deprecation will be effective this coming Thursday.
If it helps, we can deploy a new AMI tomorrow. There's another tiny fix to deploy.
> `vm.mmap_rnd_bits = [-28-]{+32+}`
> `vm.mmap_rnd_compat_bits = [-8-]{+16+}`
Yes, this is exactly the reason.
> I'm sorry about that. The update was made based on the hard deprecation of the actions runner. This deprecation will be effective this coming Thursday.
No problem; maybe more visibility of AMI updates is needed.
Let's start with a manual `comp-ci-ami-updated` label. Then we'll try to automate it.