Performance issue in v2.1.0?
Hello,
When upgrading from v2.0.0 to v2.1.0, our pytest suite, which usually takes around 10 minutes, runs for many hours without finishing. I've narrowed it down to the upgrade to v2.1.0 of libfaketime as the likely cause.
Any idea what might be going on here?
Greetings,
jgb
Hi. Thank you for the report. For additional context, on which system do you run your testsuite? Do you observe the same behavior on different systems? We updated the underlying libfaketime version and also changed a bit of code. To figure out which one is causing the issue, could you try running your testsuite with the current python-libfaketime codebase and the previous libfaketime version?
git clone https://github.com/simon-weber/python-libfaketime.git
cd python-libfaketime
git clone https://github.com/simon-weber/libfaketime.git libfaketime/vendor/libfaketime
make -C libfaketime/vendor/libfaketime
Then install this version in your project environment and run your testsuite:
cd your/project/path
pip install the/path/to/python-libfaketime -U --force-reinstall
pytest
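Optionally, a quick sanity check that the rebuilt library is actually the one being loaded (just a sketch using the public fake_time API, nothing required):
# sanity_check.py: confirm time interception works with the rebuilt libfaketime
from datetime import datetime
from libfaketime import fake_time, reexec_if_needed
reexec_if_needed()  # re-exec the interpreter with the preload variables set
with fake_time("2000-01-01 00:00:00"):
    assert datetime.now().year == 2000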
@azmeuk thanks, I did what you asked, and with that setup I don't observe the performance regression. This is on Ubuntu 24.04 LTS with Python 3.12.
I've run into the same issue. I could work around it by setting FAKETIME_FORCE_MONOTONIC_FIX to 0.
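For example, something along these lines before the tests start (a minimal sketch; exporting the variable from the shell before invoking pytest does the same thing):
# e.g. at the top of conftest.py, before libfaketime re-execs the process
import os
os.environ.setdefault("FAKETIME_FORCE_MONOTONIC_FIX", "0")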
@vlaci what is your OS?
@jgb do you confirm FAKETIME_FORCE_MONOTONIC_FIX makes a difference for you?
If so, I don't know if we would want to enable it by default, as it is discouraged by the libfaketime documentation:
Please try to avoid compiling with FORCE_MONOTONIC_FIX on platforms that
do not need it. While it won't make a difference in most cases, depending
on the specific FAKETIME settings in use, it would cause certain
intercepted functions such as pthread_cond_timedwait() return with a
time-out too early or too late, which could break some applications.
@azmeuk I can confirm: I just tested v2.1.0 with FAKETIME_FORCE_MONOTONIC_FIX=0, and that brings performance back to a normal level. Without that exported variable, performance is a disaster.
@simon-weber do you have any opinion about disabling FAKETIME_FORCE_MONOTONIC_FIX by default?
@vlaci what is your OS?
I am on NixOS (Linux) with glibc 2.39
It looks like this may be a regression in libfaketime 0.9.10. Maybe we should downgrade our vendored libfaketime instead? From a quick look over the FORCE_MONOTONIC_FIX discussions I'm not sure about the safety of disabling it.
@jgb @vlaci can you check if #82 solves the issue for your use cases?
@jgb @vlaci can you check if #82 solves the issue for your use cases?
Hello, I tried, but it fails to build / install...
@jgb What error message do you see?
Did you install with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head?
@jgb What error message do you see? Did you install with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head?
Sorry, I must have done something wrong initially; using your command it installed just fine. I tried it out just now, and even though the result isn't as bad as it was, it still makes my test suite go from taking 5 minutes before to 12+ minutes now. It also makes about 20 of my tests fail which don't fail using v2.0.0, and those failing tests are all related to selenium + chrome...
@jgb @vlaci can you check if #82 solves the issue for your use cases?
I can confirm that the PR works. It indeed looks a bit slower; in my case, it's about 10%.
Thank you both for your feedback.
I tried it out just now, and even though the result isn't as bad as it was, it still makes my test suite go from taking 5 minutes before to 12+ minutes now.
I can confirm that the PR works. It indeed looks a bit slower; in my case, it's about 10%.
The current benchmark.py script does not show different behaviors for python-libfaketime 2.1.0 and for libfaketime 0.9.8, 0.9.9 and 0.9.10 with the master branch. I could not test 0.9.7 because it won't compile on my machine. This is too bad, because it is the closest version to simon's libfaketime fork that was used in python-libfaketime.
However, benchmark.py with python-libfaketime 2.0.0 shows better results than with 2.1.0 (something like 40%), so the good news is that this is reproducible. It seems the fault lies either in the libfaketime 0.9.7 to 0.9.8+ upgrade, or more probably in my recent changes.
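For reference, the kind of measurement I mean is roughly the following (a minimal sketch, not the actual benchmark.py script):
# time repeated datetime.now() calls while libfaketime is active
import time
from datetime import datetime
from libfaketime import fake_time, reexec_if_needed
reexec_if_needed()
def bench(iterations=100_000):
    with fake_time("2020-01-01 00:00:00"):
        start = time.perf_counter()
        for _ in range(iterations):
            datetime.now()
        return time.perf_counter() - start
print(f"{bench():.3f}s for 100k datetime.now() calls under fake_time")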
Just to be sure, can you check if libfaketime 0.9.8 improves performance compared to 0.9.9, with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head :pray: ?
It also makes about 20 of my tests fail which don't fail using v2.0.0, and those failing tests are all related to selenium + chrome...
Do the tests you are referring to also fail with python-libfaketime 2.1.0 (i.e. with libfaketime 0.9.10) or just with #82 (i.e. with libfaketime 0.9.9)?
Just to be sure, can you check if libfaketime 0.9.8 improves performance compared to 0.9.9, with pip install git+https://github.com/simon-weber/python-libfaketime@refs/pull/82/head 🙏 ?
I assume you wanted me to check #83.
It was more of a hassle to install, as I needed to explicitly set CFLAGS=-Wno-error=unused-variable to make it build. The performance seems to be back to normal though.
Do the tests you are referring to also fail with python-libfaketime 2.1.0 (i.e. with libfaketime 0.9.10) or just with #82 (i.e. with libfaketime 0.9.9)?
I didn't actually manage to test this: v2.1.0 brings my whole machine to a halt; I killed the pytest processes after a few hours.
It was more of a hassle to install, as I needed to explicitly set CFLAGS=-Wno-error=unused-variable to make it build. The performance seems to be back to normal though.
@jgb do you also see nominal performance with #83?
@simon-weber what do you think of all of this? Should we downgrade?
Hi @azmeuk, I just tested #83 and I can confirm it's slower by a few orders of magnitude compared to v2.0.0. Not as slow as v2.1.0, but still so slow that it becomes unworkable.
Hm. Maybe we can get away with fixing this with FAKETIME_FORCE_MONOTONIC_FIX? I see comments in the libfaketime thread about it breaking things for Java, but maybe we don't need to worry about that since we're only running against Python.
Worth a try! Any update on this? Can I do something to help?
Want to open a PR and test it out? I think we just need to add that env var for both platforms here.
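Roughly something along these lines, I imagine (an illustrative sketch only; the helper name and structure below are hypothetical, not the library's actual code):
# Hypothetical helper: build the environment python-libfaketime exports
# alongside the preload variable on each platform.
import os
def _faketime_environ(lib_path):
    env = dict(os.environ)
    if os.uname().sysname == "Darwin":
        env["DYLD_INSERT_LIBRARIES"] = lib_path  # macOS preload
        env["DYLD_FORCE_FLAT_NAMESPACE"] = "1"
    else:
        env["LD_PRELOAD"] = lib_path  # Linux preload
    env["FAKETIME_FORCE_MONOTONIC_FIX"] = "0"  # avoid the slow monotonic fix at runtime
    return env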
@simon-weber I opened a PR :)
Released in 3.0.0. Thanks!
@simon-weber sorry for the bad news, but upgrading to 3.0.0 makes all of my tests fail because I can no longer connect to my MariaDB database. Downgrading back to 2.0.0 fixes it. :sad:
@jgb do you have more context to give us?
Yes, sorry for the lack of context. I am using libfaketime in a suite of 2000+ tests that we run with pytest. Some info about our setup:
platform linux -- Python 3.12.3, pytest-8.3.4, pluggy-1.5.0 -- /root/.venv/vidicenter/bin/python3
cachedir: .pytest_cache
metadata: {'Python': '3.12.3', 'Platform': 'Linux-6.12.9-amd64-x86_64-with-glibc2.39', 'Packages': {'pytest': '8.3.4', 'pluggy': '1.5.0'}, 'Plugins': {'variables': '3.1.0', 'html': '4.1.1', 'xdist': '3.6.1', 'celery': '0.0.0', 'anyio': '4.6.2.post1', 'rerunfailures': '15.0', 'metadata': '3.1.1', 'django-webtest': '1.9.12', 'mock': '3.14.0', 'ref': '0.2.0', 'Faker': '33.0.0', 'plugin-helpers': '0.1.0', 'django': '4.9.0', 'cov': '6.0.0', 'selenium': '4.1.0', 'base-url': '2.1.0'}, 'Base URL': '', 'Driver': 'Chrome', 'Capabilities': {}}
django: version: 5.1.5, settings: vidicenter.test_settings (from env)
driver: Chrome
sensitiveurl: .*
rootdir: /srv/vidicenter
configfile: pytest.ini
plugins: variables-3.1.0, html-4.1.1, xdist-3.6.1, celery-0.0.0, anyio-4.6.2.post1, rerunfailures-15.0, metadata-3.1.1, django-webtest-1.9.12, mock-3.14.0, ref-0.2.0, Faker-33.0.0, plugin-helpers-0.1.0, django-4.9.0, cov-6.0.0, selenium-4.1.0, base-url-2.1.0
We usually run pytest -n4.
When I do this using libfaketime v2.0.0 all tests work fine; when I upgrade to v3.0.0 everything fails, with errors such as the following:
[gw1] linux -- Python 3.12.3 /root/.venv/vidicenter/bin/python3
self = <DatabaseWrapper vendor='mysql' alias='__no_db__'>
    @async_unsafe
    def ensure_connection(self):
        """Guarantee that a connection to the database is established."""
        if self.connection is None:
            if self.in_atomic_block and self.closed_in_transaction:
                raise ProgrammingError(
                    "Cannot open a new connection in an atomic block."
                )
            with self.wrap_database_errors:
>               self.connect()
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:279:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:256: in connect
self.connection = self.get_new_connection(conn_params)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/mysql/base.py:256: in get_new_connection
connection = Database.connect(**conn_params)
/root/.venv/vidicenter/lib/python3.12/site-packages/MySQLdb/__init__.py:121: in Connect
return Connection(*args, **kwargs)
        # PEP-249 requires autocommit to be initially off
        autocommit = kwargs2.pop("autocommit", False)
        self._set_attributes(*args, **kwargs2)
>       super().__init__(*args, **kwargs2)
E MySQLdb.OperationalError: (2002, "Can't connect to server on 'db_testing' (2)")
/root/.venv/vidicenter/lib/python3.12/site-packages/MySQLdb/connections.py:200: OperationalError
The above exception was the direct cause of the following exception:
request = <SubRequest 'django_db_setup' for <Function test_chart_output_no_data[dummy_dataset_1-global_uptime-all]>>, django_test_environment = None, django_db_blocker = <pytest_django.plugin.DjangoDbBlocker object at 0x7fd5b45d85f0>
django_db_use_migrations = True, django_db_keepdb = True, django_db_createdb = False, django_db_modify_db_settings = None
@pytest.fixture(scope="session")
def django_db_setup(
    request: pytest.FixtureRequest,
    django_test_environment: None,
    django_db_blocker: DjangoDbBlocker,
    django_db_use_migrations: bool,
    django_db_keepdb: bool,
    django_db_createdb: bool,
    django_db_modify_db_settings: None,
) -> Generator[None, None, None]:
    """Top level fixture to ensure test databases are available"""
    from django.test.utils import setup_databases, teardown_databases

    setup_databases_args = {}

    if not django_db_use_migrations:
        _disable_migrations()

    if django_db_keepdb and not django_db_createdb:
        setup_databases_args["keepdb"] = True

    with django_db_blocker.unblock():
>       db_cfg = setup_databases(
            verbosity=request.config.option.verbose,
            interactive=False,
            **setup_databases_args,
        )
/root/.venv/vidicenter/lib/python3.12/site-packages/pytest_django/fixtures.py:144:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/root/.venv/vidicenter/lib/python3.12/site-packages/django/test/utils.py:206: in setup_databases
connection.creation.create_test_db(
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/creation.py:62: in create_test_db
self._create_test_db(verbosity, autoclobber, keepdb)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/creation.py:202: in _create_test_db
with self._nodb_cursor() as cursor:
/usr/lib/python3.12/contextlib.py:137: in __enter__
return next(self.gen)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:712: in _nodb_cursor
with conn.cursor() as cursor:
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:320: in cursor
return self._cursor()
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:296: in _cursor
self.ensure_connection()
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:278: in ensure_connection
with self.wrap_database_errors:
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/utils.py:91: in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:279: in ensure_connection
self.connect()
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/base/base.py:256: in connect
self.connection = self.get_new_connection(conn_params)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/utils/asyncio.py:26: in inner
return func(*args, **kwargs)
/root/.venv/vidicenter/lib/python3.12/site-packages/django/db/backends/mysql/base.py:256: in get_new_connection
connection = Database.connect(**conn_params)
/root/.venv/vidicenter/lib/python3.12/site-packages/MySQLdb/__init__.py:121: in Connect
return Connection(*args, **kwargs)