I have a simple integration test in Django that spawns a Celery worker to run a job that writes a record to the database. The Django thread also writes the record to the database. Since it’s a test, I’m using the default in-memory sqlite3 database. No transactions are used.
I keep getting this error:
django.db.utils.OperationalError: database table is locked
According to the Django documentation, this is due to one connection timing out while waiting for the other to complete. This is "more concurrency than sqlite can handle in the default configuration". That seems strange, since it is only two threads each writing a single record. Nonetheless, the same documentation says to increase the timeout option to force the connection to wait longer. OK, so I changed the database settings to:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        'OPTIONS': {'timeout': 10000000},
    }
}
This has no effect. The error still appears, and it is clearly not waiting 1e7 seconds (or 1e7 milliseconds, or 1e7 microseconds) before doing so. Is there an extra setting I'm missing?
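As far as I can tell, Django hands everything in OPTIONS straight to sqlite3.connect(), whose timeout parameter is measured in seconds (the stdlib default is 5.0), so even a modest value like the following should be more than enough for two threads:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        # forwarded to sqlite3.connect(); measured in seconds
        'OPTIONS': {'timeout': 20},
    }
}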
I have tried Python 3.5 and Python 3.6 as well as Django 1.11 and Django 2.0.
1> prokher:
I ran into the same problem, and my experiments led me to the following insights:
I discovered that Django uses an in-memory SQLite DB in test mode unless you explicitly change it. This explains why I only saw the issue in unit tests. To force Django to use a file-based SQLite DB for tests, set it explicitly in settings.py via DATABASES->TEST->NAME. For example:
DATABASES = {
    'default': {
        ...
        'TEST': {
            'NAME': 'testdb.sqlite3',
        },
    },
}
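With this in place, running manage.py test creates and uses testdb.sqlite3 on disk rather than the default in-memory database.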
Setting the timeout value to anything greater than 2147483.647 (looks familiar, right? :-)) will disable the timeout (or set it to a negligibly small value).
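Putting both insights together gives a settings block along these lines (the paths and the concrete timeout value are just my illustration):

import os

BASE_DIR = os.path.dirname(os.path.abspath(__file__))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        # anything above 2147483.647 seconds effectively disables the timeout
        'OPTIONS': {'timeout': 2500000},
        'TEST': {
            # file-based DB for tests instead of the default in-memory one
            'NAME': os.path.join(BASE_DIR, 'testdb.sqlite3'),
        },
    }
}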
As I understand it, the source of the problem is that when SQLite uses a shared cache, the timeout value is not respected at all.
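This is easy to reproduce outside of Django with the stdlib sqlite3 module (the database, table, and connection names here are my own). With a shared-cache in-memory database, the second connection fails immediately despite a generous timeout:

import sqlite3

# Two connections to the same shared-cache in-memory database, both with
# a generous busy timeout (note: the timeout parameter is in seconds).
uri = "file:lockdemo?mode=memory&cache=shared"
con1 = sqlite3.connect(uri, uri=True, timeout=10, isolation_level=None)
con2 = sqlite3.connect(uri, uri=True, timeout=10, isolation_level=None)

con1.execute("CREATE TABLE t (x INTEGER)")
con1.execute("BEGIN")
con1.execute("INSERT INTO t VALUES (1)")   # con1 now holds a write lock on t

try:
    # Fails at once instead of waiting 10 seconds: table-level locks in
    # shared-cache mode raise SQLITE_LOCKED, which the busy timeout ignores.
    con2.execute("SELECT * FROM t")
except sqlite3.OperationalError as exc:
    print(exc)  # database table is locked

con1.execute("COMMIT")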