Django Celery Max DB connections reached

Time: 2019-04-16 23:23:02

Tags: python django celery django-celery

I am running tasks on my Celery worker in a Django application, where each task takes about 1-2 seconds to execute. Usually these executions are fine, but from time to time, especially if the Django application has been deployed for a while, I start seeing errors like this:

File "/usr/lib64/python3.6/site-packages/sqlalchemy/pool/base.py", line 428, in __init__
    self.__connect(first_connect_check=True)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/pool/base.py", line 630, in __connect
    connection = pool._invoke_creator(self)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/strategies.py", line 114, in connect
    return dialect.connect(*cargs, **cparams)
  File "/usr/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 453, in connect
    return self.dbapi.connect(*cargs, **cparams)
  File "/usr/lib64/python3.6/site-packages/psycopg2/__init__.py", line 130, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL:  remaining connection slots are reserved for non-replication superuser connections

Which indicates to me that the Celery worker is not closing connections properly.

I checked the idle connection count on the DB when this error occurred -- there were definitely some connections left, so the DB's max connection limit had not actually been reached.

My question: How can I ensure that the Celery worker closes its DB connections properly?

Celery settings:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_proj.settings')

_celery_broker = settings.CELERY_BROKER
_result_backend = settings.RESULT_BACKEND

app = Celery('my_proj', broker=_celery_broker, backend=_result_backend)

app.autodiscover_tasks(['common'])

app.conf.update(
    worker_prefetch_multiplier=0,
    event_queue_ttl=0,
    task_acks_late=True,
)

My Django DB settings:

'DATABASES': {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': <...>,
        'USER': <...>,
        'PASSWORD': <...>,
        'HOST': <...>,
        'PORT': 5480,
    }
}

How I start my deployed Django server:

gunicorn --config gunicorn.config my_proj.wsgi:application

gunicorn config

bind = '0.0.0.0:8201'
workers = 3
worker_class = 'gthread'
threads = 3
limit_request_line = 0
timeout = 1800

How I start my celery worker:

celery -A my_proj worker -l info

I read in the Django docs that if unspecified, the CONN_MAX_AGE setting defaults to 0 (close connections at the end of each request), and from my understanding the Celery worker should pick this up as well.
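For reference, CONN_MAX_AGE can also be set explicitly in the DATABASES setting. The sketch below uses placeholder credentials (not the asker's real values); only ENGINE, PORT, and CONN_MAX_AGE reflect the settings shown above:

```python
# Sketch: stating CONN_MAX_AGE explicitly in Django settings.
# 0 (the default) closes each connection when the request ends;
# a positive number keeps connections alive that many seconds;
# None keeps them open indefinitely.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydb',        # placeholder
        'USER': 'myuser',      # placeholder
        'PASSWORD': 'secret',  # placeholder
        'HOST': 'localhost',   # placeholder
        'PORT': 5480,
        'CONN_MAX_AGE': 0,     # close connections after each request
    }
}
```

Note that CONN_MAX_AGE is applied per request by Django's request handling; a Celery worker runs outside the request cycle, so stale-connection cleanup may need to be triggered separately there.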

1 answer:

Answer 0 (score: 0)

Perhaps setting up some kind of connection pool, or deliberately opening and closing connections, might help. Have a look through the discussion at https://code.i-harness.com/en/q/2263d77, which covers DB connection pooling and creating/closing connections for Celery tasks. I haven't tried it myself yet.
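The "deliberately opening and closing connections" idea can be sketched with Celery's task signals plus Django's `close_old_connections` helper, which closes connections that are unusable or past CONN_MAX_AGE. This is a hypothetical wiring (not from the answer above) that would go in the project's Celery module; it is a configuration sketch, assuming Django and Celery are installed:

```python
# Hypothetical sketch: close stale/broken Django DB connections
# before and after every Celery task, mimicking what Django's
# request cycle does for web requests.
from celery.signals import task_prerun, task_postrun
from django.db import close_old_connections

@task_prerun.connect
def close_stale_connections_before(**kwargs):
    # Drop connections that errored out or exceeded CONN_MAX_AGE
    # so the task starts with a usable connection.
    close_old_connections()

@task_postrun.connect
def close_stale_connections_after(**kwargs):
    # Clean up again once the task finishes, so idle worker
    # processes do not hold connection slots open.
    close_old_connections()
```

With CONN_MAX_AGE at its default of 0, the postrun handler effectively closes the task's connection outright, which trades connection reuse for a bounded connection count.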