Docker-compose connection refused from Celery

Date: 2016-11-28 13:39:50

Tags: django docker celery docker-compose

I am running docker-compose to bring up Django, Celery, Postgres and RabbitMQ together, using the following docker-compose.yml:

version: '2'

services:
  # PostgreSQL database
  db:
    image: postgres:9.4
    hostname: db
    environment:
      - POSTGRES_USER=<XXX>
      - POSTGRES_PASSWORD=<XXX>
      - POSTGRES_DB=<XXX>
    ports:
      - "5431:5432"

  rabbit:
    hostname: rabbit
    image: rabbitmq:3-management
    environment:
      - RABBITMQ_DEFAULT_USER=<XXX>
      - RABBITMQ_DEFAULT_PASS=<XXX>
    ports:
      - "5672:5672" 
      - "15672:15672"

  # Django web server
  web:
    build:
      context: .
      dockerfile: Dockerfile
    hostname: web
    command: /srv/www/run_web.sh
    volumes:
      - .:/srv/www
    ports:
      - "8000:8000"
    links:
      - db
      - rabbit
    depends_on:
      - db

  # Celery worker
  worker:
    hostname: celery
    build:
      context: .
      dockerfile: Dockerfile
    command: /srv/www/run_celery.sh
    volumes:
      - .:/srv/www
    links:
      - db
      - rabbit
    depends_on:
      - rabbit

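As an aside on how this file wires things together: containers on the same Compose network reach each other by service name (db, rabbit), not via localhost. A throwaway probe like the following (hypothetical, not part of the project) illustrates that, when run from inside the web or worker container:

# Hypothetical connectivity probe, run from inside the web or worker container.
# The hostnames 'db' and 'rabbit' are the service names from docker-compose.yml.
import socket

for host, port in [('db', 5432), ('rabbit', 5672)]:
    with socket.create_connection((host, port), timeout=5) as sock:
        print('reached %s:%d' % (host, port))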
In one of the Django views I delegate some processing to a Celery task, which then tries to POST the result to another web service:

#views.py
@csrf_exempt
def process_data(request):
    if request.method == 'POST':

        #
        #Processing to retrieve data here
        #

        delegate_celery_task.delay(data)
    return HttpResponse(status=200)

#tasks.py
@app.task
def delegate_celery_task(in_data):
    from extractorService.settings import MASTER_NODE
    import json
    import urllib.request  # bare 'import urllib' does not expose urllib.request in Python 3

    #
    #Some processing on in_data here to give out_data
    # 

    data = {'data': out_data}
    params = json.dumps(data).encode('utf8')

    req = urllib.request.Request('http://%s/api/data/'%(MASTER_NODE), data=params,
              headers={'content-type': 'application/json'})

    urllib.request.urlopen(req)

At the moment MASTER_NODE is just localhost:8001, where I am running the other web service. When I run everything outside Docker, the setup works. When starting it with Docker, though, the worker process prints:

worker_1 | [2016-11-28 12:20:17,527: WARNING/PoolWorker-2] unable to cache TLDs in file /usr/local/lib/python3.5/site-packages/tldextract/.tld_set: [Errno 13] Permission denied: '/usr/local/lib/python3.5/site-packages/tldextract/.tld_set'

Then, on posting to the Django view, the Celery task starts but fails on the urlopen call:

worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 368, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 623, in __protected_call__
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/srv/extractor_django/extractorService/tasks.py", line 25, in extract_entities
worker_1 |     urllib.request.urlopen(req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 162, in urlopen
worker_1 |     return opener.open(url, data, timeout)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 465, in open
worker_1 |     response = self._open(req, data)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 483, in _open
worker_1 |     '_open', req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 443, in _call_chain
worker_1 |     result = func(*args)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1268, in http_open
worker_1 |     return self.do_open(http.client.HTTPConnection, req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1242, in do_open
worker_1 |     raise URLError(err)
worker_1 | urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
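For what it's worth, the failing call boils down to the pattern below. Note that inside the worker container, 'localhost' resolves to the container's own loopback interface, not to the Docker host (a standalone sketch; hard-coding MASTER_NODE here is an assumption for illustration):

# Reproduction sketch: from inside a container, localhost:8001 points at the
# container itself, where nothing is listening on port 8001.
import urllib.error
import urllib.request

MASTER_NODE = 'localhost:8001'  # value from the question, hard-coded for the sketch

try:
    urllib.request.urlopen('http://%s/api/data/' % MASTER_NODE, timeout=5)
except urllib.error.URLError as err:
    print('URLError, as in the worker log above:', err)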

The Celery configuration in settings.py is:

import os

from kombu import Exchange, Queue  # needed for CELERY_QUEUES below

RABBIT_HOSTNAME = os.environ.get('RABBIT_PORT_5672_TCP', 'rabbit')
if RABBIT_HOSTNAME.startswith('tcp://'):
    RABBIT_HOSTNAME = RABBIT_HOSTNAME.split('//')[1]

BROKER_URL = os.environ.get('BROKER_URL', '')
if not BROKER_URL:
    BROKER_URL = 'amqp://{user}:{password}@{hostname}'.format(
        user=os.environ.get('RABBIT_ENV_USER', '<XXX>'),
        password=os.environ.get('RABBIT_ENV_RABBITMQ_PASS', '<XXX>'),
        hostname=RABBIT_HOSTNAME)

BROKER_HEARTBEAT = '?heartbeat=30'
if not BROKER_URL.endswith(BROKER_HEARTBEAT):
    BROKER_URL += BROKER_HEARTBEAT

BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_TIMEOUT = 10

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
)

CELERY_ALWAYS_EAGER = False
CELERY_ACKS_LATE = True
CELERY_TASK_PUBLISH_RETRY = True
CELERY_DISABLE_RATE_LIMITS = False

CELERY_IGNORE_RESULT = True
CELERY_SEND_TASK_ERROR_EMAILS = False
CELERY_TASK_RESULT_EXPIRES = 600

CELERYD_HIJACK_ROOT_LOGGER = False
CELERYD_PREFETCH_MULTIPLIER = 1
CELERYD_MAX_TASKS_PER_CHILD = 1000
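For context, a minimal sketch of the Celery app module that would consume these settings under the old (pre-4) naming; the module path and app name are assumptions taken from the extractorService import in tasks.py, not from the question:

# celery.py -- hypothetical app module; 'extractorService' is assumed from the
# settings import in tasks.py above.
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'extractorService.settings')

app = Celery('extractorService')

# Without a namespace argument, this picks up the bare BROKER_URL / CELERY_*
# names used in the settings excerpt above.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()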

Does anyone have any ideas on how to solve this?

1 Answer:

Answer 0 (score: 0)

You didn't mention your Celery version, but judging by the post date I would guess it is v4.

I ran into a similar problem after updating Celery from v3.1 to v4. According to this tutorial, BROKER_URL needs to be renamed to CELERY_BROKER_URL in settings.py.
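A minimal sketch of what that rename looks like, assuming the app is configured with the CELERY settings namespace (the broker URL below is assembled from the question's values, not taken verbatim):

# settings.py (Celery 4 naming): with
#   app.config_from_object('django.conf:settings', namespace='CELERY')
# Celery 4 reads CELERY_BROKER_URL and ignores the old bare BROKER_URL.
CELERY_BROKER_URL = 'amqp://<XXX>:<XXX>@rabbit?heartbeat=30'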