Flask create_app and Celery worker dying when executing tasks

Asked: 2019-10-14 23:40:20

Tags: flask rabbitmq celery

I have been struggling to get Celery working in a Flask project. The Flask project uses the application factory pattern, which has caused application-context and circular-import problems while trying to get tasks working.

I found this answer for the setup and got Celery running: it can see and register my tasks, the tasks can be called and show up in the message queue, and with a lot of work I can even get them to record failures (so far, only revocations) in the Redis result backend.

The tasks themselves run fine when called directly and don't fail. I can send a task bad data and get the expected error codes back.

The Celery app is configured with RabbitMQ as the broker and Redis as the backend. Both appear to work: I can log in to RabbitMQ and watch messages enter the queue, see the worker connect to the queue, and see some results eventually make it into Redis.
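That setup matches what the worker banner below prints. As a reference point, the equivalent Celery 4 settings would look roughly like this (a sketch; the password is elided as `**` exactly as the banner elides it):

```python
# celeryconfig.py -- illustrative sketch matching the worker banner below,
# using Celery 4's lowercase setting names. Not the question's actual file.
broker_url = "amqp://janus:**@localhost:5672/janus"  # RabbitMQ broker
result_backend = "redis://localhost:6379/0"          # Redis result backend
worker_concurrency = 1                               # "concurrency: 1 (prefork)"
worker_send_task_events = True                       # "task events: ON"
```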

I'm not sure what code to include that would help solve this. I do have some logging output that I think shows the problem, but I don't know how to troubleshoot it further. From the worker's debug log:

 -------------- celery@sy-devsophia v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Linux-3.10.0-1062.el7.x86_64-x86_64-with-redhat-7.6-Maipo         2019-10-14 18:13:20
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         janus:0x7f0f2b715a58
- ** ---------- .> transport:   amqp://janus:**@localhost:5672/janus
- ** ---------- .> results:     redis://localhost:6379/0
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
[tasks]
  . celery.accumulate
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . janus.workers.imports.course_import
  . janus.workers.reports.run_published_report

[2019-10-14 18:13:20,356: DEBUG/MainProcess] | Worker: Starting Hub
[2019-10-14 18:13:20,356: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,356: DEBUG/MainProcess] | Worker: Starting Pool
[2019-10-14 18:13:20,442: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,443: DEBUG/MainProcess] | Worker: Starting Consumer
[2019-10-14 18:13:20,444: DEBUG/MainProcess] | Consumer: Starting Connection
[2019-10-14 18:13:20,501: INFO/MainProcess] Connected to amqp://janus:**@localhost:5672/janus
[2019-10-14 18:13:20,501: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,501: DEBUG/MainProcess] | Consumer: Starting Events
[2019-10-14 18:13:20,547: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,547: DEBUG/MainProcess] | Consumer: Starting Mingle
[2019-10-14 18:13:20,547: INFO/MainProcess] mingle: searching for neighbors
[2019-10-14 18:13:21,608: INFO/MainProcess] mingle: all alone
[2019-10-14 18:13:21,608: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,608: DEBUG/MainProcess] | Consumer: Starting Tasks
[2019-10-14 18:13:21,615: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,615: DEBUG/MainProcess] | Consumer: Starting Control
[2019-10-14 18:13:21,619: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,619: DEBUG/MainProcess] | Consumer: Starting Gossip
[2019-10-14 18:13:21,624: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,624: DEBUG/MainProcess] | Consumer: Starting Heart
[2019-10-14 18:13:21,626: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,626: DEBUG/MainProcess] | Consumer: Starting event loop
[2019-10-14 18:13:21,626: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2019-10-14 18:13:21,629: INFO/MainProcess] celery@sy-devsophia ready.

[2019-10-14 18:13:51,174: INFO/MainProcess] Received task: janus.workers.reports.run_published_report[fba8f1e0-be99-4800-a9df-0f564383647a]
[2019-10-14 18:13:51,175: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x7f0f28197840> (args:('janus.workers.reports.run_published_report', 'fba8f1e0-be99-4800-a9df-0f564383647a', {'lang': 'py', 'task': 'janus.workers.reports.run_published_report', 'id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'parent_id': None, 'argsrepr': "('6201',)", 'kwargsrepr': '{}', 'origin': 'gen13297@sy-devsophia', 'reply_to': '9cd089a7-a28c-35a8-ab34-10440a35f5e2', 'correlation_id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': False}}, <memory at 0x7f0f2333f1c8>, 'application/json', 'utf-8') kwargs:{})
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Hub...
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Pool...
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Consumer...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Worker: Stopping Consumer...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Connection...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Events...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Mingle...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Tasks...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Control...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Gossip...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Heart...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Closing event loop...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping event loop...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping Heart...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping Gossip...
[2019-10-14 18:13:51,186: DEBUG/MainProcess] | Consumer: Stopping Control...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] | Consumer: Stopping Tasks...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] Canceling task consumer...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] | Consumer: Stopping Mingle...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Consumer: Stopping Events...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Consumer: Stopping Connection...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Worker: Stopping Pool...
[2019-10-14 18:13:52,212: DEBUG/MainProcess] result handler: all workers terminated
[2019-10-14 18:13:52,212: DEBUG/MainProcess] | Worker: Stopping Hub...
[2019-10-14 18:13:52,212: DEBUG/MainProcess] | Consumer: Shutdown Heart...

And then in the RabbitMQ log:

=INFO REPORT==== 14-Oct-2019::18:21:05 ===
closing AMQP connection <0.15130.5> ([::1]:57688 -> [::1]:5672)
=WARNING REPORT==== 14-Oct-2019::18:21:05 ===
closing AMQP connection <0.15146.5> ([::1]:57690 -> [::1]:5672):
connection_closed_abruptly

Since I can run the task directly without errors, messages that appear in the queue do reach the worker process, and revoking a task does get reported in the Redis backend, I can't see where the messaging is breaking down. Beyond RabbitMQ reporting that the client closed the connection, I get nothing that explains why the worker goes haywire and dies when it picks up a task.

I think there is a problem with my Celery setup, but the answer linked above is the closest I have come to getting Celery working with this application.

Any help pointing me to where the problem might be would be appreciated. If there is any configuration or code that would be useful to see, I can share it; with no error messages pointing at the code, I'm just not sure what would be relevant.

1 Answer:

Answer 0 (score: 0)

It seems I have found where the mistake was. At least my Celery setup is working now.

I was using librabbitmq as the AMQP transport library and changed to pyamqp. Once I changed the transport library, it started working.
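For anyone hitting the same thing: Kombu prefers the librabbitmq C transport when it is installed and the broker URL uses the plain `amqp://` scheme. One way to pin the pure-Python transport is in the broker URL itself (a sketch; the URL mirrors the banner above with the password elided):

```python
# Force the pure-Python pyamqp transport even if librabbitmq is installed,
# by using the explicit "pyamqp://" scheme instead of "amqp://".
broker_url = "pyamqp://janus:**@localhost:5672/janus"

# Alternatively, remove the C library entirely:
#   pip uninstall librabbitmq
```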