I'm trying to run a simple Python script that takes a few arguments, like this:
python2.7 script.py 'a_file_path' some_file_name 'another_file_path'
I run the line above from a Celery task using subprocess.call:
import os
import subprocess

from celery import shared_task
from django.conf import settings

# UploadedVideo (model) and filesystem (mkdir_p helper) come from the project's own modules.

BASE_DIR = getattr(settings, "BASE_DIR")
OUTPUT_PATH = os.path.abspath(os.path.join(BASE_DIR, "output"))
PYTHON_2_BIN = getattr(settings, "PYTHON_2_BIN")
MAKESPRITES_SCRIPT = getattr(settings, "MAKESPRITES_SCRIPT")


@shared_task
def process_video(*args, **kwargs):
    if kwargs.get('video_id') is None:
        return "No video id provided"
    vid = kwargs.get('video_id')
    video = UploadedVideo.objects.filter(id=vid).first()
    if not isinstance(video, UploadedVideo):
        return "Video id does not exist"
    video_output_dir = os.path.join(OUTPUT_PATH, str(video.id))
    filesystem.mkdir_p(video_output_dir)
    command = [
        PYTHON_2_BIN,
        MAKESPRITES_SCRIPT,
        os.path.join(BASE_DIR, 'uploaded_files', video.file_name),
        video.file_name,
        video_output_dir,
    ]
    # Discard the external script's output and wait for it to finish.
    with open(os.devnull, 'w') as FNULL:
        subprocess.call(command, stdout=FNULL, stderr=subprocess.STDOUT)
    video.state = 2
    video.save()
    return "thumbnails have been generated"
If I start Celery manually from the command line, every job works fine, but when I start it with supervisord (or init.d), the script does not run correctly and terminates with non-zero exit status 1.
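Since the call above sends both stdout and stderr to os.devnull, whatever the script prints before it dies is lost. A minimal sketch of capturing that output into the Celery worker log instead (run_and_log and the logger name are only illustrative, not part of my project):

import logging
import subprocess

logger = logging.getLogger(__name__)

def run_and_log(command):
    # Run `command`, capture its combined stdout/stderr, and log it if the child fails.
    try:
        output = subprocess.check_output(command, stderr=subprocess.STDOUT)
        logger.debug("makesprites output: %r", output)
        return 0
    except subprocess.CalledProcessError as exc:
        # exc.output holds whatever the child printed before exiting non-zero.
        logger.error("makesprites exited with %s: %r", exc.returncode, exc.output)
        return exc.returncode

Inside process_video the subprocess.call(command, ...) line would then become returncode = run_and_log(command), so the exit status 1 and the script's own error message end up in worker.log / worker.err.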
My supervisord config:
[program:celery]
command=/home/develop/video-hosting/env/bin/python3.4 /home/develop/video-hosting/env/bin/celery worker -A VideoHost --loglevel=DEBUG
directory=/home/develop/video-hosting
user=develop
numprocs=1
stdout_logfile=/home/develop/logs/worker.log
stderr_logfile=/home/develop/logs/worker.err
autostart=true
autorestart=true
startsecs=3
stopwaitsecs=600
killasgroup=true
priority=1000
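Supervisord starts the worker with a minimal environment rather than my login shell, so variables such as PATH and HOME can differ from a manual run. If the external script relies on any of them (an assumption on my part), they could be pinned inside the same [program:celery] section, for example:

; example values only, added to the existing [program:celery] section
environment=PATH="/usr/local/bin:/usr/bin:/bin",HOME="/home/develop"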
Edit 1: The Celery task itself is absolutely fine; it runs in both cases (manually and under supervisord).
Edit 2: this is the script I'm trying to run.
Edit 3: Even the external script itself runs in both cases, but when Celery runs under supervisord, the jobs never finish.