RUN pip install -r requirements.txt does not install requirements in the Docker container

Asked: 2019-08-17 20:44:30

Tags: django docker pip docker-compose dockerfile

I'm new to Django, Docker, and Scrapy, and I'm trying to run a Django app that also uses Scrapy (I basically created a Django app that is also a Scrapy app, and I try to call the spider from a Django view). Even though scrapy is listed in requirements.txt and pip is run from the Dockerfile, the dependency is not installed in the container by the time python manage.py runserver 0.0.0.0:8000 runs, and the Django app fails during the system check, causing the web container to exit with the following exception:

web_1  | Exception in thread django-main-thread:
web_1  | Traceback (most recent call last):
web_1  |   File "/usr/local/lib/python3.7/threading.py", line 926, in _bootstrap_inner
web_1  |     self.run()
web_1  |   File "/usr/local/lib/python3.7/threading.py", line 870, in run
web_1  |     self._target(*self._args, **self._kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/autoreload.py", line 54, in wrapper
web_1  |     fn(*args, **kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/commands/runserver.py", line 117, in inner_run
web_1  |     self.check(display_num_errors=True)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 390, in check
web_1  |     include_deployment_checks=include_deployment_checks,
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 377, in _run_checks
web_1  |     return checks.run_checks(**kwargs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/registry.py", line 72, in run_checks
web_1  |     new_errors = check(app_configs=app_configs)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 40, in check_url_namespaces_unique
web_1  |     all_namespaces = _load_all_namespaces(resolver)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 57, in _load_all_namespaces
web_1  |     url_patterns = getattr(resolver, 'url_patterns', [])
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1  |     res = instance.__dict__[self.name] = self.func(instance)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 579, in url_patterns
web_1  |     patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1  |     res = instance.__dict__[self.name] = self.func(instance)
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 572, in urlconf_module
web_1  |     return import_module(self.urlconf_name)
web_1  |   File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1  |     return _bootstrap._gcd_import(name[level:], package, level)
web_1  |   File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1  |   File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1  |   File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1  |   File "/code/composeexample/urls.py", line 21, in <module>
web_1  |     path('scrapy/', include('scrapy_app.urls')),
web_1  |   File "/usr/local/lib/python3.7/site-packages/django/urls/conf.py", line 34, in include
web_1  |     urlconf_module = import_module(urlconf_module)
web_1  |   File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1  |     return _bootstrap._gcd_import(name[level:], package, level)
web_1  |   File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1  |   File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1  |   File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1  |   File "/code/scrapy_app/urls.py", line 4, in <module>
web_1  |     from scrapy_app import views
web_1  |   File "/code/scrapy_app/views.py", line 1, in <module>
web_1  |     from scrapy.crawler import CrawlerProcess
web_1  | ModuleNotFoundError: No module named 'scrapy'

I've tried using pip3 instead of pip, pip install --no-cache-dir -r requirements.txt, changing the order of the statements in the Dockerfile, and I've also checked that Scrapy==1.7.3 appears in requirements.txt. Nothing seems to work.
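One way to narrow this down (a diagnostic sketch, not from the original question; the image tag myapp is made up for illustration) is to build the image directly and ask pip inside a throwaway container whether scrapy actually made it into the image:

# Build the image from the Dockerfile in the current directory
docker build -t myapp .
# Print scrapy's install metadata, or a "not found" warning if it was never installed
docker run --rm myapp pip show scrapy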

Here is my Dockerfile:

FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/

Here is my docker-compose.yml:

version: '3'

services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
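For what it's worth, the same check can be run through Compose against the web service defined above (again just a sketch; docker-compose run starts a one-off container from the service's image):

docker-compose run --rm web pip show scrapy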

2 Answers:

Answer 0 (score: 1):

It seems like you are missing scrapy in your requirements.txt!

I tried to build a minimal version with all the components. I hope it helps.

test.py

import scrapy
from time import sleep


def main():
    while True:
        print(scrapy)
        sleep(1)


if __name__ == "__main__":
    main()

requirements.txt

Scrapy==1.7.3

Dockerfile

FROM python:3

ENV PYTHONUNBUFFERED 1

WORKDIR /code

COPY requirements.txt .
RUN pip3 install -r requirements.txt

COPY . ./

CMD [ "python3", "test.py" ]

docker-compose.yml

version: '3'

services:
  db:
    image: postgres
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
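To try this minimal version, build and start it with (a standard Compose invocation, not part of the original answer; --build forces the image to be rebuilt before the containers start):

docker-compose up --build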

Answer 1 (score: 0):

A bit late, but I ran into this problem and eventually figured it out (and I'm putting it here for anyone else with the same problem).

The first time I tried to build the docker image, my requirements.txt did not include the required module. Of course, I added the required module, but nothing seemed to happen. That's because the container has to be rebuilt from scratch; otherwise we just keep building the same version over and over again.

To rebuild the container with the updated requirements.txt, run:

docker-compose build
docker-compose up

If that doesn't work, try the same thing but replace the last line with:

docker-compose rm -f
docker-compose pull
docker-compose up

I got this from this answer.
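If Docker's layer cache is still reusing the stale pip install layer, forcing a cache-free rebuild also works (a standard Compose flag, not mentioned in the original answer):

docker-compose build --no-cache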