Why do I get the following error when running the Django development server, and how can I fix it?
EOFError: marshal data too short
Performing system checks...
Unhandled exception in thread started by <function check_errors.<locals>.wrapper at 0x7f6e5dbacea0>
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/django/utils/autoreload.py", line 227, in wrapper
fn(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/django/core/management/commands/runserver.py", line 125, in inner_run
self.check(display_num_errors=True)
File "/usr/local/lib/python3.5/dist-packages/django/core/management/base.py", line 359, in check
include_deployment_checks=include_deployment_checks,
File "/usr/local/lib/python3.5/dist-packages/django/core/management/base.py", line 346, in _run_checks
return checks.run_checks(**kwargs)
File "/usr/local/lib/python3.5/dist-packages/django/core/checks/registry.py", line 81, in run_checks
new_errors = check(app_configs=app_configs)
File "/usr/local/lib/python3.5/dist-packages/django/core/checks/urls.py", line 16, in check_url_config
return check_resolver(resolver)
File "/usr/local/lib/python3.5/dist-packages/django/core/checks/urls.py", line 26, in check_resolver
return check_method()
File "/usr/local/lib/python3.5/dist-packages/django/urls/resolvers.py", line 254, in check
for pattern in self.url_patterns:
File "/usr/local/lib/python3.5/dist-packages/django/utils/functional.py", line 35, in __get__
res = instance.__dict__[self.name] = self.func(instance)
File "/usr/local/lib/python3.5/dist-packages/django/urls/resolvers.py", line 405, in url_patterns
patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
File "/usr/local/lib/python3.5/dist-packages/django/utils/functional.py", line 35, in __get__
res = instance.__dict__[self.name] = self.func(instance)
File "/usr/local/lib/python3.5/dist-packages/django/urls/resolvers.py", line 398, in urlconf_module
return import_module(self.urlconf_name)
File "/usr/lib/python3.5/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 986, in _gcd_import
File "<frozen importlib._bootstrap>", line 969, in _find_and_load
File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 665, in exec_module
File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
File "/home/aphya1/work/1-erp-project/ERP/mis/mis/urls.py", line 14, in <module>
url(r'^hr/', include('hr.urls', namespace='hr')),
File "/usr/local/lib/python3.5/dist-packages/django/conf/urls/__init__.py", line 50, in include
urlconf_module = import_module(urlconf_module)
File "/usr/lib/python3.5/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 986, in _gcd_import
File "<frozen importlib._bootstrap>", line 969, in _find_and_load
File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 665, in exec_module
File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
File "/home/aphya1/work/1-erp-project/ERP/mis/hr/urls.py", line 2, in <module>
from hr import views
File "<frozen importlib._bootstrap>", line 969, in _find_and_load
File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 661, in exec_module
File "<frozen importlib._bootstrap_external>", line 765, in get_code
File "<frozen importlib._bootstrap_external>", line 476, in _compile_bytecode
EOFError: marshal data too short
Answer (score: 1):
I also ran into this error:
EOFError: marshal data too short.
It was raised even though I had not changed any of the existing files.
Some *.pyc files are created automatically: they hold the compiled bytecode of your scripts and are generated on the fly when a Python script runs, so that later startups are faster.
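For context, a truncated or partially written .pyc file is enough to trigger this exact kind of error, because the bytecode payload after the header can no longer be unmarshalled. A rough, self-contained sketch (the file name hello.py is made up for illustration, and the header offset depends on the Python version):

import marshal
import py_compile

# create a tiny module and byte-compile it into __pycache__
with open("hello.py", "w") as src:
    src.write("print('hi')\n")
cached = py_compile.compile("hello.py")   # returns the path of the .pyc it wrote

# chop the cache file in half to mimic a corrupted / partially written .pyc
with open(cached, "r+b") as f:
    data = f.read()
    f.seek(0)
    f.truncate()
    f.write(data[: len(data) // 2])

# read the bytecode back, roughly the way the importer does
with open(cached, "rb") as f:
    f.seek(12)          # skip the .pyc header (12 bytes on Python 3.5/3.6, 16 on 3.7+)
    marshal.load(f)     # typically raises EOFError because the payload is incomplete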
I was working with Python inside an Oracle VM VirtualBox machine and was trying to install pyspark.
The spark folder is laid out like this:
spark****/python/pyspark
This folder contains a directory named "__pycache__". I went into the pyspark folder and removed the "__pycache__" directory with the following command:
rm -r __pycache__/
That got rid of the error for me, and I can now import pyspark.
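The same idea should apply to the Django project in the question: clearing out the stale bytecode caches forces Python to recompile everything on the next run. A small sketch that walks a project tree and removes every __pycache__ folder (project_root below is just a placeholder; point it at your own project directory):

import pathlib
import shutil

project_root = pathlib.Path(".")          # placeholder: set this to your project directory
for cache_dir in project_root.rglob("__pycache__"):
    if cache_dir.is_dir():
        shutil.rmtree(cache_dir)          # delete the whole bytecode cache directory
        print("removed", cache_dir)

Python recreates the caches automatically the next time the modules are imported.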
Hope this helps.