ImportError: No module named cqlengine, but it works from the Python command line

Date: 2017-05-31 10:27:35

Tags: python cassandra scrapy

I am new to Python. I have a Scrapy project, and I am using a conda virtual environment. I wrote a pipeline class like this:

from cassandra.cqlengine import connection
from cassandra.cqlengine.management import sync_table, create_keyspace_network_topology
from recentnews.cassandra.model.NewsPaperDataModel import NewspaperDataModel

from recentnews.common.Constants import DEFAULT_KEYSPACE


class RecentNewsPipeline(object):
    def __init__(self):
        connection.setup(["192.168.99.100"], DEFAULT_KEYSPACE, protocol_version=3, port=9042)
        create_keyspace_network_topology(DEFAULT_KEYSPACE, {'DC1': 2})
        sync_table(NewspaperDataModel)

    def process_item(self, item, spider):
        NewspaperDataModel.create(
            title=item.title,
            url=item.url,
            domain=item.domain
        )
        return item

When I run the spider with scrapy crawl author, it gives me this error:

(news) (C:\Miniconda2\envs\news) E:\Shoshi\Python Projects\recentnews-scrapy\recentnews>scrapy crawl author
2017-05-31 15:56:29 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: recentnews)
2017-05-31 15:56:29 [scrapy.utils.log] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'recentnews.spiders', 'SPIDER_MODULES': ['recentnews.spiders'], 'ROBOTSTXT_OBEY': True, 'BOT_NAME': 'recentnews'}
2017-05-31 15:56:29 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
2017-05-31 15:56:30 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-05-31 15:56:30 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
Unhandled error in Deferred:
2017-05-31 15:56:30 [twisted] CRITICAL: Unhandled error in Deferred:

2017-05-31 15:56:30 [twisted] CRITICAL:
Traceback (most recent call last):
  File "C:\Miniconda2\envs\news\lib\site-packages\twisted\internet\defer.py", line 1301, in _inlineCallbacks
    result = g.send(result)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\crawler.py", line 95, in crawl
    six.reraise(*exc_info)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\crawler.py", line 77, in crawl
    self.engine = self._create_engine()
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\crawler.py", line 102, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\core\engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\core\scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "C:\Miniconda2\envs\news\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "C:\Miniconda2\envs\news\lib\importlib\__init__.py", line 37, in import_module
    __import__(name)
  File "E:\Shoshi\Python Projects\recentnews-scrapy\recentnews\recentnews\pipelines.py", line 7, in <module>
    from cassandra.cqlengine import connection
ImportError: No module named cqlengine

I am using a conda virtual environment.

However, when I run the same code from the Python command line, it works fine, with no errors:

(news) (C:\Miniconda2\envs\news) E:\Shoshi\Python Projects\recentnews-scrapy\recentnews>python
Python 2.7.13 |Continuum Analytics, Inc.| (default, May 11 2017, 13:17:26) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
>>> from cassandra.cqlengine import connection
>>> from cassandra.cqlengine.management import sync_table, create_keyspace_network_topology
>>> from recentnews.cassandra.model.NewsPaperDataModel import NewspaperDataModel
>>> from recentnews.common.Constants import DEFAULT_KEYSPACE
>>> connection.setup(["192.168.99.100"], DEFAULT_KEYSPACE, protocol_version=3, port=9042)
>>> create_keyspace_network_topology(DEFAULT_KEYSPACE, {'DC1': 2})
C:\Miniconda2\envs\news\lib\site-packages\cassandra\cqlengine\management.py:545: UserWarning: CQLENG_ALLOW_SCHEMA_MANAGEMENT environment variable is not set. Future versions of this package will require this variable to enable management functions.
  warnings.warn(msg)
>>> sync_table(NewspaperDataModel)
......

As you can see, from cassandra.cqlengine import connection is imported perfectly. What am I missing? Why does this code fail when I run it with scrapy crawl author?

2 Answers:

Answer 0 (score: 2):

It turned out that the OP's scrapy project contained a folder named recentnews/cassandra/ (namespace recentnews.cassandra).
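Reconstructing from the imports in the question, the project layout presumably looked something like this (only recentnews/cassandra/ is confirmed by the answer; the other entries are inferred from the question's import statements):

recentnews/
    __init__.py
    pipelines.py
    common/
        Constants.py
    cassandra/                  <- shadows the installed cassandra driver package
        __init__.py
        model/
            NewsPaperDataModel.py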

When scrapy imported the project's pipeline class recentnews.pipelines.RecentNewsPipeline, the interpretation of from cassandra.cqlengine import connection (at the top of recentnews/pipelines.py) by importlib found the local recentnews.cassandra module before the cassandra package installed in the virtualenv.
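Since the REPL banner shows Python 2.7, implicit relative imports are the likely mechanism: inside the recentnews package, import cassandra resolves to the sibling recentnews/cassandra/ folder first. A minimal sketch of the usual workaround, assuming you keep the folder name, is to force absolute imports at the top of recentnews/pipelines.py:

# recentnews/pipelines.py (sketch, assuming Python 2.7 implicit relative imports)
# Make "import cassandra" resolve to the installed driver rather than
# the project's own recentnews/cassandra/ package.
from __future__ import absolute_import

from cassandra.cqlengine import connection
from cassandra.cqlengine.management import sync_table, create_keyspace_network_topology

Renaming the recentnews/cassandra/ folder (for example to recentnews/cassandra_store/) avoids the collision entirely and also works on Python 3, where absolute imports are the default.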

One way to check which module is being imported is to add import cassandra; print(cassandra.__file__) before the failing import statement.
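For example, a quick diagnostic sketch at the top of recentnews/pipelines.py:

# Diagnostic sketch: print the path of whichever "cassandra" module Python resolves.
# A path inside the scrapy project (...\recentnews\cassandra\__init__.py) instead of
# ...\site-packages\cassandra\... means the local package is shadowing the driver.
import cassandra
print(cassandra.__file__)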

Answer 1 (score: 0):

When you create a virtual environment, user-installed packages are not copied into it by default. So you have to run pip install cassandra-driver (or whatever the relevant package is called) inside the virtual environment. That might solve the problem.
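A minimal console sketch, assuming the conda environment is named news as in the question, and noting that the PyPI name of the driver that provides cassandra.cqlengine is cassandra-driver, not cassandra:

REM Activate the environment, then install and verify the driver inside it
activate news
pip install cassandra-driver
python -c "import cassandra; print(cassandra.__file__)"

Note, though, that in the question above the import already succeeds in the REPL of the same environment, so the package was installed; the name shadowing described in the accepted answer is the more likely cause.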