Scrapy-redis cannot connect to the Redis server running in Docker

Time: 2019-02-10 13:08:57

Tags: docker redis scrapy

I am running Redis as a Docker container:

docker run  --name redis_env --hostname redis \
            -p 6379:6379 \
            -v $PWD/DBVOL/redis/data:/data:rw \
            --privileged=true \
            -d redis redis-server
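
A quick host-side sanity check of the published port (just a sketch using the standard library; the host and port are the ones from the command above):

# Sanity check: confirm the port published by `docker run -p 6379:6379`
# is reachable from the host. Standard library only.
import socket

with socket.create_connection(("127.0.0.1", 6379), timeout=2):
    print("port 6379 is reachable from the host")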

I can connect to my Redis server through Redis Desktop Manager.

(screenshot: Redis Desktop Manager connected to the server)

However, when scrapy-redis tries to connect to the server, the connection is refused with error code 111. Here is my settings.py:

# Set up Redis Server
REDIS_URL = 'redis://127.0.0.1:6379/0'

# Enable the scrapy-redis scheduler, which stores the request queue in Redis
SCHEDULER = 'scrapy_redis.scheduler.Scheduler'

# Ensure all spiders share the same duplicate filter through Redis
DUPEFILTER_CLASS = 'scrapy_redis.dupefilter.RFPDupeFilter'

# Don't clean up Redis queues, so crawls can be paused and resumed
SCHEDULER_PERSIST = True

# Configure item pipelines
# See https://doc.scrapy.org/en/latest/topics/item-pipeline.html

ITEM_PIPELINES = {
    # 'GeoCrawler.pipelines.GeocrawlerPipeline': 300,
    'scrapy_redis.pipelines.RedisPipeline': 300,
}
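
For reference, the spider consuming these settings is a scrapy-redis `RedisSpider` along the lines of the sketch below (the class name and Redis key are placeholders, not my actual project code):

# Placeholder sketch of a scrapy-redis spider that reads start URLs from a
# Redis list; names are illustrative only.
from scrapy_redis.spiders import RedisSpider

class DemoSpider(RedisSpider):
    name = 'demo'
    redis_key = 'demo:start_urls'   # LPUSH URLs to this key to feed the spider

    def parse(self, response):
        # Items yielded here go through scrapy_redis.pipelines.RedisPipeline
        yield {'url': response.url, 'title': response.css('title::text').get()}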

Here is the debug message:

(screenshot: Scrapy debug log showing the connection refused error, code 111)

What is going wrong?

By the way, I can connect successfully from the Python console:

>>> import redis
>>> redis.Connection("redis://localhost:6379/0")
Connection<host=redis://localhost:6379/0,port=6379,db=0>
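
Note that constructing `redis.Connection()` does not actually open a socket (and the repr above shows the whole URL being treated as the host name), so a stricter round-trip check, assuming redis-py is installed, would be:

# Round-trip check: ping() only returns True if a TCP connection to
# port 6379 actually succeeds.
import redis

r = redis.from_url('redis://localhost:6379/0')
print(r.ping())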

0 Answers:

No answers yet.