不保存数据到postgres

时间:2018-07-07 20:51:52

标签: python-3.x postgresql scrapy

尝试学习一些Web抓取的基本知识时会遇到很多问题,但是我无法将结果保存到我的postgres数据库中。连接有效并且创建了表,但是尽管正在抓取数据,但没有输入足够的行。我将结果保存到JSON文件中,但信息已保存到数据库中,并且未引发任何错误。

piplines.py

from sqlalchemy.orm import sessionmaker
from stack.models import Questions, db_connect, create_questions


class StackPipeline(object):
    def __init__(self):
        engine = db_connect()
        create_questions(engine)
        self.Session = sessionmaker(bind=engine)

    def proccess_item(self, item, spider):
    session = self.Session()
    ques = Questions(**item)

    try:
        session.add(ques)
        session.commit
    except:
        session.rollback()
        raise
    finally:
        session.close()

    return item

models.py

import logging
from sqlalchemy import create_engine, Column, Integer, String, DateTime
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.engine.url import URL
from .settings import DATABASE

DeclarativeBase = declarative_base()


def db_connect():
    db_url = URL(**DATABASE)
logging.info("Creating an SQLAlchemy engine at URL 
'{db_url}'".format(db_url=db_url))
return create_engine(db_url)


def create_questions(engine):
    DeclarativeBase.metadata.create_all(engine)


class Questions(DeclarativeBase):
    __tablename__ = "questions"

    id = Column(Integer, primary_key=True)
    title = Column('title', String)
    url = Column('url', String)

0 个答案:

没有答案