I'm writing a SQLAlchemy import/export script using the serializer's dumps and loads. The export works, but I'm having trouble with the import, mostly because of foreign key issues. I'm using sorted_tables to get the list of tables sorted by dependency, which makes sure I don't hit cross-table foreign key problems, but is there something similar that handles internal foreign keys (a table pointing to itself)?
I'm basically considering two possible solutions:
but I'm not sure how to do either of them properly...
Example table:
class Employee(Base):
    __tablename__ = "t_employee"
    id = sa.Column(Identifier, sa.Sequence('%s_id_seq' % __tablename__), primary_key=True, nullable=False)
    first_name = sa.Column(sa.String(30))
    last_name = sa.Column(sa.String(30))
    manager_id = sa.Column(Identifier, sa.ForeignKey("t_employee.id", ondelete='SET NULL'))
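To make the ordering problem concrete (the rows below are made up purely for illustration): t_employee only references itself, so sorted_tables has nothing to order for it, and a dump that happens to list an employee before their manager fails on insert:

    # Hypothetical rows, for illustration only: the first row references a
    # manager (id=2) that has not been inserted yet, so a dependency-sorted
    # table list alone does not prevent the foreign key violation.
    rows = [
        {"id": 1, "first_name": "Alice", "last_name": "Smith", "manager_id": 2},
        {"id": 2, "first_name": "Bob", "last_name": "Jones", "manager_id": None},
    ]
    for row in rows:
        db.execute(Employee.__table__.insert(), row)  # fails on the first row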
Here is my script:
def export_db(tar_file):
    print "Exporting Database. This may take some time. Please wait ..."
    Base.metadata.create_all(engine)
    tables = Base.metadata.tables

    with tarfile.open(tar_file, "w:bz2") as tar:
        for tbl in tables:
            print "Exporting table %s ..." % tbl
            table_dump = dumps(engine.execute(tables[tbl].select()).fetchall())
            ti = tarfile.TarInfo(tbl)
            ti.size = len(table_dump)
            tar.addfile(ti, StringIO(table_dump))

    print "Database exported! Exiting!"
    exit(0)
def import_db(tar_file):
    print "Importing to Database. This may take some time. Please wait ..."
    print "Dropping all tables ..."
    Base.metadata.drop_all(engine)
    print "Creating all tables ..."
    Base.metadata.create_all(engine)
    tables = Base.metadata.sorted_tables

    with tarfile.open(tar_file, "r:bz2") as tar:
        for tbl in tables:
            try:
                entry = tar.getmember(tbl.name)
                print "Importing table %s ..." % entry.name
                fileobj = tar.extractfile(entry)
                table_dump = loads(fileobj.read(), Base.metadata, db)
                for data in table_dump:
                    db.execute(tbl.insert(), strip_unicode(dict(**data)))
            except:
                traceback.print_exc(file=sys.stdout)
                exit(0)

    db.commit()
    print "Database imported! Exiting!"
    exit(0)
Answer (score: 3)
For bulk dumps, the standard technique is to disable the constraints, do the import, then re-enable them. You'll also get much faster performance on the inserts.
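For example, with a MySQL backend you could switch off foreign key checks on the connection that does the bulk insert. This is only a rough sketch reusing the names from your question (engine, Base, db, loads); SET FOREIGN_KEY_CHECKS is MySQL-specific, so on other databases you would instead defer, drop and re-create, or otherwise disable the constraints:

    def import_db_no_fk_checks(tar_file):
        # Sketch only: assumes a MySQL engine; FOREIGN_KEY_CHECKS is not portable.
        conn = engine.connect()
        conn.execute("SET FOREIGN_KEY_CHECKS = 0")   # stop enforcing FKs on this connection
        with tarfile.open(tar_file, "r:bz2") as tar:
            for tbl in Base.metadata.sorted_tables:
                fileobj = tar.extractfile(tar.getmember(tbl.name))
                for row in loads(fileobj.read(), Base.metadata, db):
                    conn.execute(tbl.insert(), dict(**row))
        conn.execute("SET FOREIGN_KEY_CHECKS = 1")   # re-enable enforcement
        conn.close()

Note that re-enabling the checks on MySQL does not retroactively validate the rows you just inserted, so this trades safety for speed; the alternative of sorting the rows of the self-referencing table (inserting managers before their reports) keeps the constraint active throughout.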