MySQL 5.7.18
Python 2.7.5
pandas 0.17.1
CentOS 7.3
MySQL table:
CREATE TABLE test (
id varchar(12)
) ENGINE=InnoDB;
Its size is 10GB:
select round(((data_length) / 1024 / 1024 / 1024)) "GB"
from information_schema.tables
where table_name = "test"
10GB
The box has 250GB of RAM:
$ free -hm
total used free shared buff/cache available
Mem: 251G 15G 214G 2.3G 21G 232G
Swap: 2.0G 1.2G 839M
Selecting the data:
import psutil
print '1 ' + str(psutil.phymem_usage())
import os
import sys
import time
import pyodbc
import mysql.connector
import pandas as pd
from datetime import date
import gc
print '2 ' + str(psutil.phymem_usage())
db = mysql.connector.connect({snip})
c = db.cursor()
print '3 ' + str(psutil.phymem_usage())
c.execute("select id from test")
print '4 ' + str(psutil.phymem_usage())
e=c.fetchall()
print 'getsizeof: ' + str(sys.getsizeof(e))
print '5 ' + str(psutil.phymem_usage())
d=pd.DataFrame(e)
print d.info()
print '6 ' + str(psutil.phymem_usage())
c.close()
print '7 ' + str(psutil.phymem_usage())
db.close()
print '8 ' + str(psutil.phymem_usage())
del c, db, e
print '9 ' + str(psutil.phymem_usage())
gc.collect()
print '10 ' + str(psutil.phymem_usage())
time.sleep(60)
print '11 ' + str(psutil.phymem_usage())
Output:
1 svmem(total=270194331648L, available=249765777408L, percent=7.6, used=39435464704L, free=230758866944L, active=20528222208, inactive=13648789504, buffers=345387008L, cached=18661523456)
2 svmem(total=270194331648L, available=249729019904L, percent=7.6, used=39472222208L, free=230722109440L, active=20563484672, inactive=13648793600, buffers=345387008L, cached=18661523456)
3 svmem(total=270194331648L, available=249729019904L, percent=7.6, used=39472222208L, free=230722109440L, active=20563484672, inactive=13648793600, buffers=345387008L, cached=18661523456)
4 svmem(total=270194331648L, available=249729019904L, percent=7.6, used=39472222208L, free=230722109440L, active=20563484672, inactive=13648793600, buffers=345387008L, cached=18661523456)
getsizeof: 1960771816
5 svmem(total=270194331648L, available=181568315392L, percent=32.8, used=107641655296L, free=162552676352L, active=88588271616, inactive=13656334336, buffers=345395200L, cached=18670243840)
<class 'pandas.core.frame.DataFrame'>
Int64Index: 231246823 entries, 0 to 231246822
Data columns (total 1 columns):
0 object
dtypes: object(1)
memory usage: 3.4+ GB
None
6 svmem(total=270194331648L, available=181571620864L, percent=32.8, used=107638353920L, free=162555977728L, active=88587603968, inactive=13656334336, buffers=345395200L, cached=18670247936)
7 svmem(total=270194331648L, available=181571620864L, percent=32.8, used=107638353920L, free=162555977728L, active=88587603968, inactive=13656334336, buffers=345395200L, cached=18670247936)
8 svmem(total=270194331648L, available=181571620864L, percent=32.8, used=107638353920L, free=162555977728L, active=88587603968, inactive=13656334336, buffers=345395200L, cached=18670247936)
9 svmem(total=270194331648L, available=183428308992L, percent=32.1, used=105781678080L, free=164412653568L, active=86735921152, inactive=13656334336, buffers=345395200L, cached=18670260224)
10 svmem(total=270194331648L, available=183428308992L, percent=32.1, used=105781678080L, free=164412653568L, active=86735921152, inactive=13656334336, buffers=345395200L, cached=18670260224)
11 svmem(total=270194331648L, available=183427203072L, percent=32.1, used=105782812672L, free=164411518976L, active=86736560128, inactive=13656330240, buffers=345395200L, cached=18670288896)
I even deleted the database connection and called garbage collection.
How can a 10GB table use 60GB of my memory?
Answer 0 (score: 2)
Short answer: the memory overhead of Python data structures.
Your table has ~231M rows taking up ~10GB, so each row is on the order of 43 bytes on disk.
fetchall translates those rows into a list of tuples that looks like this:
[('abcd',), ('1234',), ... ]
Your list has ~231M elements, and sys.getsizeof reports ~1.9GB for it: an average of 8.48 bytes per tuple slot.
$ python
Python 2.7.12 (default, Nov 19 2016, 06:48:10)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
A tuple:
>>> a = ('abcd',)
>>> sys.getsizeof(a)
64
A list with one tuple:
>>> al = [('abcd',)]
>>> sys.getsizeof(al)
80
A list with two tuples:
>>> al2 = [('abcd',), ('1234',)]
>>> sys.getsizeof(al2)
88
A list with 10 tuples:
>>> al10 = [ ('abcd',) for x in range(10)]
>>> sys.getsizeof(al10)
200
A list with 1M tuples:
>>> a_really_long = [ ('abcd',) for x in range(1000000)]
>>> sys.getsizeof(a_really_long)
8697472
That is almost exactly our number: 8.6 bytes per tuple slot in the list. And note that sys.getsizeof on a list only measures its pointer array: the tuple objects themselves (64 bytes each, as shown above) and the string objects they wrap come on top of that, which is where the rest of the memory goes.
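The layered cost can be summed explicitly. A minimal sketch, where the helper name deep_size is ours, not part of any library:

```python
import sys

def deep_size(rows):
    """Sum the list's own size plus every tuple and every string it holds.
    (Counts each reference; in a real fetchall every row is a distinct object.)"""
    total = sys.getsizeof(rows)
    for row in rows:
        total += sys.getsizeof(row)
        total += sum(sys.getsizeof(v) for v in row)
    return total

rows = [('abcd',) for _ in range(1000)]
shallow = sys.getsizeof(rows)   # pointer array only: ~8-9 bytes per slot
full = deep_size(rows)          # plus the tuple and string objects
print(shallow, full)
```

The gap between the two numbers is exactly the overhead that getsizeof on the bare list hides.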
Unfortunately there is not much you can do about it here: mysql.connector chooses the data structure, and a dict cursor would use even more memory. If you need to reduce memory usage, you will have to use fetchmany with an appropriate size argument.
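A minimal sketch of the fetchmany approach, written against the generic DB-API cursor interface (the iter_rows helper and the batch size are our own choices, not part of mysql.connector):

```python
def iter_rows(cursor, batch_size=100000):
    """Yield rows one at a time while keeping at most `batch_size`
    fetched tuples alive, instead of materializing all ~231M at once."""
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        for row in batch:
            yield row

# usage with the question's cursor (hypothetical processing step):
# c.execute("select id from test")
# for (id_,) in iter_rows(c, batch_size=50000):
#     process(id_)
```

Peak memory is then bounded by the batch size rather than by the table size.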
Answer 1 (score: 0)
Edit: pd.read_sql only accepts SQLAlchemy connections. First connect to your database using create_engine from SQLAlchemy:
from sqlalchemy import create_engine
engine = create_engine('mysql://database')
Then call .connect() on the resulting object:
connection = engine.connect()
Pass that connection to pd.read_sql:
0    1
1    1
2    2
3    2
4    0
5    2
6    2
7    1
8    2
dtype: int64
This will reduce your memory footprint.
After trying the above, would you mind posting the memory usage results?
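pd.read_sql also accepts a chunksize parameter, which returns an iterator of small DataFrames instead of one giant one. A minimal sketch, using an in-memory SQLite database as a stand-in for the MySQL server in the question (against MySQL you would pass the SQLAlchemy connection instead):

```python
import sqlite3
import pandas as pd

# In-memory SQLite stands in for the 10GB MySQL table here.
conn = sqlite3.connect(':memory:')
conn.execute('create table test (id text)')
conn.executemany('insert into test values (?)',
                 [(str(i),) for i in range(10)])

# chunksize makes read_sql yield DataFrames of at most `chunksize`
# rows each, so the full result set is never resident at once.
total_rows = 0
for chunk in pd.read_sql('select id from test', conn, chunksize=4):
    total_rows += len(chunk)
print(total_rows)  # 10
```

Combined with per-chunk processing, this keeps the peak footprint proportional to the chunk size rather than the table size.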