Speeding up inserting point data from netCDF

Date: 2019-01-21 19:42:28

Tags: python postgresql postgis psycopg2

I have a netCDF file of weather data (one of thousands that need to be extracted into PostgreSQL). Currently, I can insert each band into a PostGIS-enabled table at roughly 20-23 seconds per band. (That's for the monthly data; there is also daily data that I haven't tested yet.)

I've heard of different ways to speed this up, such as using COPY FROM, removing the gid, using SSDs, etc... but I'm new to Python and have no idea how to get the netCDF data into a form I could use with COPY FROM, or what the best route would be.

If anyone has other ideas on how to speed this up, please share!

Here's the extraction script:

import netCDF4, psycopg2, time

# Establish connection
db1 = psycopg2.connect("host=localhost dbname=postgis_test user=********** password=********")
cur = db1.cursor()

# Create Table in postgis
print(str(time.ctime()) + " CREATING TABLE")
try:
    cur.execute("DROP TABLE IF EXISTS table_name;")
    db1.commit()
    cur.execute(
        "CREATE TABLE table_name (gid serial PRIMARY KEY not null, thedate DATE, thepoint geometry, lon decimal, lat decimal, thevalue decimal);")
    db1.commit()
    print("TABLE CREATED")
except psycopg2.DatabaseError as e:
    print(e)
    print("TABLE CREATION FAILED")

rawvalue_nc_file = 'netcdf_file.nc'
nc = netCDF4.Dataset(rawvalue_nc_file, mode='r')
print(nc.variables.keys())  # inspect which variables the file provides

lat = nc.variables['lat'][:]
lon = nc.variables['lon'][:]
time_var = nc.variables['time']
dtime = netCDF4.num2date(time_var[:], time_var.units)
newtime = [fdate.strftime('%Y-%m-%d') for fdate in dtime]
rawvalue = nc.variables['tx_max'][:]

# Map array index -> coordinate value for quick lookup
lathash = dict(enumerate(lat.tolist()))
lonhash = dict(enumerate(lon.tolist()))

# Walk the grid and insert one row per cell per timestep.
# Note: this issues one server round trip per row, which is the bottleneck.
for timestep in range(dtime.size):
    print(str(time.ctime()) + " " + str(timestep + 1) + "/180")
    for _lon in range(lon.size):
        for _lat in range(lat.size):
            latitude = round(lathash[_lat], 6)
            longitude = round(lonhash[_lon], 6)
            thedate = newtime[timestep]
            # Convert from Kelvin to Celsius
            thevalue = round(float(rawvalue.data[timestep, _lat, _lon] - 273.15), 3)
            if thevalue > -100:  # skip fill/missing values
                cur.execute(
                    "INSERT INTO table_name (thedate, thepoint, thevalue) VALUES (%s, ST_MakePoint(%s,%s,0), %s)",
                    (thedate, longitude, latitude, thevalue),
                )
    db1.commit()
cur.close()
db1.close()

print(" Done!")

1 Answer:

Answer 0 (score: 1)

If you're sure that most of the time is being spent in PostgreSQL rather than in any other code of your own, you may want to look at the fast execution helpers, namely psycopg2.extras.execute_values() in your case.

Also, you'll want to make sure you're working inside a transaction, so the database doesn't fall back to autocommit mode. ("If you do not issue a BEGIN command, then each individual statement has an implicit BEGIN and (if successful) COMMIT wrapped around it.")
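With psycopg2 specifically, autocommit is off by default, and a connection used as a context manager wraps the block in a single transaction: commit on clean exit, rollback on exception. A minimal, untested sketch reusing the db1 connection from the question:

with db1:                      # implicit BEGIN; COMMIT on clean exit
    with db1.cursor() as cur:  # cursor is closed when the block ends
        cur.execute(
            "INSERT INTO table_name (thedate, thevalue) VALUES (%s, %s)",
            ("2019-01-01", 0.0),
        )
# leaving the with-block committed the INSERT as one transaction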

Something like this should do the trick, though it is untested.

import psycopg2.extras  # execute_values lives in psycopg2.extras

for timestep in range(dtime.size):
    print(str(time.ctime()) + " " + str(timestep + 1) + "/180")
    values = []

    cur.execute("BEGIN")

    for _lon in range(lon.size):
        for _lat in range(lat.size):
            latitude = round(lathash[_lat], 6)
            longitude = round(lonhash[_lon], 6)
            thedate = newtime[timestep]
            thevalue = round(
                float(rawvalue.data[timestep, _lat, _lon] - 273.15), 3
            )
            if thevalue > -100:
                values.append((thedate, longitude, latitude, thevalue))

    psycopg2.extras.execute_values(
        cur,
        "INSERT INTO table_name (thedate, thepoint, thevalue) VALUES %s",
        values,
        template="(%s, ST_MakePoint(%s,%s,0), %s)"
    )
    db1.commit()
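
Note that execute_values() sends rows in pages of 100 by default; passing a larger page_size (for example page_size=1000) may reduce round trips further.

If that is still not fast enough, the COPY FROM route mentioned in the question can be driven straight from Python with psycopg2's copy_expert() and an in-memory buffer, with no intermediate file. A rough, untested sketch that reuses the values list built above; PostGIS parses the WKT text for the geometry column during the COPY:

import io

def copy_rows(cur, values):
    # Stream rows into PostgreSQL via COPY instead of row-by-row INSERTs.
    buf = io.StringIO()
    for thedate, longitude, latitude, thevalue in values:
        # COPY's default text format is tab-separated, one row per line;
        # the geometry column accepts WKT such as "POINT Z (lon lat 0)".
        buf.write(f"{thedate}\tPOINT Z ({longitude} {latitude} 0)\t{thevalue}\n")
    buf.seek(0)
    cur.copy_expert(
        "COPY table_name (thedate, thepoint, thevalue) FROM STDIN", buf
    )

Swapping the execute_values() call for copy_rows(cur, values) keeps the one-commit-per-timestep pattern intact.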