When trying to optimize a process modeled by a system of ODEs, I keep running into errors whenever I try to add temperature dependence to the parameters. How do I fix this code so that it runs properly, and where am I going wrong?
This is the first time my colleagues and I have tried adding functions to an ODE model. I tried declaring the temperature-dependent parameters as variables and simply setting them equal to functions of temperature (T), without defining them as Constraints. I have also tried not defining them explicitly at all and instead substituting the temperature-dependent expressions directly into the ODEs.
#importing all of the packages that we will need for optimization
from pyomo.environ import *
from pyomo.dae import *
import math as m
import numpy as np
#Initialize Constants
#rate constants
#constants for energy balance
rho = 1040 #g/L
C_p = 4.016 #J/grams/K
DH_FG = -91.2 #kJ/gmol
DH_FM = -226.3 #kJ/gmol
DH_FN = -361.3 #kJ/gmol
Tc = 5 + 273.15 #K
R = 8.314 #J/mol/K
#We are not exactly sure what these are. We think they are the gains for the temperature controller
p1 = 1.6
p2 = 0.188
p3 = 0.1
p = (p1*p2)/p3
e = m.exp(1)
#creating the final time parameter
t_f = 300 #final time
#Creating initial conditions for each of the equations
#Initial values...
u_x = .5 #h^-1
X_int = 125 #biomass
G_int = 70 #glucose
M_int = 220 #Maltose
N_int = 40 #Maltotriose
T_int = 20 #Temperature of system
u = 400 #Control Variable
uG_int = m.exp(35.77)
KprimeG_int = 2
KM_int = m.exp(-19.5)
KG_int = m.exp(-121.3)
u1_int = uG_int*G_int/(KG_int +G_int)
u2_int = 1
u3_int = 1
#Activation Energies
E_uG = 22.6*4184 # J/mol
E_KG = -68.6*4184 #J/mol
g = ConcreteModel()
#Growth Model
#Establishing the time variable
g.t = ContinuousSet(bounds=(0, t_f)) #creating time parameter with bounds from 0 to tf
#Establishing the dependent variables on time
g.X = Var(g.t) #Amount of Biomass in the system as a function of t
g.G = Var(g.t) #amount of Glucose in the system as a function of t
g.M = Var(g.t) #amount of Maltose in the system as a function of t
g.N = Var(g.t) #Amount of Maltotriose in the system as a function of t
g.T = Var(g.t)
g.KG = Var(g.t)
g.uG = Var(g.t)
g.u1 = Var(g.t)
#Creating derivatives of each of the dependent variables
g.dXdt = DerivativeVar(g.X, wrt=g.t)
g.dGdt = DerivativeVar(g.G, wrt=g.t)
g.dMdt = DerivativeVar(g.M, wrt=g.t)
g.dNdt = DerivativeVar(g.N, wrt=g.t)
g.dTdt = DerivativeVar(g.T, wrt=g.t)
#Temperature dependent variables
g.uGeq = Constraint(g.t, rule=lambda g, t: g.uG[t] == uG_int*e**(-E_uG/R/g.T[t]) )
g.KGeq = Constraint(g.t, rule=lambda g, t: g.KG[t] == KG_int*e**(-E_KG/R/g.T[t]) )
#creating the system of ode's
g.u1eq = Constraint(g.t, rule=lambda g, t: g.u1[t] == g.uG[t]*g.G[t]/(g.KG[t] + g.G[t]) )
g.X_ode = Constraint(g.t, rule=lambda g, t: g.dXdt[t] == u_x*g.X[t])
g.G_ode = Constraint(g.t, rule=lambda g, t: g.dGdt[t] == g.u1[t]*g.X[t])
g.M_ode = Constraint(g.t, rule=lambda g, t: g.dMdt[t] == -u2_int*g.X[t])
g.N_ode = Constraint(g.t, rule=lambda g, t: g.dNdt[t] == -u3_int*g.X[t])
g.T_ode = Constraint(g.t, rule=lambda g, t: g.dTdt[t] == (g.u1[t]*DH_FG + u2_int*DH_FM + u3_int*DH_FN)*g.X[t] + u*(Tc - g.T[t])/(rho*C_p))
g.KG[0].fix(KG_int)
g.uG[0].fix(uG_int)
g.X[0].fix(X_int)
g.G[0].fix(G_int)
g.M[0].fix(M_int)
g.N[0].fix(N_int)
g.T[0].fix(T_int)
def solve(g):
    TransformationFactory('dae.finite_difference').apply_to(g, nfe=200, method='backwards')
    SolverFactory('ipopt', executable=ipopt_executable).solve(g)
    #creating lists to store the different values as a function of time
    tsim = [t for t in g.t]
    Xsim = [g.X[t]() for t in g.t]
    Gsim = [g.G[t]() for t in g.t]
    Msim = [g.M[t]() for t in g.t]
    Nsim = [g.N[t]() for t in g.t]
    Tsim = [g.T[t]() for t in g.t]
    uGsim = [g.uG[t]() for t in g.t]
    KGsim = [g.KG[t]() for t in g.t]
    u1sim = [g.u1[t]() for t in g.t]
#run the solver and see what happens
solve(g)
We expect the different variables to change over time, and we want to plot the results with matplotlib.
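For reference, this is roughly the plotting we have in mind once the solve succeeds. A minimal sketch (it rebuilds the value lists at module level, since the ones created inside solve() are local to that function):

import matplotlib.pyplot as plt
tsim = [t for t in g.t]
plt.plot(tsim, [g.X[t]() for t in g.t], label='Biomass X')
plt.plot(tsim, [g.G[t]() for t in g.t], label='Glucose G')
plt.plot(tsim, [g.M[t]() for t in g.t], label='Maltose M')
plt.plot(tsim, [g.N[t]() for t in g.t], label='Maltotriose N')
plt.xlabel('time')
plt.ylabel('amount')
plt.legend()
plt.show()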
Answer (score: 0):
I think you are seeing that error because of the g.KGeq constraint. If g.T is initialized to T_int=20, the exponent in that equation is around 1700, which causes a Python overflow error. You may have a typo with a negative sign.
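You can see the overflow coming just by evaluating that exponent at the initial point. A quick standalone check (this assumes the intended Arrhenius form k = k0*exp(-E/(R*T)) with T in Kelvin, which the post does not confirm; note that converting T_int from Celsius to Kelvin already makes the exponent representable):

import math
R = 8.314          # J/mol/K
E_KG = -68.6*4184  # J/mol, as defined in the question
print(-E_KG/(R*20))      # ~1726; math.exp() of this raises OverflowError
print(-E_KG/(R*293.15))  # ~118; huge but still representable as a float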
The other thing I see is that you divide by g.T and by g.KG + g.G in some of your constraints, but you never initialize or bound those variables. The default initialization in IPOPT sets them all to 0, which will lead to evaluation errors.
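One way to address that is to give those Var declarations initial values and bounds when you create them. A sketch (the bound values here are illustrative choices, not taken from your model):

g.T = Var(g.t, initialize=T_int, bounds=(1, 500))    # keeps T away from 0
g.KG = Var(g.t, initialize=KG_int, bounds=(0, None))
g.uG = Var(g.t, initialize=uG_int)
g.G = Var(g.t, initialize=G_int, bounds=(0, None))   # a small positive lower bound may be safer, so KG + G stays nonzero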
There is also a bug in how you apply the discretization transformation. It should be:
TransformationFactory('dae.finite_difference').apply_to(g, nfe=200, scheme='BACKWARD')
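The 'dae.finite_difference' transformation takes a scheme keyword (e.g. 'BACKWARD' or 'FORWARD'), not method. Folding that in, your solve helper might look like this sketch; ipopt_executable is never defined in the posted code, so this assumes ipopt is on your PATH:

def solve(g):
    # discretize the ContinuousSet, then hand the resulting NLP to IPOPT
    TransformationFactory('dae.finite_difference').apply_to(g, nfe=200, scheme='BACKWARD')
    SolverFactory('ipopt').solve(g)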