I am currently trying to understand why figure(1) affects the following code. Its return value is not stored anywhere and it is not applied to any object that is used later, yet I cannot simply remove the call, so it must change some state. What does figure(1) do in the following code?
#!/usr/bin/env python
# Source: http://pybrain.org/docs/tutorial/fnn.html
from pybrain.datasets import ClassificationDataSet
from pybrain.utilities import percentError
from pybrain.tools.shortcuts import buildNetwork
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.structure.modules import SoftmaxLayer
# Only needed for data generation and graphical output
from pylab import ion, ioff, figure, draw, contourf, clf, show, hold, plot
from scipy import diag, arange, meshgrid, where
from numpy.random import multivariate_normal
INPUT_FEATURES = 2
CLASSES = 3
HIDDEN_NODES = 5
means = [(-1, 0), (2, 4), (3, 1)]
cov = [diag([1, 1]), diag([0.5, 1.2]), diag([1.5, 0.7])]
alldata = ClassificationDataSet(INPUT_FEATURES, 1, nb_classes=CLASSES)
for n in range(400):
    for klass in range(CLASSES):
        features = multivariate_normal(means[klass], cov[klass])
        alldata.addSample(features, [klass])
tstdata, trndata = alldata.splitWithProportion(0.25)
trndata._convertToOneOfMany()
tstdata._convertToOneOfMany()
print("Number of training patterns: %i" % len(trndata))
print("Input and output dimensions: %i, %i" % (trndata.indim, trndata.outdim))
print("First sample (input, target, class):")
print(trndata['input'][0], trndata['target'][0], trndata['class'][0])
fnn = buildNetwork(trndata.indim, HIDDEN_NODES, trndata.outdim,
                   outclass=SoftmaxLayer)
trainer = BackpropTrainer(fnn, dataset=trndata, momentum=0.1,
                          verbose=True, weightdecay=0.01)
ticks = arange(-3., 6., 0.2)
X, Y = meshgrid(ticks, ticks)
# need column vectors in dataset, not arrays
griddata = ClassificationDataSet(INPUT_FEATURES, 1, nb_classes=CLASSES)
for i in range(X.size):
    griddata.addSample([X.ravel()[i], Y.ravel()[i]], [0])
for i in range(20):
    trainer.trainEpochs(1)
    trnresult = percentError(trainer.testOnClassData(),
                             trndata['class'])
    tstresult = percentError(trainer.testOnClassData(
        dataset=tstdata), tstdata['class'])
    print("epoch: %4d" % trainer.totalepochs,
          " train error: %5.2f%%" % trnresult,
          " test error: %5.2f%%" % tstresult)
    out = fnn.activateOnDataset(griddata)
    out = out.argmax(axis=1)  # the highest output activation gives the class
    out = out.reshape(X.shape)
    figure(1)
    ioff()  # interactive graphics off
    clf()   # clear the plot
    #hold(True)  # overplot on
    for c in [0, 1, 2]:
        here, _ = where(tstdata['class'] == c)
        plot(tstdata['input'][here, 0], tstdata['input'][here, 1], 'o')
    if out.max() != out.min():  # safety check against flat field
        contourf(X, Y, out)  # plot the contour
    ion()   # interactive graphics on
    draw()  # update the plot
ioff()
show()
Answer 0 (score: 0)
This function does indeed manipulate some internal state. You can check the source file and see that it calls a "private" helper class to record which figure is currently active:
_pylab_helpers.Gcf.set_active(figManager)
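For illustration, here is a minimal sketch of how that registry can be inspected from user code. It relies on matplotlib's private _pylab_helpers module, so treat it as a way to poke around, not as a stable API:

import matplotlib
matplotlib.use("Agg")  # any backend works; Agg keeps the sketch runnable without a display
import matplotlib.pyplot as plt
from matplotlib import _pylab_helpers

plt.figure(1)
plt.figure(2)  # creating a figure also makes it the active one
print(_pylab_helpers.Gcf.get_active().num)  # -> 2

plt.figure(1)  # figure 1 already exists, so it is merely re-activated
print(_pylab_helpers.Gcf.get_active().num)  # -> 1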
The source also documents this function:
Create a new figure.

call signature::

  figure(num=None, figsize=(8, 6), dpi=80, facecolor='w', edgecolor='k')

Create a new figure and return a :class:`matplotlib.figure.Figure`
instance. If *num* = *None*, the figure number will be incremented and
a new figure will be created. The returned figure objects have a
*number* attribute holding this number.

If *num* is an integer, and ``figure(num)`` already exists, make it
active and return a reference to it. If ``figure(num)`` does not exist
it will be created. Numbering starts at 1, MATLAB style::

  figure(1)
When you call figure(2), a new figure is created and made active. Later, if you call figure(1) (which already exists), that figure becomes the active one again. And yes, it is the canvas: as you can see, you are setting its size, dpi, colors, and so on.
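A short, self-contained sketch of that behavior, using matplotlib.pyplot directly (the pylab names imported in the question are the same matplotlib.pyplot functions):

import matplotlib.pyplot as plt

plt.figure(1)
plt.plot([0, 1], [0, 1])  # drawn into figure 1
plt.figure(2)
plt.plot([0, 1], [1, 0])  # drawn into figure 2

plt.figure(1)             # figure 1 already exists: it only becomes the active figure again
plt.clf()                 # clears figure 1; figure 2 is untouched
print(plt.gcf().number)   # -> 1
plt.show()

That is also what the figure(1) inside the training loop is for: on each epoch it creates (on the first pass) or re-activates figure 1, so the clf(), plot() and contourf() calls that follow all target that same window.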