The value of a global variable changes inside a function, and I don't understand why

Time: 2019-03-27 10:24:00

Tags: python deep-learning global-variables

While studying "Deep Learning from Scratch" and its example code, I noticed that the value of a global variable changes inside a function.

I tried to understand this but failed. I hope someone can explain to me how this code works.

In detail: after the function modifies its parameter `weight`, which received `net.W`, the object `net.W` changes as well. I expected `net.W` not to change, since the function never declares it with the `global` keyword. But it did change, and I cannot understand why.
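This is not specific to the question's code, so here is a minimal sketch of the underlying behavior: Python passes the object itself (a reference), not a copy, so in-place assignment into a NumPy array inside a function mutates the caller's array. The function name `bump` and the arrays here are illustrative, not from the original post.

```python
import numpy as np

def bump(weight):
    # 'weight' is the very same array object the caller holds,
    # so this in-place assignment changes the caller's data too.
    weight[0] += 1.0

a = np.zeros(3)
bump(a)
print(a)        # the caller's array was mutated: first element is now 1.0

b = np.zeros(3)
bump(b.copy())  # passing an explicit copy leaves the original untouched
print(b)        # still all zeros
```

Rebinding the parameter (e.g. `weight = weight + h`) would create a new array and leave the caller's array alone; it is specifically the in-place `weight[idx] = ...` that writes through to `net.W`.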

The code below is modified from https://github.com/oreilly-japan/deep-learning-from-scratch.

import sys
import os
import numpy as np

class simpleNet:
    def __init__(self):
        self.W = np.random.rand(2,3)

    def predict(self, x):
        return np.dot(x, self.W)

    def loss(self, x, t):
        z = self.predict(x)
        z = np.sum(z)
        return z

def numerical_gradient(f, weight):
    h = 1.0
    grad = np.zeros_like(weight)

    # Iterate over every element of 'weight'. Note that 'weight'
    # is the same array object as the caller's net.W, not a copy.
    it = np.nditer(weight,
                   flags=['multi_index'],
                   op_flags=['readwrite'])

    while not it.finished:
        idx = it.multi_index

        print("Before net.W: \n", net.W)

        weight[idx] = weight[idx] + h  # in-place write: mutates net.W too

        print("After net.W: \n", net.W)
        print(' ')
        it.iternext()

    return grad

net = simpleNet()
print('Original net.W: \n', net.W)
x = np.array([0.6, 0.9])
t = np.array([0, 0, 1])
f = lambda w: net.loss(x, t)  # loss as a function of the weights
dW = numerical_gradient(f, net.W)

I expected the printed `net.W` object to stay the same, but after every change to the parameter `weight`, the corresponding component of `net.W` changes too (increased by h = 1.0).
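For comparison, the `numerical_gradient` in the upstream deep-learning-from-scratch repository deliberately saves each element before perturbing it and restores it afterwards, so the weights are unchanged once the function returns. A sketch of that save-and-restore pattern (the quadratic test function at the bottom is mine, for illustration only):

```python
import numpy as np

def numerical_gradient(f, weight):
    h = 1e-4
    grad = np.zeros_like(weight)
    it = np.nditer(weight, flags=['multi_index'], op_flags=['readwrite'])

    while not it.finished:
        idx = it.multi_index
        tmp = weight[idx]          # remember the original value

        weight[idx] = tmp + h      # evaluate f at w + h
        fxh1 = f(weight)
        weight[idx] = tmp - h      # evaluate f at w - h
        fxh2 = f(weight)

        grad[idx] = (fxh1 - fxh2) / (2 * h)
        weight[idx] = tmp          # restore, so the caller's array is unchanged
        it.iternext()

    return grad

W = np.ones((2, 3))
g = numerical_gradient(lambda w: np.sum(w ** 2), W)
print(W)  # still all ones: the perturbations were undone
print(g)  # gradient of sum(w^2) is 2*w, so roughly all 2.0
```

The question's version drops the `tmp`/restore lines, which is why each element of `net.W` stays permanently incremented by h.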

Original net.W: 
[[0.74942285 0.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]
Before net.W: 
 [[0.74942285 0.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]
After net.W: 
 [[1.74942285 0.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]

Before net.W: 
 [[1.74942285 0.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]
After net.W: 
 [[1.74942285 1.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]

Before net.W: 
 [[1.74942285 1.28408918 0.43493956]
 [0.50182208 0.19556924 0.75924396]]
After net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [0.50182208 0.19556924 0.75924396]]

Before net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [0.50182208 0.19556924 0.75924396]]
After net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [1.50182208 0.19556924 0.75924396]]

Before net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [1.50182208 0.19556924 0.75924396]]
After net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [1.50182208 1.19556924 0.75924396]]

Before net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [1.50182208 1.19556924 0.75924396]]
After net.W: 
 [[1.74942285 1.28408918 1.43493956]
 [1.50182208 1.19556924 1.75924396]]

0 Answers:

No answers yet