Meaning of the super keyword in Python with no parent class

Date: 2015-06-15 15:42:51

Tags: python object super

I don't understand the meaning of the super keyword when it is used in a class that is not a subclass of anything (other than object).

This question comes from code I found in a GitHub pull request (link: https://github.com/statsmodels/statsmodels/pull/2374/files).

Look at the code below, where res = super(PenalizedMixin, self).fit(method=method, **kwds) appears in the fit method:

"""
+Created on Sun May 10 08:23:48 2015
+
+Author: Josef Perktold
+License: BSD-3
+"""
+
+import numpy as np
+from ._penalties import SCADSmoothed
+
+class PenalizedMixin(object):
+    """Mixin class for Maximum Penalized Likelihood
+
+
+    TODO: missing **kwds or explicit keywords
+
+    TODO: do we really need `pen_weight` keyword in likelihood methods?
+
+    """
+
+    def __init__(self, *args, **kwds):
+        super(PenalizedMixin, self).__init__(*args, **kwds)
+
+        penal = kwds.pop('penal', None)
+        # I keep the following instead of adding default in pop for future changes
+        if penal is None:
+            # TODO: switch to unpenalized by default
+            self.penal = SCADSmoothed(0.1, c0=0.0001)
+        else:
+            self.penal = penal
+
+        # TODO: define pen_weight as average pen_weight? i.e. per observation
+        # I would have prefered len(self.endog) * kwds.get('pen_weight', 1)
+        # or use pen_weight_factor in signature
+        self.pen_weight =  kwds.get('pen_weight', len(self.endog))
+
+        self._init_keys.extend(['penal', 'pen_weight'])
+
+
+
+    def loglike(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+
+        llf = super(PenalizedMixin, self).loglike(params)
+        if pen_weight != 0:
+            llf -= pen_weight * self.penal.func(params)
+
+        return llf
+
+
+    def loglikeobs(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+
+        llf = super(PenalizedMixin, self).loglikeobs(params)
+        nobs_llf = float(llf.shape[0])
+
+        if pen_weight != 0:
+            llf -= pen_weight / nobs_llf * self.penal.func(params)
+
+        return llf
+
+
+    def score(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+
+        sc = super(PenalizedMixin, self).score(params)
+        if pen_weight != 0:
+            sc -= pen_weight * self.penal.grad(params)
+
+        return sc
+
+
+    def scoreobs(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+
+        sc = super(PenalizedMixin, self).scoreobs(params)
+        nobs_sc = float(sc.shape[0])
+        if pen_weight != 0:
+            sc -= pen_weight / nobs_sc  * self.penal.grad(params)
+
+        return sc
+
+
+    def hessian_(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+            loglike = self.loglike
+        else:
+            loglike = lambda p: self.loglike(p, pen_weight=pen_weight)
+
+        from statsmodels.tools.numdiff import approx_hess
+        return approx_hess(params, loglike)
+
+
+    def hessian(self, params, pen_weight=None):
+        if pen_weight is None:
+            pen_weight = self.pen_weight
+
+        hess = super(PenalizedMixin, self).hessian(params)
+        if pen_weight != 0:
+            h = self.penal.deriv2(params)
+            if h.ndim == 1:
+                hess -= np.diag(pen_weight * h)
+            else:
+                hess -= pen_weight * h
+
+        return hess
+
+
+    def fit(self, method=None, trim=None, **kwds):
+        # If method is None, then we choose a default method ourselves
+
+        # TODO: temporary hack, need extra fit kwds
+        # we need to rule out fit methods in a model that will not work with
+        # penalization
+        if hasattr(self, 'family'):  # assume this identifies GLM
+            kwds.update({'max_start_irls' : 0})
+
+        # currently we use `bfgs` by default
+        if method is None:
+            method = 'bfgs'
+
+        if trim is None:
+            trim = False  # see below infinite recursion in `fit_constrained
+
+        res = super(PenalizedMixin, self).fit(method=method, **kwds)
+
+        if trim is False:
+            # note boolean check for "is False" not evaluates to False
+            return res
+        else:
+            # TODO: make it penal function dependent
+            # temporary standin, only works for Poisson and GLM,
+            # and is computationally inefficient
+            drop_index = np.nonzero(np.abs(res.params) < 1e-4) [0]
+            keep_index = np.nonzero(np.abs(res.params) > 1e-4) [0]
+            rmat = np.eye(len(res.params))[drop_index]
+
+            # calling fit_constrained raise
+            # "RuntimeError: maximum recursion depth exceeded in __instancecheck__"
+            # fit_constrained is calling fit, recursive endless loop
+            if drop_index.any():
+                # todo : trim kwyword doesn't work, why not?
+                #res_aux = self.fit_constrained(rmat, trim=False)
+                res_aux = self._fit_zeros(keep_index, **kwds)
+                return res_aux
+            else:
+                return res
+
+

I tried to reproduce this with a simpler example, but it doesn't work:

class A(object):
    def __init__(self):
        return

    def funz(self, x):
        print(x)

    def funz2(self, x):
        llf = super(A, self).funz2(x)
        print(x + 1)

a = A()
a.funz(3)
a.funz2(4)


Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/donbeo/Desktop/prova.py", line 15, in <module>
    a.funz2(4)
  File "/home/donbeo/Desktop/prova.py", line 10, in funz2
    llf = super(A, self).funz2(x)
AttributeError: 'super' object has no attribute 'funz2'

2 Answers:

Answer 0 (score: 3):

PenalizedMixin *is* a subclass: it is a child of object.

However, as the name suggests, it is meant to be mixed in. That is, it is designed to be used as one of the parents in a multiple-inheritance scenario. super calls the next class in the method resolution order (MRO), which is not necessarily the parent of the class it appears in.

In any case, I don't understand your "simpler" example. The original code works because the superclass really does have the methods being called through super; object has no funz2 method, hence the AttributeError.
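To make your example work, the method that super looks up has to exist somewhere later in the MRO. A minimal sketch (the class names Base, Mixin and Child are made up for illustration, not from the statsmodels code):

```python
class Base(object):
    def funz2(self, x):
        return x * 10


class Mixin(object):
    def funz2(self, x):
        # From Child's perspective the MRO is (Child, Mixin, Base, object),
        # so this call resolves to Base.funz2 -- a *sibling* of Mixin in the
        # inheritance graph, not a parent of Mixin.
        return super(Mixin, self).funz2(x) + 1


class Child(Mixin, Base):
    pass


c = Child()
print(c.funz2(3))      # Base.funz2 gives 30, Mixin adds 1 -> prints 31
print(Child.__mro__)   # (Child, Mixin, Base, object)
```

This is exactly what PenalizedMixin relies on: it assumes it will be combined with a model class that actually implements fit, loglike, score, and so on.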

Answer 1 (score: 3):

You should always use super, because otherwise classes can get missed in a multiple-inheritance scenario, which is unavoidable when using mix-in classes. For example:

class BaseClass(object):

    def __init__(self):
        print('BaseClass.__init__')


class MixInClass(object):

    def __init__(self):
        print('MixInClass.__init__')


class ChildClass(BaseClass, MixInClass):

    def __init__(self):
        print('ChildClass.__init__')
        super(ChildClass, self).__init__()  # -> BaseClass.__init__


if __name__ == '__main__':
    child = ChildClass()

gives:

ChildClass.__init__
BaseClass.__init__

which misses MixInClass.__init__ entirely, whereas:

class BaseClass(object):

    def __init__(self):
        print('BaseClass.__init__')
        super(BaseClass, self).__init__()  # -> MixInClass.__init__


class MixInClass(object):

    def __init__(self):
        print('MixInClass.__init__')
        super(MixInClass, self).__init__()  # -> object.__init__


class ChildClass(BaseClass, MixInClass):

    def __init__(self):
        print('ChildClass.__init__')
        super(ChildClass, self).__init__()  # -> BaseClass.__init__


if __name__ == '__main__':
    child = ChildClass()

gives:

ChildClass.__init__
BaseClass.__init__
MixInClass.__init__

The ChildClass.__mro__ ("method resolution order") is identical in both cases:

(<class '__main__.ChildClass'>, <class '__main__.BaseClass'>, <class '__main__.MixInClass'>, <type 'object'>)

BaseClass and MixInClass each inherit only from object (i.e. they are "new-style" classes), but you still need to use super to make sure that any other implementations of the method by classes in the MRO also get called. To enable this usage, object.__init__ is implemented, even though it doesn't actually do much!
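As a side note, in Python 3 the same cooperative pattern is usually written with the zero-argument form of super(); a sketch of the second example above in that style:

```python
# Python 3 version of the cooperative __init__ chain: every class calls
# super().__init__(), so each __init__ in the MRO runs exactly once.
class BaseClass:
    def __init__(self):
        print('BaseClass.__init__')
        super().__init__()      # -> MixInClass.__init__ (next in MRO)


class MixInClass:
    def __init__(self):
        print('MixInClass.__init__')
        super().__init__()      # -> object.__init__


class ChildClass(BaseClass, MixInClass):
    def __init__(self):
        print('ChildClass.__init__')
        super().__init__()      # -> BaseClass.__init__


child = ChildClass()
# prints ChildClass.__init__, BaseClass.__init__, MixInClass.__init__
```

The zero-argument super() is equivalent to the explicit super(CurrentClass, self) form; it simply removes the chance of naming the wrong class.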