I am trying to replace the Keras sigmoid function with a piecewise linear function defined as:
    def custom_activation_4(x):
        if x < -6:
            return 0
        elif x >= -6 and x < -4:
            return 0.0078*x + 0.049
        elif x >= -4 and x < 0:
            return 0.1205*x + 0.5
        elif x >= 0 and x < 4:
            return 0.1205*x + 0.5
        elif x >= 4 and x < 6:
            return 0.0078*x + 0.951
        else:
            return 1
When I try to run it as:
classifier_4.add(Dense(output_dim = 18, init = 'uniform', activation = custom_activation_4, input_dim = 9))
I get the error:
Using a `tf.Tensor` as a Python `bool` is not allowed.
I researched this and learned that I am treating the variable x as a plain Python variable when it is actually a tensor, which is why it cannot be evaluated as a simple boolean. I also tried using the tensorflow cond method. How do I handle and use x as a tensor here? Thanks in advance for any help.
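To see concretely why the `if` fails: comparing a tensor elementwise produces a boolean tensor, which has no single truth value. A minimal sketch:

```python
import tensorflow as tf

x = tf.constant([-7.0, -5.0, 2.0])
mask = x < -6  # elementwise comparison: a boolean tensor, not a Python bool
print(mask.numpy())  # [ True False False]

# A plain Python `if mask:` would have to collapse the whole tensor into
# one bool, which TensorFlow refuses to do for multi-element tensors
# (and for any tensor inside a traced graph) -- hence the error above.
```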
Answer 0: (score: 2)
I tested the code in the other answer because I intended to write a similar activation function, but the following error occurred:
raise TypeError("Using a tf.Tensor as a Python bool is not allowed. " TypeError: Using a tf.Tensor as a Python bool is not allowed. Use if t is not None: instead of if t: to test if a tensor is defined, and use TensorFlow ops such as tf.cond to execute subgraphs conditioned on the value of a tensor
The reason is that we cannot use Python logical operators on a tf.Tensor. So I searched the tf docs, and it turns out we have to use their ops instead. Here is my code; it is quite similar to yours.
    import tensorflow as tf
    from tensorflow.keras import backend as K
    from tensorflow.keras.layers import Layer

    class QPWC(Layer):
        def __init__(self, sharp=100, **kwargs):
            super(QPWC, self).__init__(**kwargs)
            self.supports_masking = True
            self.sharp = K.cast_to_floatx(sharp)

        def call(self, inputs):
            orig = inputs
            inputs = tf.where(orig <= 0.0, tf.zeros_like(inputs), inputs)
            inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0), tf.less(orig, 0.25)),
                              0.25 / (1 + tf.exp(-self.sharp * ((inputs - 0.125) / 0.5))), inputs)
            inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0.25), tf.less(orig, 0.5)),
                              0.25 / (1 + tf.exp(-self.sharp * ((inputs - 0.5) / 0.5))) + 0.25, inputs)
            inputs = tf.where(tf.math.logical_and(tf.greater(orig, 0.5), tf.less(orig, 0.75)),
                              0.25 / (1 + tf.exp(-self.sharp * ((inputs - 0.75) / 0.5))) + 0.5, inputs)
            return tf.where(tf.greater(orig, 0.75), tf.ones_like(inputs), inputs)

        def get_config(self):
            config = {'sharp': float(self.sharp)}
            base_config = super(QPWC, self).get_config()
            return dict(list(base_config.items()) + list(config.items()))

        def compute_output_shape(self, input_shape):
            return input_shape
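As a side note, the get_config method above is what lets Keras rebuild a custom layer with its hyperparameters, for example when loading a saved model. A minimal sketch of the same pattern, using a hypothetical toy layer (ScaledReLU is illustrative, not part of the answer):

```python
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer

class ScaledReLU(Layer):
    """Toy custom activation layer with one serializable hyperparameter."""
    def __init__(self, scale=2.0, **kwargs):
        super().__init__(**kwargs)
        self.scale = K.cast_to_floatx(scale)

    def call(self, inputs):
        return self.scale * tf.nn.relu(inputs)

    def get_config(self):
        # Merge our hyperparameter into the base layer config,
        # exactly as the QPWC layer does above.
        config = {'scale': float(self.scale)}
        base_config = super().get_config()
        return dict(list(base_config.items()) + list(config.items()))

layer = ScaledReLU(scale=3.0)
# Round-trip through the config, as Keras does when deserializing.
restored = ScaledReLU.from_config(layer.get_config())
print(float(restored.scale))  # 3.0
```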
Answer 1: (score: 1)
Your custom activation is written as a function of a single float, but you want to apply it to an entire tensor. The best way to do that is with tf.where. Something like:
    import tensorflow as tf

    def custom_activation_4(x):
        orig = x
        x = tf.where(orig < -6, tf.zeros_like(x), x)
        # Note: Python `and` does not work on tensors; combine the
        # elementwise conditions with tf.logical_and instead.
        x = tf.where(tf.logical_and(orig >= -6, orig < -4), 0.0078*x + 0.049, x)
        x = tf.where(tf.logical_and(orig >= -4, orig < 0), 0.1205*x + 0.5, x)
        x = tf.where(tf.logical_and(orig >= 0, orig < 4), 0.1205*x + 0.5, x)
        x = tf.where(tf.logical_and(orig >= 4, orig < 6), 0.0078*x + 0.951, x)
        return tf.where(orig >= 6, tf.ones_like(x), x)
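A quick sanity check of this piecewise version against tf.sigmoid at a few sample points (the function is repeated here so the snippet runs on its own):

```python
import tensorflow as tf

def custom_activation_4(x):
    orig = x
    x = tf.where(orig < -6, tf.zeros_like(x), x)
    x = tf.where(tf.logical_and(orig >= -6, orig < -4), 0.0078*x + 0.049, x)
    x = tf.where(tf.logical_and(orig >= -4, orig < 0), 0.1205*x + 0.5, x)
    x = tf.where(tf.logical_and(orig >= 0, orig < 4), 0.1205*x + 0.5, x)
    x = tf.where(tf.logical_and(orig >= 4, orig < 6), 0.0078*x + 0.951, x)
    return tf.where(orig >= 6, tf.ones_like(x), x)

pts = tf.constant([-10.0, -5.0, 0.0, 5.0, 10.0])
print(custom_activation_4(pts).numpy())  # roughly [0, 0.010, 0.5, 0.990, 1]
print(tf.sigmoid(pts).numpy())           # roughly [0, 0.007, 0.5, 0.993, 1]
```

The piecewise function saturates to exactly 0 and 1 outside [-6, 6], while the true sigmoid only approaches them asymptotically.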