Finding the roots of a 4th-degree polynomial with Halley's method using TensorFlow

Time: 2018-08-22 07:42:27

Tags: python tensorflow google-cloud-platform polynomials

I have just started learning TensorFlow; essentially I am using it for all kinds of numerical computation rather than jumping straight into ML. I am doing this on Google Cloud Platform, where I ran into the following problem and got stuck.

Roots of 4th degree polynomial

I am using lazy evaluation: with placeholders I can create instances of a0, a1, a2, ..., a4 in the TensorFlow graph, and I can also write out the function. But how do I make an initial guess in TensorFlow? And even once I have a value for x0, how do I apply the iteration with tf.while_loop? I read its documentation and this post carefully, but I still have no idea how to proceed. I looked for posts with a similar problem, but could not find any that use TensorFlow. It would be great to get some insight or an approach that uses built-in TensorFlow functions and ops :) Thanks in advance!
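For reference, Halley's update for a root of f is x ← x − 2·f·f′ / (2·f′² − f·f′′). Before wiring it into a TensorFlow graph, the bare iteration can be sketched in plain Python (the quartic and starting point below are illustrative, not taken from the question):

```python
def halley_root(coeffs, x0, eps=1e-10, max_iter=50):
    """Find a root of a0 + a1*x + ... + a4*x**4 with Halley's method."""
    a0, a1, a2, a3, a4 = coeffs
    x = x0
    for _ in range(max_iter):
        f = a0 + a1 * x + a2 * x**2 + a3 * x**3 + a4 * x**4
        fp = a1 + 2 * a2 * x + 3 * a3 * x**2 + 4 * a4 * x**3   # f'
        fpp = 2 * a2 + 6 * a3 * x + 12 * a4 * x**2             # f''
        x_next = x - (2 * f * fp) / (2 * fp * fp - f * fpp)
        if abs(x_next - x) < eps:
            return x_next
        x = x_next
    return x

# Example: x**4 - 2 = 0 has a real root at 2**0.25
root = halley_root([-2.0, 0.0, 0.0, 0.0, 1.0], x0=1.0)
print(root)  # ~1.189207115
```

The TensorFlow answers below implement this same update rule; the only extra work is expressing the loop and the derivatives with TensorFlow ops.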

2 Answers:

Answer 0: (score: 2)

When I ran the first example from here, I got these values (note that the equation is different):

1.4999999969612645
1.411188880378198
1.4142132016669995
1.4142135623730898
But this still seems like a good example.

import tensorflow as tf

h = tf.constant(.00000001, dtype='float64')
eps = tf.constant(.000001, dtype='float64')
b = tf.constant(2.0, tf.float64)

def f(x):
    # f(x) = x^2 - 2, so the positive root is sqrt(2)
    return tf.subtract(tf.multiply(x, x), 2.)

def fp(x):
    # First derivative, approximated by a forward finite difference
    return tf.divide(tf.subtract(f(tf.add(x, h)), f(x)), h)

def fpp(x):
    # Second derivative, approximated by a forward finite difference of fp
    return tf.divide(tf.subtract(fp(tf.add(x, h)), fp(x)), h)

def cond(i, x_new, x_prev):
    # Keep iterating while the step size still exceeds eps, for at most 5 steps.
    return tf.logical_and(i < 5,
                          tf.greater(tf.abs(tf.subtract(x_new, x_prev)), eps))

def body(i, x_new, x_prev):
    fx = f(x_new)
    fpx = fp(x_new)
    # Halley update: x <- x - 2*f*f' / (2*f'^2 - f*f'')  (here b = 2)
    next_x = tf.subtract(x_new,
                         tf.divide(b * fx * fpx,
                                   tf.subtract(b * fpx * fpx,
                                               fx * fpp(x_new))))
    next_x = tf.Print(next_x, [next_x], message="The root is : ")
    return [i + 1, next_x, x_new]

sess = tf.Session()
sess.run(tf.global_variables_initializer())


print(sess.run(tf.while_loop(cond, body, [1, b, b + 1.])))
  

The root is : [1.4999999969612645]
The root is : [1.411188880378198]
The root is : [1.4142132016669995]
The root is : [1.4142135623730898]

[5, 1.4142135623730898, 1.4142135623730898]
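As an aside on the mechanics used above: tf.while_loop(cond, body, loop_vars) keeps calling body on the loop variables for as long as cond returns true. A plain-Python analogue of that contract (a sketch of the semantics, not of the graph-mode implementation) is:

```python
def while_loop(cond, body, loop_vars):
    # Repeatedly apply body while cond holds on the current loop variables,
    # mirroring the calling convention of tf.while_loop.
    while cond(*loop_vars):
        loop_vars = body(*loop_vars)
    return loop_vars

# Example: increment i up to 5 while accumulating the running sum.
result = while_loop(lambda i, s: i < 5,
                    lambda i, s: [i + 1, s + i],
                    [1, 0])
print(result)  # [5, 10]
```

In graph mode the real tf.while_loop builds this control flow into the graph once, so cond and body are traced rather than called on every step, but the loop-variable threading works the same way.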

Answer 1: (score: 0)

Here is my implementation with eager execution, using TensorFlow's GradientTape to calculate the derivatives:

import tensorflow as tf
print("Tensorflow-CPU version is {0}".format(tf.__version__))

stop_variation = 0.00001 # Variation threshold from previous iteration to stop iteration

def halley(i, coeffs, x_new, x_prev):
    """
    Halley's Method implementation
    """

    a0 = coeffs[0]
    a1 = coeffs[1]
    a2 = coeffs[2]
    a3 = coeffs[3]
    a4 = coeffs[4]

    with tf.GradientTape() as g:
        g.watch(x_new)
        with tf.GradientTape() as gg:
            gg.watch(x_new)
            f = a0 + a1 * x_new + a2 * x_new**2 + a3 * x_new**3 + a4 * x_new**4
        df_dx = gg.gradient(f, x_new)
    df_dx2 = g.gradient(df_dx, x_new)

    numerator = 2 * f * df_dx
    denominator = 2 * df_dx * df_dx - f * df_dx2
    new_x = x_new - (numerator / denominator)
    prev_x = x_new
    print("Root approximation in step {0} = {1}".format(i, new_x))
    return [i + 1, coeffs, new_x, prev_x]

def condition(i, a, x_new, x_prev):
    variation = tf.abs(x_new - x_prev)
    return tf.less(stop_variation, variation)

tf.enable_eager_execution()

a = tf.constant([2.0, -4.0, 1.0, 2.0, 0.0])  # coefficients a0..a4
x = tf.constant(40.0)
xprev = tf.constant(100.0)

roots = tf.while_loop(
    condition,
    halley,
    loop_vars=[1, a, x, xprev],
    maximum_iterations=1000)

print("Result after {0} iterations is {1}.".format(roots[0]-1, roots[2]))
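As a sanity check (not part of the original answer), the coefficients above describe f(x) = 2 − 4x + x² + 2x³ (the a4 term is zero, so the quartic degenerates to a cubic), and its real root can be cross-checked with numpy.roots:

```python
import numpy as np

# numpy.roots expects coefficients highest-degree first:
# f(x) = 2*x**3 + 1*x**2 - 4*x + 2
poly = [2.0, 1.0, -4.0, 2.0]
all_roots = np.roots(poly)

# Keep only the (numerically) real roots.
real_roots = all_roots[np.abs(all_roots.imag) < 1e-9].real
print(real_roots)  # one real root near -1.86
```

This cubic has a positive local minimum, so it has exactly one real root; a Halley iteration started near that root should agree with the numpy value.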