How do I print the value of a TensorFlow tensor after applying conv-pool layers to it?

Asked: 2019-05-27 11:39:58

Tags: tensorflow keras conv-neural-network

After applying these layers to a TensorFlow tensor I can only see the object's shape. How do I get the actual values of the tensor after these layers?

def model_part2(a):
    # q = tf.global_variables_initializer()
    p = tf.keras.layers.Conv1D(192, 1)(a)
    # print(p.eval())
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)
    p = tf.keras.layers.Conv1D(256, 1)(p)
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)
    p = tf.keras.layers.Conv1D(512, 1)(p)
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)
    return p


1 Answer:

Answer 0 (score: 1)

In TensorFlow 2.x this is straightforward, because eager execution is enabled by default and tensors hold concrete values as soon as they are computed. Working code is shown below:

import tensorflow as tf

# Dummy input: a batch of 128 sequences, each of length 8192 with 1 channel
a = tf.constant(1.0, shape=[128, 8192, 1])

def model_part2(a):
    p = tf.keras.layers.Conv1D(192, 1)(a)
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)  # pool_size=1, strides=2: halves the sequence length
    p = tf.keras.layers.Conv1D(256, 1)(p)
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)
    p = tf.keras.layers.Conv1D(512, 1)(p)
    p = tf.keras.layers.ReLU()(p)
    p = tf.keras.layers.MaxPool1D(1, 2)(p)
    return p

q = model_part2(a)

# With eager execution (the default in TF 2.x) this prints the values directly
print(q)
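
Because eager execution is the default in TensorFlow 2.x, q is already a concrete EagerTensor. If you need a plain NumPy array rather than the printed tensor, a quick sketch using the standard .numpy() method on eager tensors:

q_np = q.numpy()        # convert the EagerTensor to a NumPy array
print(q_np.shape)       # (128, 1024, 512): each MaxPool1D with strides=2 halves the length
print(q_np[0, 0, :5])   # first five channel values at one position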

The print(q) call in the code above produces output like the following:

tf.Tensor(
[[[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]

 [[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]

 [[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]

 ...

 [[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]

 [[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]

 [[0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  ...
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]
  [0.06378555 0.00853285 0.03427356 ... 0.         0.         0.        ]]], shape=(128, 1024, 512), dtype=float32)
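
For completeness: the commented-out global_variables_initializer and p.eval() lines in the question suggest the original code was written for TensorFlow 1.x graph mode, where a tensor is only a symbolic node and has no value until it is run in a session. A minimal sketch of that approach, assuming a TensorFlow 1.x installation and reusing model_part2 from above:

import tensorflow as tf  # TensorFlow 1.x assumed here

a = tf.constant(1.0, shape=[128, 8192, 1])
p = model_part2(a)  # builds the graph; p is still symbolic at this point

with tf.Session() as sess:
    # The Conv1D layers create weight variables, so initialize them first
    sess.run(tf.global_variables_initializer())
    print(sess.run(p))  # prints the actual values instead of just the tensor object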