AttributeError: 'TensorSliceDataset' object has no attribute 'dtype'

Asked: 2019-01-28 22:54:22

Tags: python tensorflow

This is what I did:

def prepare_data(self, features, labels):
  assert features.shape[0] == labels.shape[0]
  print("DEBUG: features: shape = " + str(features.shape) \
    + " , dtype(0,0) = " + str(type(features[0,0])))
  print("DEBUG: labels: shape = " + str(labels.shape) \
    + ", dtype(0) = " + str(type(labels[0])))
  dataset = tf.data.Dataset.from_tensor_slices( (features, labels) )
  iterator = dataset.make_one_shot_iterator()
  return dataset, iterator

...

self.train_features = np.asarray(train_features_list)
self.train_labels = np.asarray(train_labels_list)
self.train_data, self.train_it = \
    self.prepare_data(self.train_features, self.train_labels)

hidden1 = tf.layers.dense(self.train_data,
    self.input_layer_size * 40,
    activation=tf.nn.relu,
    name='hidden1')

This is what I got:

DEBUG: features: shape = (4000, 3072) , dtype(0,0) = <class 'numpy.uint8'>
DEBUG: labels: shape = (4000,), dtype(0) = <class 'numpy.int64'>
...
AttributeError: 'TensorSliceDataset' object has no attribute 'dtype'

In tensorflow/python/layers/core.py, the error points to this code:

layer = Dense(units,
            activation=activation,
            use_bias=use_bias,
            kernel_initializer=kernel_initializer,
            bias_initializer=bias_initializer,
            kernel_regularizer=kernel_regularizer,
            bias_regularizer=bias_regularizer,
            activity_regularizer=activity_regularizer,
            kernel_constraint=kernel_constraint,
            bias_constraint=bias_constraint,
            trainable=trainable,
            name=name,
            dtype=inputs.dtype.base_dtype,
            _scope=name,
            _reuse=reuse)

Can you tell me what I am doing wrong?

1 Answer:

Answer 0 (score: 1)

tf.layers.dense expects a tensor as its input, but you are passing it a tf.data Dataset object. That is why it raises this error.
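For illustration, the Dataset object itself has no dtype attribute, while the tensors produced by its iterator do (a minimal sketch, assuming TF 1.x; the shapes here are arbitrary):

import numpy as np
import tensorflow as tf

features = np.zeros((4, 8), dtype=np.uint8)
labels = np.zeros((4,), dtype=np.int64)

dataset = tf.data.Dataset.from_tensor_slices((features, labels))
print(hasattr(dataset, 'dtype'))    # False -- why tf.layers.dense(dataset, ...) fails

feat_t, label_t = dataset.make_one_shot_iterator().get_next()
print(feat_t.dtype, label_t.dtype)  # <dtype: 'uint8'> <dtype: 'int64'>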

I have modified your code with an example, and it no longer raises the error. Also, the dense layer expects a 2-dimensional input, so I added batching inside the function to make the input 2-dimensional.

import numpy as np
import tensorflow as tf

def prepare_data(features, labels):
  assert features.shape[0] == labels.shape[0]
  print("DEBUG: features: shape = " + str(features.shape) \
    + " , dtype(0,0) = " + str(type(features[0,0])))
  print("DEBUG: labels: shape = " + str(labels.shape) \
    + ", dtype(0) = " + str(type(labels[0])))
  dataset = tf.data.Dataset.from_tensor_slices( (features, labels) )
  iterator = dataset.batch(1).make_one_shot_iterator() # Modified here
  return iterator # Returned only the iterator

train_features = np.random.randn(4000, 3072) 
train_labels = np.random.randn(4000)
train_it = prepare_data(train_features, train_labels)

input_data, input_label = train_it.get_next() # Getting the input feature from the iterator
hidden1 = tf.layers.dense(input_data, 40, activation=tf.nn.relu, name='hidden1') # Used 40 as an example

Result:

DEBUG: features: shape = (4000, 3072) , dtype(0,0) = <class 'numpy.float64'>
DEBUG: labels: shape = (4000,), dtype(0) = <class 'numpy.float64'>
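If you want to actually evaluate the layer, one way is to run it in a session (a sketch, assuming TF 1.x graph mode, continuing from the code above):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # the dense layer creates weight variables
    h = sess.run(hidden1)
    print(h.shape)  # (1, 40): one batch of size 1, 40 dense units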