TypeError: ('Not JSON Serializable:', Dimension(2048))

Date: 2018-02-08 17:31:01

Tags: keras

The model's input shape is the 2048 transfer values obtained from the Inception model.

What I am trying to do is rework this code https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/08_Transfer_Learning.ipynb using the Keras API.

Everything goes fine until I try to save the model.

When I try to save it, it raises TypeError: ('Not JSON Serializable:', Dimension(2048)).

I can save other models without any problem.

I don't understand why this one doesn't work.

I tried saving it on Windows 10 (python_ver = 3.6, tensorflow_ver = 1.6rc0) and on Ubuntu 16.04 (python_ver = 3.6, tensorflow_ver = 1.3).

I created the model with the code below.

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import InputLayer
from tensorflow.python.keras.layers import Dense


# Declare variables for model.
transfer_len = 2048
num_classes = 3


# Model creation.
model = Sequential()
# Input layer of shape 2048.
model.add(InputLayer(input_shape = (transfer_len,)))
# Fully connected 1024.
model.add(Dense(1024, activation='relu'))
# Output layer.
model.add(Dense(num_classes, activation='softmax'))


from tensorflow.python.keras.optimizers import Adam

optimizer = Adam(lr=1e-3)

model.compile(optimizer = optimizer,
              loss = 'categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x = transfer_values_train,
          y = labels_train,
          epochs = 20, batch_size = 100, verbose=0)

output_path = "model.keras"
model.save(output_path)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-22-6a252d3d7102> in <module>()
----> 1 model.save(output_path)

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\engine\topology.py in save(self, filepath, overwrite, include_optimizer)
   1044     """
   1045     from tensorflow.python.keras._impl.keras.models import save_model  # pylint: disable=g-import-not-at-top
-> 1046     save_model(self, filepath, overwrite, include_optimizer)
   1047 
   1048   def save_weights(self, filepath, overwrite=True):

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\models.py in save_model(model, filepath, overwrite, include_optimizer)
    131             'config': model.get_config()
    132         },
--> 133         default=get_json_type).encode('utf8')
    134 
    135     model_weights_group = f.create_group('model_weights')

~\Anaconda3\envs\gpu\lib\json\__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    236         check_circular=check_circular, allow_nan=allow_nan, indent=indent,
    237         separators=separators, default=default, sort_keys=sort_keys,
--> 238         **kw).encode(obj)
    239 
    240 

~\Anaconda3\envs\gpu\lib\json\encoder.py in encode(self, o)
    197         # exceptions aren't as detailed.  The list call should be roughly
    198         # equivalent to the PySequence_Fast that ''.join() would do.
--> 199         chunks = self.iterencode(o, _one_shot=True)
    200         if not isinstance(chunks, (list, tuple)):
    201             chunks = list(chunks)

~\Anaconda3\envs\gpu\lib\json\encoder.py in iterencode(self, o, _one_shot)
    255                 self.key_separator, self.item_separator, self.sort_keys,
    256                 self.skipkeys, _one_shot)
--> 257         return _iterencode(o, 0)
    258 
    259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

~\Anaconda3\envs\gpu\lib\site-packages\tensorflow\python\keras\_impl\keras\models.py in get_json_type(obj)
    113       return obj.__name__
    114 
--> 115     raise TypeError('Not JSON Serializable:', obj)
    116 
    117   from tensorflow.python.keras._impl.keras import __version__ as keras_version  # pylint: disable=g-import-not-at-top

TypeError: ('Not JSON Serializable:', Dimension(2048))

1 Answer:

Answer 0 (score: 2)

OK, so the transfer_len variable is of type 'tensorflow.python.framework.tensor_shape.Dimension'.
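
For context, a value like this usually comes from indexing a tensor's shape in TF 1.x. A minimal, illustrative sketch (the variable names here are assumptions, not the tutorial's actual code):

import tensorflow as tf

# In TF 1.x, indexing a TensorShape yields a Dimension object, not a plain int.
transfer_layer = tf.placeholder(tf.float32, shape=[None, 2048])
transfer_len = transfer_layer.shape[1]

print(repr(transfer_len))  # Dimension(2048)
print(type(transfer_len))  # <class 'tensorflow.python.framework.tensor_shape.Dimension'>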

I changed it to an int and the model saved without problems.
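
A minimal sketch of the fix, assuming transfer_len was obtained from a tensor shape as in the sketch above: cast it to a plain Python int before building the model, so the layer config can be serialized to JSON.

# Cast the Dimension to an int (transfer_layer.shape[1].value works as well).
transfer_len = int(transfer_layer.shape[1])

model = Sequential()
model.add(InputLayer(input_shape=(transfer_len,)))
model.add(Dense(1024, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))
# ... compile, fit, and save exactly as before; model.save(output_path) now works.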