TensorFlow: tf.slice with an Input layer

Date: 2019-12-17 11:01:21

Tags: tensorflow

I am trying to implement dynamic embeddings in TensorFlow, i.e. an embedding per time step. I want to slice out the relevant month for the computation, but I have not managed to slice along that dimension based on an input tensor.

In other words: given an input month, for each user I need to slice out a different part of the embedding array (at offset month * latent_dim, of length latent_dim). Is this possible?

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# user_count, item_count and months_count are defined elsewhere
latent_dim_users = 2
latent_dim_items = 2
learning_rate = 0.001

# Input placeholders
item_input = layers.Input(shape=(1,),  name='Item')
user_input = layers.Input(shape=(1,), name='User')
month_input = layers.Input(shape=(1,), name='Month', dtype='int32')

# User embedding per month
latent_dim_users_time = latent_dim_users * months_count  # embedding per month
user_embedding_layer = layers.Embedding(user_count, latent_dim_users_time, 
                                        name='user_month_embedding', input_length=1)
user_embedding = layers.Flatten()(user_embedding_layer(user_input))
user_month_embedding = user_embedding[:, month_input:month_input+latent_dim_users]  # WHAT TO DO HERE

# Embedding layer for the items
item_embedding_layer = layers.Embedding(item_count, latent_dim_items, 
                                        name='item_embedding', input_length=1)
item_embedding = layers.Flatten()(item_embedding_layer(item_input))


# Calculate scores (proxies for the ratings)
scores = layers.Dot(axes=1, name='scores')([user_month_embedding, item_embedding])

# Output 
# There are some choices here between e.g. sigmoid (not exploding) or relu (better gradients)
# - RELU: better gradient. When using this, also use from_logits=True in BinaryCrossentropy
# - sigmoid: good probability, worse gradient
probs = layers.Activation('sigmoid')(scores)

# Define model and compile
model = keras.Model([user_input, item_input, month_input], probs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate), 
              loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=['accuracy'])
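One possible approach (a sketch, not from the original post): instead of Python-style slicing with a tensor index, build per-example column indices `month * latent_dim_users + [0 .. latent_dim_users)` and select them with `tf.gather(..., batch_dims=1)`. The sizes and values below are hypothetical, chosen only to make the index arithmetic visible:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical sizes, for illustration only
latent_dim_users = 2
months_count = 3  # so each user row holds months_count * latent_dim_users = 6 values

# Fake flattened per-user embeddings: batch of 2 examples
user_embedding = tf.constant([[0., 1., 2., 3., 4., 5.],
                              [10., 11., 12., 13., 14., 15.]])
month = tf.constant([[0], [2]], dtype=tf.int32)  # month index per example, shape (batch, 1)

# Per-example column indices: month * latent_dim_users + [0, 1] -> shape (batch, latent_dim_users)
idx = month * latent_dim_users + tf.range(latent_dim_users, dtype=tf.int32)
user_month_embedding = tf.gather(user_embedding, idx, batch_dims=1)
print(user_month_embedding.numpy())  # row 0 -> [0, 1], row 1 -> [14, 15]

# Inside the Keras graph, the same logic can be wrapped in a Lambda layer
# and fed [user_embedding, month_input]:
user_month_slice = layers.Lambda(
    lambda t: tf.gather(t[0],
                        t[1] * latent_dim_users + tf.range(latent_dim_users, dtype=tf.int32),
                        batch_dims=1),
    name='user_month_slice')
```

With this, the problematic line in the question would become `user_month_embedding = user_month_slice([user_embedding, month_input])`, and `month_input` must be listed among the model inputs.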

0 Answers:

There are no answers yet.