In R keras, a shared layer can be defined as follows, as shown at https://keras.rstudio.com/articles/functional_api.html or https://tensorflow.rstudio.com/keras/articles/about_keras_models.html:
tweet_a <- layer_input(shape = c(140, 256))
tweet_b <- layer_input(shape = c(140, 256))
# This layer can take as input a matrix and will return a vector of size 64
shared_lstm <- layer_lstm(units = 64)
# When we reuse the same layer instance multiple times, the weights of the layer are also
# being reused (it is effectively *the same* layer)
encoded_a <- tweet_a %>% shared_lstm
encoded_b <- tweet_b %>% shared_lstm
# We can then concatenate the two vectors and add a logistic regression on top
predictions <- layer_concatenate(c(encoded_a, encoded_b), axis = -1) %>%
  layer_dense(units = 1, activation = 'sigmoid')
# We define a trainable model linking the tweet inputs to the predictions
model <- keras_model(inputs = c(tweet_a, tweet_b), outputs = predictions)
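
For completeness, a model defined this way is compiled and fit like any other keras model. The snippet below is my own minimal sketch; data_a, data_b and labels are made-up placeholder arrays, not part of the original example.

# Placeholder data: 100 "tweets" per input, each a 140 x 256 matrix, with binary labels
data_a <- array(runif(100 * 140 * 256), dim = c(100, 140, 256))
data_b <- array(runif(100 * 140 * 256), dim = c(100, 140, 256))
labels <- sample(0:1, 100, replace = TRUE)

model %>% compile(
  optimizer = 'rmsprop',
  loss = 'binary_crossentropy',
  metrics = c('accuracy')
)

model %>% fit(list(data_a, data_b), labels, epochs = 10, batch_size = 32)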
When you look at what shared_lstm actually is, you get

<keras.layers.recurrent.LSTM>

i.e., it is not a symbolic tensor. At least, when I tried to build a shared stack of layers with

shared_lstm <- layer_lstm(units = 64, return_sequences = TRUE) %>% layer_lstm(units = 64)

it did not work, because I got the following error:
Error in py_call_impl(callable, dots$args, dots$keywords) :
ValueError: Layer lstm_15 was called with an input that isn't a symbolic tensor.
Received type: <class 'keras.layers.recurrent.LSTM'>.
Full input: [<keras.layers.recurrent.LSTM object at 0x000000016CF8A2B0>].
All inputs to the layer should be tensors.
The only way I could make it work quickly becomes cumbersome once the structure gets larger... What is the right way to do this? Maybe by creating a "layer node"?
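
For reference, the workaround hinted at above presumably amounts to creating each shared layer once as its own object and piping every input through the whole stack by hand. A minimal sketch under that assumption (the names shared_lstm_1 and shared_lstm_2 are mine, not from the original post):

# Each shared layer is created exactly once...
shared_lstm_1 <- layer_lstm(units = 64, return_sequences = TRUE)
shared_lstm_2 <- layer_lstm(units = 64)

# ...and every input has to be piped through the whole stack by hand,
# which is what becomes tedious as the shared block grows
encoded_a <- tweet_a %>% shared_lstm_1 %>% shared_lstm_2
encoded_b <- tweet_b %>% shared_lstm_1 %>% shared_lstm_2

Wrapping the shared stack in an ordinary R function that takes a tensor and pipes it through both layers cuts down the repetition somewhat, but every layer still has to exist as a separately named object.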