BERT - extracting values from all layers

Time: 2019-06-29 15:42:05

Tags: python tensorflow tensorflow-estimator word-embedding bert-language-model

I am trying to get word embeddings from BERT. I have fine-tuned a sentiment-classification model that predicts whether a sentence is positive or negative, but now I need to extract the values from all layers in order to get a featurized representation.

I have already tried something like

all_layers = model.get_all_encoder_layers()

but it does not work.

# Compute the number of train and warmup steps from the batch size
num_train_steps = int(len(train_features) / BATCH_SIZE * NUM_TRAIN_EPOCHS)
num_warmup_steps = int(num_train_steps * WARMUP_PROPORTION)

# Specify output directory and number of checkpoint steps to save
run_config = tf.estimator.RunConfig(
    model_dir=OUTPUT_DIR,
    save_summary_steps=SAVE_SUMMARY_STEPS,
    save_checkpoints_steps=SAVE_CHECKPOINTS_STEPS)

model_fn = model_fn_builder(
    num_labels=len(label_list),
    learning_rate=LEARNING_RATE,
    num_train_steps=num_train_steps,
    num_warmup_steps=num_warmup_steps)

estimator = tf.estimator.Estimator(
    model_fn=model_fn,
    config=run_config,
    params={"batch_size": BATCH_SIZE})

# Create an input function for training. drop_remainder = True for TPUs.
train_input_fn = bert.run_classifier.input_fn_builder(
    features=train_features,
    seq_length=MAX_SEQ_LENGTH,
    is_training=True,
    drop_remainder=False)
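
With these pieces in place, training is presumably kicked off with the usual Estimator call, e.g.:

estimator.train(input_fn=train_input_fn, max_steps=num_train_steps)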

Do I have to change model_fn in order to get the values from all layers of BERT?
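
Not a definitive answer, but a minimal sketch of one way to do this with the bert-tensorflow package: rebuild the modeling.BertModel graph inside a prediction-only model_fn, call get_all_encoder_layers() on it (the method lives on the BertModel graph object, not on the Estimator), and expose each layer as a named prediction. The names all_layers_model_fn_builder, BERT_CONFIG_FILE and init_checkpoint are assumptions standing in for whatever the existing fine-tuning code defines; the checkpoint is assumed to be the fine-tuned one written to OUTPUT_DIR.

import tensorflow as tf
from bert import modeling

def all_layers_model_fn_builder(bert_config, init_checkpoint):
    """Hypothetical model_fn builder that returns every encoder layer (PREDICT mode only)."""

    def model_fn(features, labels, mode, params):
        # Rebuild the BERT graph the same way create_model does during fine-tuning.
        model = modeling.BertModel(
            config=bert_config,
            is_training=False,
            input_ids=features["input_ids"],
            input_mask=features["input_mask"],
            token_type_ids=features["segment_ids"],
            use_one_hot_embeddings=False)

        # Load the fine-tuned weights into the freshly built graph.
        tvars = tf.trainable_variables()
        assignment_map, _ = modeling.get_assignment_map_from_checkpoint(
            tvars, init_checkpoint)
        tf.train.init_from_checkpoint(init_checkpoint, assignment_map)

        # One [batch_size, seq_length, hidden_size] tensor per Transformer layer.
        all_layers = model.get_all_encoder_layers()
        predictions = {"layer_%d" % i: layer for i, layer in enumerate(all_layers)}
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

    return model_fn

Prediction over the sentences of interest would then look roughly like this, assuming predict_input_fn is built with run_classifier.input_fn_builder(..., is_training=False, drop_remainder=False) and BERT_CONFIG_FILE points at the pretrained model's bert_config.json:

predict_model_fn = all_layers_model_fn_builder(
    bert_config=modeling.BertConfig.from_json_file(BERT_CONFIG_FILE),
    init_checkpoint=tf.train.latest_checkpoint(OUTPUT_DIR))

predict_estimator = tf.estimator.Estimator(
    model_fn=predict_model_fn, params={"batch_size": BATCH_SIZE})

for result in predict_estimator.predict(input_fn=predict_input_fn):
    last_layer = result["layer_11"]  # last layer of BERT-Base, shape [seq_length, hidden_size]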

0 Answers:

There are no answers yet.