ValueError: Fetch argument <tf.Variable 'dense_1/kernel:0' shape=(2048, 512) dtype=float32_ref> cannot be interpreted as a Tensor

Time: 2019-03-04 14:30:40

Tags: keras grid-search

I am trying to solve a binary classification problem in which a classifier predicts whether a molecule is a "mutagen" based on bits extracted from Morgan fingerprints (a typical bioinformatics problem). For those unfamiliar with Morgan fingerprints, just think of the dataset as "one observation is a bit vector (1, 0, 1, 1, 1, 0, 1, 0, 0, ...) and the bits are the features".
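For readers unfamiliar with the format, here is a minimal sketch (not the actual data pipeline from the repository) of what such a feature matrix could look like; the 2048 columns are assumed from the (2048, 512) kernel shape in the error message, and the values are random placeholders.

import numpy as np

## One row per molecule, one column per fingerprint bit (2048 bits assumed)
n_samples, n_bits = 1000, 2048
X = np.random.randint(0, 2, size=(n_samples, n_bits))  ## binary fingerprint features
y = np.random.randint(0, 2, size=n_samples)            ## 1 = mutagen, 0 = not a mutagen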

The attached code is where I run a grid search to tune the hyperparameters. I am most interested in the following hyperparameters (a sketch of how they can be combined into a search grid follows the list):

  • Learning rate (most important) and momentum
  • Activation function
  • Dropout rate
  • Number of layers
  • Number of neurons in the hidden layers
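As mentioned above, here is a hedged sketch of how such a grid can be assembled with itertools.product; the candidate values are illustrative placeholders, not the ones used in the repository. The tuple order matches how gridSearch unpacks permute[0..6] further down: (units, activation, optimizer, learning_rate, num_layers, epochs, batch_size).

from itertools import product

units_grid         = [128, 512]
activation_grid    = ['relu', 'tanh']
optimizer_grid     = ['adam', 'sgd']
learning_rate_grid = [1e-2, 1e-3]
num_layers_grid    = [1, 2]
epochs_grid        = [10]
batch_size_grid    = [32]

## All possible permutations of the candidate values, in the order gridSearch expects
hyperparameters = product(units_grid, activation_grid, optimizer_grid,
                          learning_rate_grid, num_layers_grid,
                          epochs_grid, batch_size_grid)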

The code runs perfectly; however, after many iterations (~811) I get the following error, which I cannot make sense of:

ValueError: Fetch argument cannot be interpreted as a Tensor. (Tensor Tensor("dense_1/kernel:0", shape=(2048, 512), dtype=float32_ref) is not an element of this graph.)

I have not found anything relevant on Stack Overflow. People usually get this error when working with tf.Graph(), but that is not the case here. The code is rather long for a Stack Overflow post, so instead I am leaving a link to the GitHub page and asking you to take a look. Look for the StackOverflow/ directory in that repository; there you will find a screenshot of the error as well as the full code. Also, I think the most interesting part to show here is the grid search function I use in the code, because I happen to have implemented my own instead of using the one from sklearn, simply because I was instructed to do so. Therefore, I strongly suspect the error occurs inside this function.

Any help with this problem will be greatly appreciated!

## Imports needed by the snippet below; X_train, y_train, X_test, y_test, f (a log
## file handle) and build_classifier() are defined elsewhere in the linked repository.
import numpy as np
from keras.models import Sequential
from sklearn.metrics import roc_auc_score

## Fix random seed for reproducibility
seed = 7
np.random.seed(seed)
def gridSearch(hyperparameters):
    '''
    Wrapper function that takes permutations of hyperparameters from itertools.product and
    builds a new model according to these parameters. Analogous to GridSearchCV from
    sklearn. Lastly, it returns the best ROC AUC score and the best parameters.
    Also prints out all parameters it tests (the progress of the grid search).

    Parameters:
    -------
        hyperparameters:
            (Iterable) itertools.product object representing all possible permutations of hyperparameters.
    '''
    ## Values to be reported
    best_score = 0
    params = {}
    best_params = {}
    best_model = Sequential()
    model_nr = 0   ## incremented at the top of each iteration, so the first model is number 1
    best_model_nr = 0

    for permute in hyperparameters:
        ## Record of permuted values
        model_nr += 1
        params['units'] = permute[0]
        params['activation'] = permute[1]
        params['optimizer'] = permute[2]
        params['learning_rate'] = permute[3]
        params['num_layers'] = permute[4]
        # params['dropout_rate'] = permute[5]
        params['epochs'] = permute[5]
        params['batch_size'] = permute[6]

        ## Build desired model from the wrapped function
        model = build_classifier(units=permute[0], activation=permute[1],
                                 optimizer=permute[2], learning_rate=permute[3],
                                 num_layers=permute[4])

        ## Train the model
        model.fit(X_train, y_train, batch_size=permute[6], epochs=permute[5])
        ## Predict on the test data
        y_pred = model.predict(X_test)
        ## Evaluate model performance using the ROC AUC score as a metric
        auc = roc_auc_score(y_test, y_pred)

        ## Report performance after each permutation
        message = "Model number {}:\n".format(model_nr)
        print(message)
        f.write(message)
        message = "Used Hyperparameters:\n" + str(params) +'\n'
        print(message)
        f.write(message)
        message = 'auc score: {}'.format(auc)+'\n\n'
        print(message)
        f.write(message)

        if best_score < auc:
            best_score = auc
            best_params = params.copy()  ## copy the dict; otherwise later iterations overwrite the stored best parameters
            best_model = model
            best_model_nr = model_nr
        #### end of for loop! ####

    ## Serialize model to JSON
    with open("best_model.json", "w") as json_file:
        json_file.write(best_model.to_json())
    ## Serialize weights to HDF5
    best_model.save_weights("model.h5")
    print("Saved model to disk\n")

    message = '\n\nBest model (Nr. {}):\n'.format(best_model_nr)
    print(message)
    f.write(message)
    message = "Used Hyperparameters:\n" + str(best_params) +'\n'
    print(message)
    f.write(message)
    message = 'auc score: {}'.format(best_score)+'\n'
    print(message)
    f.write(message)

    return best_score, best_params, best_model
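For completeness, a hedged sketch of how this function could be invoked, assuming the illustrative grid above; the log file name is a placeholder, and f must be opened before the call because gridSearch writes to it as a global.

## Hypothetical driver code (names and values are placeholders)
f = open("grid_search_log.txt", "w")
best_score, best_params, best_model = gridSearch(hyperparameters)
f.close()

print("Best ROC AUC: {:.4f}".format(best_score))
print("Best hyperparameters:", best_params)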

0 Answers:

No answers