Why does my code give different results on every run?

Asked: 2020-05-05 08:36:11

Tags: python tensorflow neural-network

For the past few days my code was reproducible on every run - now it isn't anymore! I don't know what happened; I only changed a few lines of code, and I don't know how to fix it!

# Make the code reproducible
import numpy as np
import os
import random as rn
import tensorflow as tf
import keras
from keras import backend as K

#-----------------------------Keras reproducible------------------#
SEED = 1234

tf.set_random_seed(SEED)
os.environ['PYTHONHASHSEED'] = str(SEED)
np.random.seed(SEED)
rn.seed(SEED)

session_conf = tf.ConfigProto(
    intra_op_parallelism_threads=1, 
    inter_op_parallelism_threads=1
)
sess = tf.Session(
    graph=tf.get_default_graph(), 
    config=session_conf
)
K.set_session(sess)

# Import datasets (training and test)
import pandas as pd
poker_train = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-training-true.data")
poker_test = pd.read_csv("C:/Users/elihe/Documents/Studium Master/WS 19 und 20/Softwareprojekt/poker-hand-testing.data")

X_tr = poker_train.iloc[:, 0:10]
y_train = poker_train.iloc[:, 10:11]
X_te = poker_test.iloc[:, 0:10]
y_test = poker_test.iloc[:, 10:11]

from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_tr)
X_test = sc.transform(X_te)

# Build the NN with Keras
from keras.models import Sequential
from keras.layers import Dense

nen = Sequential()
nen.add(Dense(100, input_dim=10, activation='relu'))
nen.add(Dense(50, activation='relu'))
nen.add(Dense(10, activation='softmax'))

# Compile
from keras import metrics
nen.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
nen_fit = nen.fit(X_train, y_train, epochs=2, batch_size=50, verbose=1, validation_split=0.2, shuffle=False)
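
One way to check whether this setup is still deterministic within a single process is to reset the seeds, rebuild the model, train it twice and compare the loss histories. This is only a sketch and not part of the original code: it reuses the imports, SEED and the scaled data from the script above, reset_all is a hypothetical helper, and K.clear_session() discards the graph (including the model trained above).

# Sketch, not part of the original script: retrain twice with freshly reset seeds
# and compare the loss curves; identical curves mean the setup is deterministic.
def reset_all(seed=SEED):
    K.clear_session()                        # fresh graph so op-level seeds line up
    tf.set_random_seed(seed)                 # graph-level seed, as in the block above
    os.environ['PYTHONHASHSEED'] = str(seed)
    np.random.seed(seed)
    rn.seed(seed)
    conf = tf.ConfigProto(intra_op_parallelism_threads=1,
                          inter_op_parallelism_threads=1)
    K.set_session(tf.Session(graph=tf.get_default_graph(), config=conf))

loss_curves = []
for _ in range(2):
    reset_all()
    m = Sequential()
    m.add(Dense(100, input_dim=10, activation='relu'))
    m.add(Dense(50, activation='relu'))
    m.add(Dense(10, activation='softmax'))
    m.compile(loss='sparse_categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
    hist = m.fit(X_train, y_train, epochs=2, batch_size=50, verbose=0, shuffle=False)
    loss_curves.append(hist.history['loss'])

print(loss_curves[0] == loss_curves[1])      # True on CPU if the run is deterministic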

I ran only 2 epochs so that I can see immediately whether my outputs are identical; normally it would be 500 epochs. Until today the first lines of the code made everything reproducible, but now they don't! I changed the X_te and X_tr part, because before I one-hot encoded the classes y_train and y_test and now I don't. I also changed the activation function from sigmoid to relu and the optimizer from RMSprop to adam. I don't know what to do :(
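
For comparison, the earlier one-hot setup mentioned above would have looked roughly like this. This is a sketch only, not the exact original code: it assumes keras.utils.to_categorical was used for the encoding, together with the sigmoid hidden layers and RMSprop optimizer described.

# Sketch of the previous variant (assumed, not the exact original): one-hot labels
# with categorical_crossentropy, sigmoid hidden layers and the RMSprop optimizer.
from keras.utils import to_categorical

y_train_oh = to_categorical(y_train, num_classes=10)
y_test_oh = to_categorical(y_test, num_classes=10)

nen_oh = Sequential()
nen_oh.add(Dense(100, input_dim=10, activation='sigmoid'))
nen_oh.add(Dense(50, activation='sigmoid'))
nen_oh.add(Dense(10, activation='softmax'))
nen_oh.compile(loss='categorical_crossentropy', optimizer='rmsprop',
               metrics=['accuracy'])
nen_oh.fit(X_train, y_train_oh, epochs=2, batch_size=50, verbose=1,
           validation_split=0.2, shuffle=False)

With sparse_categorical_crossentropy the integer class column can be passed to fit() directly, which is why the one-hot step could be dropped.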

0 Answers
