Linear regression on the Car Dekho dataset: validation loss lower than training loss

Date: 2021-01-09 20:00:04

Tags: python keras linear-regression loss-function

I am running a simple regression on the Car Dekho dataset (version 3, you can find it here), and I find that the validation loss is always lower than the training loss.

In general, I did some basic preprocessing: I extracted the numeric values from some columns, dropped one column (torque) and one outlier, min-max normalized the features, and one-hot encoded ("dummified") the categorical features. Then I ran the regression with Keras.

Here is my code:

import pandas as pd
import matplotlib.pyplot as plt
import tensorflow as tf
print(tf.__version__)
from tensorflow import keras
from tensorflow.keras import layers

pd.options.display.max_columns = None
pd.options.display.width = None


def pre_process(df_data):
    ## check NaNs and drop rows if any
    print(df_data.isnull().sum())
    df_data.dropna(inplace=True)

    ## drop weird outlier, turns out it has 1 km_driven
    df_data.drop([7913], inplace=True)

    ## rescale target to thousands
    df_data['selling_price'] = df_data['selling_price']/1000

    ## taking only first word in these columns (removing car models and units of measure)
    df_data['name'] = df_data['name'].map(lambda x: x.split(' ')[0])
    df_data['owner'] = df_data['owner'].map(lambda x: x.split(' ')[0])
    df_data['mileage'] = (df_data['mileage'].astype(str).apply(lambda x: x.split(' ')[0])).astype(float)
    df_data['engine'] = (df_data['engine'].astype(str).apply(lambda x: x.split(' ')[0])).astype(float)
    df_data['max_power'] = (df_data['max_power'].astype(str).apply(lambda x: x.split(' ')[0])).astype(float)
    df_data.drop(['torque'], axis=1, inplace=True)


    ## dummify categorical features
    df_data = pd.get_dummies(df_data, drop_first=True)

    ## data normalization (min-max)
    print('\tData normalization')
    df_data = normalize(df_data)
    return df_data

def normalize(df):
    result = df.copy()
    for feature_name in df.columns:
        ## not normalizing the target
        if feature_name == 'selling_price':
            continue
        # print(f'Normalizing {feature_name}')
        result[feature_name] = (df[feature_name] - df[feature_name].min()) / (df[feature_name].max() - df[feature_name].min())
        ## a constant column gives 0/0 = NaN here; drop it
        if result[feature_name].isnull().values.any():
            result.drop([feature_name], axis=1, inplace=True)
            print(f'Something wrong in {feature_name}, dropped.')
            print(f'now shape is {len(result)}, {len(result.columns)}')
    print(f'Returning {len(result)}, {len(result.columns)}')
    return result



def build_model():
    model = keras.Sequential([
        layers.Dense(1)
    ])
    ## larger model I also tried (same overall trend):
    # model = keras.Sequential([
    #     layers.Dense(64, activation='relu', input_shape=[len(train_dataset.keys())]),
    #     layers.Dense(64, activation='relu'),
    #     layers.Dense(1)
    # ])

    optimizer = tf.keras.optimizers.RMSprop(0.001)

    model.compile(loss='mse',
                  optimizer=optimizer,
                  metrics=['mae', 'mse'])
    return model



df_data = pd.read_csv('sample_data/car_details_v3.csv')

## data pre-processing
df_data = pre_process(df_data)

X = df_data.copy()
Y = X.pop('selling_price')


model = build_model()
history = model.fit(X, Y, validation_split=0.4, epochs=100, batch_size=500)


plt.plot(history.history['mse'])
plt.plot(history.history['val_mse'])
plt.title('model mse')
plt.ylabel('mse')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')
plt.show()

Here is the training output:

Epoch 1/100
10/10 [==============================] - 1s 23ms/step - loss: 1153458.8352 - mae: 674.3291 - mse: 1153458.8352 - val_loss: 971128.0000 - val_mae: 623.6071 - val_mse: 971128.0000
Epoch 2/100
10/10 [==============================] - 0s 6ms/step - loss: 1215337.1818 - mae: 674.7663 - mse: 1215337.1818 - val_loss: 971033.8125 - val_mae: 623.5282 - val_mse: 971033.8125
Epoch 3/100
10/10 [==============================] - 0s 7ms/step - loss: 1184839.4545 - mae: 679.9471 - mse: 1184839.4545 - val_loss: 970946.9375 - val_mae: 623.4553 - val_mse: 970946.9375
Epoch 4/100
10/10 [==============================] - 0s 6ms/step - loss: 1077032.0000 - mae: 651.7002 - mse: 1077032.0000 - val_loss: 970861.5625 - val_mae: 623.3839 - val_mse: 970861.5625
Epoch 5/100
10/10 [==============================] - 0s 6ms/step - loss: 1143421.6080 - mae: 663.8522 - mse: 1143421.6080 - val_loss: 970777.5625 - val_mae: 623.3135 - val_mse: 970777.5625
Epoch 6/100
10/10 [==============================] - 0s 6ms/step - loss: 1116523.0341 - mae: 657.3438 - mse: 1116523.0341 - val_loss: 970693.1250 - val_mae: 623.2427 - val_mse: 970693.1250
Epoch 7/100
10/10 [==============================] - 0s 6ms/step - loss: 1108121.0341 - mae: 659.6135 - mse: 1108121.0341 - val_loss: 970609.1875 - val_mae: 623.1724 - val_mse: 970609.1875
Epoch 8/100
10/10 [==============================] - 0s 7ms/step - loss: 1186299.5682 - mae: 672.1452 - mse: 1186299.5682 - val_loss: 970526.1250 - val_mae: 623.1025 - val_mse: 970526.1250
Epoch 9/100
10/10 [==============================] - 0s 6ms/step - loss: 1272202.3295 - mae: 679.9844 - mse: 1272202.3295 - val_loss: 970442.6250 - val_mae: 623.0325 - val_mse: 970442.6250
Epoch 10/100
10/10 [==============================] - 0s 6ms/step - loss: 1146808.7159 - mae: 668.4530 - mse: 1146808.7159 - val_loss: 970359.3125 - val_mae: 622.9626 - val_mse: 970359.3125
Epoch 11/100
10/10 [==============================] - 0s 6ms/step - loss: 1097632.0682 - mae: 649.6506 - mse: 1097632.0682 - val_loss: 970274.7500 - val_mae: 622.8919 - val_mse: 970274.7500
Epoch 12/100
10/10 [==============================] - 0s 5ms/step - loss: 1177763.6705 - mae: 673.2304 - mse: 1177763.6705 - val_loss: 970191.2500 - val_mae: 622.8217 - val_mse: 970191.2500
Epoch 13/100
10/10 [==============================] - 0s 6ms/step - loss: 1113369.6477 - mae: 659.2293 - mse: 1113369.6477 - val_loss: 970107.6875 - val_mae: 622.7516 - val_mse: 970107.6875
Epoch 14/100
10/10 [==============================] - 0s 6ms/step - loss: 1109816.4659 - mae: 658.1920 - mse: 1109816.4659 - val_loss: 970023.4375 - val_mae: 622.6811 - val_mse: 970023.4375
Epoch 15/100
10/10 [==============================] - 0s 6ms/step - loss: 1157198.1477 - mae: 658.9856 - mse: 1157198.1477 - val_loss: 969939.3750 - val_mae: 622.6106 - val_mse: 969939.3750
Epoch 16/100
10/10 [==============================] - 0s 6ms/step - loss: 1143274.8750 - mae: 657.9985 - mse: 1143274.8750 - val_loss: 969855.3750 - val_mae: 622.5402 - val_mse: 969855.3750
Epoch 17/100
10/10 [==============================] - 0s 6ms/step - loss: 1165513.4375 - mae: 670.8922 - mse: 1165513.4375 - val_loss: 969772.4375 - val_mae: 622.4705 - val_mse: 969772.4375
Epoch 18/100
10/10 [==============================] - 0s 7ms/step - loss: 1193134.1705 - mae: 671.8726 - mse: 1193134.1705 - val_loss: 969688.5000 - val_mae: 622.4000 - val_mse: 969688.5000
Epoch 19/100
10/10 [==============================] - 0s 8ms/step - loss: 1162164.2614 - mae: 663.2649 - mse: 1162164.2614 - val_loss: 969605.5000 - val_mae: 622.3302 - val_mse: 969605.5000
Epoch 20/100
10/10 [==============================] - 0s 6ms/step - loss: 1123151.9091 - mae: 658.6773 - mse: 1123151.9091 - val_loss: 969521.0625 - val_mae: 622.2594 - val_mse: 969521.0625
Epoch 21/100
10/10 [==============================] - 0s 6ms/step - loss: 1180219.1705 - mae: 672.3833 - mse: 1180219.1705 - val_loss: 969437.4375 - val_mae: 622.1892 - val_mse: 969437.4375
Epoch 22/100
10/10 [==============================] - 0s 6ms/step - loss: 1110395.1705 - mae: 658.6029 - mse: 1110395.1307 - val_loss: 969354.0625 - val_mae: 622.1191 - val_mse: 969354.0625
Epoch 23/100
10/10 [==============================] - 0s 6ms/step - loss: 1192435.0227 - mae: 670.6691 - mse: 1192435.0227 - val_loss: 969270.6250 - val_mae: 622.0491 - val_mse: 969270.6250
Epoch 24/100
10/10 [==============================] - 0s 6ms/step - loss: 1172744.5000 - mae: 668.4421 - mse: 1172744.5000 - val_loss: 969187.2500 - val_mae: 621.9789 - val_mse: 969187.2500
Epoch 25/100
10/10 [==============================] - 0s 6ms/step - loss: 1103317.3011 - mae: 655.8360 - mse: 1103317.3011 - val_loss: 969103.0625 - val_mae: 621.9084 - val_mse: 969103.0625
Epoch 26/100
10/10 [==============================] - 0s 6ms/step - loss: 1129796.2614 - mae: 660.2945 - mse: 1129796.2614 - val_loss: 969019.8125 - val_mae: 621.8384 - val_mse: 969019.8125
Epoch 27/100
10/10 [==============================] - 0s 6ms/step - loss: 1162134.1023 - mae: 665.4788 - mse: 1162134.1023 - val_loss: 968935.9375 - val_mae: 621.7680 - val_mse: 968935.9375
Epoch 28/100
10/10 [==============================] - 0s 6ms/step - loss: 1107790.3011 - mae: 660.0697 - mse: 1107790.1932 - val_loss: 968852.5000 - val_mae: 621.6978 - val_mse: 968852.5000
Epoch 29/100
10/10 [==============================] - 0s 6ms/step - loss: 1224679.1023 - mae: 675.6062 - mse: 1224679.1023 - val_loss: 968768.8750 - val_mae: 621.6275 - val_mse: 968768.8750
Epoch 30/100
10/10 [==============================] - 0s 6ms/step - loss: 1148625.7386 - mae: 663.3963 - mse: 1148625.7386 - val_loss: 968685.8125 - val_mae: 621.5575 - val_mse: 968685.8125
Epoch 31/100
10/10 [==============================] - 0s 22ms/step - loss: 1188917.5909 - mae: 670.0622 - mse: 1188917.5909 - val_loss: 968602.8125 - val_mae: 621.4877 - val_mse: 968602.8125
Epoch 32/100
10/10 [==============================] - 0s 6ms/step - loss: 1066631.7443 - mae: 648.4988 - mse: 1066631.7443 - val_loss: 968518.1875 - val_mae: 621.4167 - val_mse: 968518.1875
Epoch 33/100
10/10 [==============================] - 0s 7ms/step - loss: 1114294.8239 - mae: 661.9696 - mse: 1114294.8239 - val_loss: 968434.8125 - val_mae: 621.3465 - val_mse: 968434.8125
Epoch 34/100
10/10 [==============================] - 0s 6ms/step - loss: 1074721.6307 - mae: 650.8288 - mse: 1074721.6307 - val_loss: 968350.6250 - val_mae: 621.2759 - val_mse: 968350.6250
Epoch 35/100
10/10 [==============================] - 0s 6ms/step - loss: 1171792.5682 - mae: 663.4328 - mse: 1171792.5682 - val_loss: 968267.8750 - val_mae: 621.2062 - val_mse: 968267.8750
Epoch 36/100
10/10 [==============================] - 0s 6ms/step - loss: 1163355.4773 - mae: 669.8160 - mse: 1163355.4659 - val_loss: 968184.5625 - val_mae: 621.1361 - val_mse: 968184.5625
Epoch 37/100
10/10 [==============================] - 0s 6ms/step - loss: 1238443.6477 - mae: 680.7899 - mse: 1238443.6477 - val_loss: 968101.5000 - val_mae: 621.0661 - val_mse: 968101.5000
Epoch 38/100
10/10 [==============================] - 0s 6ms/step - loss: 1169701.4545 - mae: 667.5217 - mse: 1169701.4545 - val_loss: 968018.0000 - val_mae: 620.9960 - val_mse: 968018.0000
Epoch 39/100
10/10 [==============================] - 0s 6ms/step - loss: 1271506.4205 - mae: 687.1342 - mse: 1271506.4205 - val_loss: 967934.8750 - val_mae: 620.9259 - val_mse: 967934.8750
Epoch 40/100
10/10 [==============================] - 0s 6ms/step - loss: 1121816.8295 - mae: 660.9865 - mse: 1121816.8295 - val_loss: 967850.7500 - val_mae: 620.8553 - val_mse: 967850.7500
Epoch 41/100
10/10 [==============================] - 0s 6ms/step - loss: 1096724.3977 - mae: 658.3204 - mse: 1096724.3977 - val_loss: 967766.9375 - val_mae: 620.7850 - val_mse: 967766.9375
Epoch 42/100
10/10 [==============================] - 0s 6ms/step - loss: 1140753.9205 - mae: 659.0186 - mse: 1140753.9205 - val_loss: 967683.1875 - val_mae: 620.7144 - val_mse: 967683.1875
Epoch 43/100
10/10 [==============================] - 0s 6ms/step - loss: 1225529.2273 - mae: 678.8100 - mse: 1225529.2159 - val_loss: 967600.2500 - val_mae: 620.6446 - val_mse: 967600.2500
Epoch 44/100
10/10 [==============================] - 0s 6ms/step - loss: 1225968.1477 - mae: 681.2146 - mse: 1225968.1477 - val_loss: 967517.0000 - val_mae: 620.5745 - val_mse: 967517.0000
Epoch 45/100
10/10 [==============================] - 0s 6ms/step - loss: 1133017.5852 - mae: 659.2329 - mse: 1133017.5852 - val_loss: 967434.1250 - val_mae: 620.5045 - val_mse: 967434.1250
Epoch 46/100
10/10 [==============================] - 0s 7ms/step - loss: 1161041.1477 - mae: 665.9619 - mse: 1161041.1477 - val_loss: 967350.5625 - val_mae: 620.4342 - val_mse: 967350.5625
Epoch 47/100
10/10 [==============================] - 0s 6ms/step - loss: 1142406.4489 - mae: 658.8358 - mse: 1142406.4489 - val_loss: 967266.9375 - val_mae: 620.3637 - val_mse: 967266.9375
Epoch 48/100
10/10 [==============================] - 0s 6ms/step - loss: 1180957.3295 - mae: 667.2453 - mse: 1180957.3295 - val_loss: 967183.6250 - val_mae: 620.2936 - val_mse: 967183.6250
Epoch 49/100
10/10 [==============================] - 0s 6ms/step - loss: 1221151.4545 - mae: 670.1398 - mse: 1221151.4545 - val_loss: 967100.0000 - val_mae: 620.2233 - val_mse: 967100.0000
Epoch 50/100
10/10 [==============================] - 0s 6ms/step - loss: 1147189.1136 - mae: 667.5689 - mse: 1147189.1136 - val_loss: 967016.7500 - val_mae: 620.1532 - val_mse: 967016.7500
Epoch 51/100
10/10 [==============================] - 0s 6ms/step - loss: 1083726.7898 - mae: 656.2614 - mse: 1083726.7898 - val_loss: 966932.6250 - val_mae: 620.0825 - val_mse: 966932.6250
Epoch 52/100
10/10 [==============================] - 0s 6ms/step - loss: 1091075.5625 - mae: 651.7170 - mse: 1091075.5625 - val_loss: 966849.5000 - val_mae: 620.0122 - val_mse: 966849.5000
Epoch 53/100
10/10 [==============================] - 0s 6ms/step - loss: 1208690.3523 - mae: 674.1986 - mse: 1208690.3523 - val_loss: 966766.5625 - val_mae: 619.9423 - val_mse: 966766.5625
Epoch 54/100
10/10 [==============================] - 0s 6ms/step - loss: 1147322.6023 - mae: 660.8913 - mse: 1147322.6023 - val_loss: 966683.3750 - val_mae: 619.8720 - val_mse: 966683.3750
Epoch 55/100
10/10 [==============================] - 0s 6ms/step - loss: 1176865.5000 - mae: 669.8265 - mse: 1176865.5000 - val_loss: 966600.3125 - val_mae: 619.8020 - val_mse: 966600.3125
Epoch 56/100
10/10 [==============================] - 0s 6ms/step - loss: 1120454.6193 - mae: 654.7875 - mse: 1120454.6193 - val_loss: 966516.3125 - val_mae: 619.7313 - val_mse: 966516.3125
Epoch 57/100
10/10 [==============================] - 0s 6ms/step - loss: 1142349.9489 - mae: 663.7417 - mse: 1142349.9489 - val_loss: 966433.0625 - val_mae: 619.6611 - val_mse: 966433.0625
Epoch 58/100
10/10 [==============================] - 0s 6ms/step - loss: 1215809.9205 - mae: 674.0759 - mse: 1215809.9205 - val_loss: 966350.5625 - val_mae: 619.5913 - val_mse: 966350.5625
Epoch 59/100
10/10 [==============================] - 0s 6ms/step - loss: 1111082.5227 - mae: 650.4422 - mse: 1111082.5227 - val_loss: 966267.0625 - val_mae: 619.5209 - val_mse: 966267.0625
Epoch 60/100
10/10 [==============================] - 0s 6ms/step - loss: 1044930.6364 - mae: 646.1296 - mse: 1044930.6364 - val_loss: 966183.5625 - val_mae: 619.4506 - val_mse: 966183.5625
Epoch 61/100
10/10 [==============================] - 0s 6ms/step - loss: 1099017.7898 - mae: 652.3196 - mse: 1099017.7898 - val_loss: 966099.8125 - val_mae: 619.3799 - val_mse: 966099.8125
Epoch 62/100
10/10 [==============================] - 0s 6ms/step - loss: 1141697.9432 - mae: 663.8691 - mse: 1141697.9432 - val_loss: 966016.8125 - val_mae: 619.3100 - val_mse: 966016.8125
Epoch 63/100
10/10 [==============================] - 0s 6ms/step - loss: 1088156.5000 - mae: 651.8880 - mse: 1088156.5000 - val_loss: 965934.0625 - val_mae: 619.2400 - val_mse: 965934.0625
Epoch 64/100
10/10 [==============================] - 0s 6ms/step - loss: 1122994.9886 - mae: 657.3662 - mse: 1122994.9886 - val_loss: 965850.1250 - val_mae: 619.1694 - val_mse: 965850.1250
Epoch 65/100
10/10 [==============================] - 0s 7ms/step - loss: 1107060.2557 - mae: 654.3957 - mse: 1107060.2557 - val_loss: 965767.1875 - val_mae: 619.0991 - val_mse: 965767.1875
Epoch 66/100
10/10 [==============================] - 0s 7ms/step - loss: 1117652.1761 - mae: 659.7799 - mse: 1117652.1761 - val_loss: 965684.1250 - val_mae: 619.0292 - val_mse: 965684.1250
Epoch 67/100
10/10 [==============================] - 0s 6ms/step - loss: 1128311.1761 - mae: 662.3255 - mse: 1128311.1761 - val_loss: 965601.8125 - val_mae: 618.9595 - val_mse: 965601.8125
Epoch 68/100
10/10 [==============================] - 0s 7ms/step - loss: 1298959.3864 - mae: 693.7738 - mse: 1298959.3864 - val_loss: 965519.4375 - val_mae: 618.8898 - val_mse: 965519.4375
Epoch 69/100
10/10 [==============================] - 0s 6ms/step - loss: 1220975.2614 - mae: 678.0371 - mse: 1220975.2614 - val_loss: 965436.2500 - val_mae: 618.8194 - val_mse: 965436.2500
Epoch 70/100
10/10 [==============================] - 0s 6ms/step - loss: 1186716.1023 - mae: 670.5737 - mse: 1186716.1023 - val_loss: 965352.6875 - val_mae: 618.7488 - val_mse: 965352.6875
Epoch 71/100
10/10 [==============================] - 0s 6ms/step - loss: 1184281.0341 - mae: 668.3951 - mse: 1184281.0341 - val_loss: 965269.0625 - val_mae: 618.6783 - val_mse: 965269.0625
Epoch 72/100
10/10 [==============================] - 0s 7ms/step - loss: 1131357.1705 - mae: 663.4544 - mse: 1131357.1705 - val_loss: 965185.2500 - val_mae: 618.6077 - val_mse: 965185.2500
Epoch 73/100
10/10 [==============================] - 0s 6ms/step - loss: 1250248.5682 - mae: 676.6544 - mse: 1250248.5114 - val_loss: 965102.3750 - val_mae: 618.5378 - val_mse: 965102.3750
Epoch 74/100
10/10 [==============================] - 0s 6ms/step - loss: 1139111.5000 - mae: 662.3826 - mse: 1139111.3864 - val_loss: 965019.9375 - val_mae: 618.4678 - val_mse: 965019.9375
Epoch 75/100
10/10 [==============================] - 0s 20ms/step - loss: 1174616.6591 - mae: 660.1651 - mse: 1174616.6477 - val_loss: 964936.8750 - val_mae: 618.3977 - val_mse: 964936.8750
Epoch 76/100
10/10 [==============================] - 0s 6ms/step - loss: 1148855.6477 - mae: 667.6373 - mse: 1148855.6477 - val_loss: 964853.6250 - val_mae: 618.3275 - val_mse: 964853.6250
Epoch 77/100
10/10 [==============================] - 0s 6ms/step - loss: 1141488.4205 - mae: 666.2675 - mse: 1141488.4205 - val_loss: 964770.6250 - val_mae: 618.2573 - val_mse: 964770.6250
Epoch 78/100
10/10 [==============================] - 0s 6ms/step - loss: 1170440.5568 - mae: 667.3746 - mse: 1170440.5568 - val_loss: 964687.6875 - val_mae: 618.1871 - val_mse: 964687.6875
Epoch 79/100
10/10 [==============================] - 0s 7ms/step - loss: 1156989.6250 - mae: 663.8672 - mse: 1156989.6250 - val_loss: 964604.6250 - val_mae: 618.1169 - val_mse: 964604.6250
Epoch 80/100
10/10 [==============================] - 0s 6ms/step - loss: 1158916.8295 - mae: 659.0662 - mse: 1158916.8295 - val_loss: 964521.3125 - val_mae: 618.0464 - val_mse: 964521.3125
Epoch 81/100
10/10 [==============================] - 0s 6ms/step - loss: 1141873.2841 - mae: 659.3038 - mse: 1141873.2841 - val_loss: 964437.8125 - val_mae: 617.9760 - val_mse: 964437.8125
Epoch 82/100
10/10 [==============================] - 0s 7ms/step - loss: 1141813.4545 - mae: 659.3099 - mse: 1141813.4545 - val_loss: 964354.5000 - val_mae: 617.9056 - val_mse: 964354.5000
Epoch 83/100
10/10 [==============================] - 0s 6ms/step - loss: 1157313.7386 - mae: 664.9635 - mse: 1157313.7386 - val_loss: 964271.7500 - val_mae: 617.8356 - val_mse: 964271.7500
Epoch 84/100
10/10 [==============================] - 0s 7ms/step - loss: 1088132.7898 - mae: 647.5032 - mse: 1088132.7898 - val_loss: 964188.5625 - val_mae: 617.7652 - val_mse: 964188.5625
Epoch 85/100
10/10 [==============================] - 0s 7ms/step - loss: 1143195.9886 - mae: 661.8720 - mse: 1143195.9886 - val_loss: 964106.3125 - val_mae: 617.6954 - val_mse: 964106.3125
Epoch 86/100
10/10 [==============================] - 0s 6ms/step - loss: 1129616.2500 - mae: 664.1908 - mse: 1129616.2386 - val_loss: 964022.8750 - val_mae: 617.6249 - val_mse: 964022.8750
Epoch 87/100
10/10 [==============================] - 0s 6ms/step - loss: 1090240.8636 - mae: 646.6818 - mse: 1090240.8636 - val_loss: 963939.6875 - val_mae: 617.5547 - val_mse: 963939.6875
Epoch 88/100
10/10 [==============================] - 0s 6ms/step - loss: 1061185.4318 - mae: 644.1078 - mse: 1061185.4318 - val_loss: 963856.3750 - val_mae: 617.4843 - val_mse: 963856.3750
Epoch 89/100
10/10 [==============================] - 0s 6ms/step - loss: 1172663.6818 - mae: 658.7974 - mse: 1172663.6818 - val_loss: 963773.6250 - val_mae: 617.4142 - val_mse: 963773.6250
Epoch 90/100
10/10 [==============================] - 0s 7ms/step - loss: 1174682.0795 - mae: 662.5005 - mse: 1174682.0795 - val_loss: 963691.0000 - val_mae: 617.3441 - val_mse: 963691.0000
Epoch 91/100
10/10 [==============================] - 0s 7ms/step - loss: 1149222.9602 - mae: 658.8547 - mse: 1149222.9602 - val_loss: 963607.8125 - val_mae: 617.2737 - val_mse: 963607.8125
Epoch 92/100
10/10 [==============================] - 0s 6ms/step - loss: 1145790.4091 - mae: 661.5331 - mse: 1145790.4091 - val_loss: 963525.3125 - val_mae: 617.2037 - val_mse: 963525.3125
Epoch 93/100
10/10 [==============================] - 0s 5ms/step - loss: 1195008.4545 - mae: 666.8613 - mse: 1195008.4545 - val_loss: 963442.2500 - val_mae: 617.1335 - val_mse: 963442.2500
Epoch 94/100
10/10 [==============================] - 0s 6ms/step - loss: 1078283.8011 - mae: 645.5068 - mse: 1078283.8011 - val_loss: 963358.9375 - val_mae: 617.0630 - val_mse: 963358.9375
Epoch 95/100
10/10 [==============================] - 0s 7ms/step - loss: 1039502.5227 - mae: 646.7638 - mse: 1039502.5227 - val_loss: 963276.0000 - val_mae: 616.9927 - val_mse: 963276.0000
Epoch 96/100
10/10 [==============================] - 0s 7ms/step - loss: 1194586.2159 - mae: 670.2529 - mse: 1194586.2159 - val_loss: 963193.6250 - val_mae: 616.9229 - val_mse: 963193.6250
Epoch 97/100
10/10 [==============================] - 0s 6ms/step - loss: 1116803.8693 - mae: 653.2743 - mse: 1116803.8693 - val_loss: 963110.4375 - val_mae: 616.8525 - val_mse: 963110.4375
Epoch 98/100
10/10 [==============================] - 0s 6ms/step - loss: 1150970.0341 - mae: 660.6168 - mse: 1150970.0341 - val_loss: 963027.9375 - val_mae: 616.7825 - val_mse: 963027.9375
Epoch 99/100
10/10 [==============================] - 0s 7ms/step - loss: 1121963.6591 - mae: 657.9420 - mse: 1121963.6591 - val_loss: 962944.5625 - val_mae: 616.7120 - val_mse: 962944.5625
Epoch 100/100
10/10 [==============================] - 0s 6ms/step - loss: 1136820.0341 - mae: 660.7863 - mse: 1136820.0341 - val_loss: 962861.8750 - val_mae: 616.6418 - val_mse: 962861.8750

[Figure: validation and training loss over the course of training]

I also tried different models by adding layers, and the overall trend is always the same. Most of the posts I found about this situation suggest that the cause is the training set being "harder" than the validation set, but I re-split the data several times and the results were always similar, so I am wondering whether I am doing something wrong.
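For reference, a minimal sketch of one way to actually vary the split (Keras's validation_split is documented to take the last fraction of the samples, before any shuffling, so the rows have to be reshuffled for the held-out set to change; random_state=0 is an arbitrary choice):

## reshuffle rows so validation_split holds out a different 40% each run
## (validation_split always takes the LAST fraction of the data as-is)
df_shuffled = df_data.sample(frac=1, random_state=0).reset_index(drop=True)
X = df_shuffled.copy()
Y = X.pop('selling_price')
model = build_model()
history = model.fit(X, Y, validation_split=0.4, epochs=100, batch_size=500)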

0 Answers:

No answers yet.