Why isn't my neural network improving?

Time: 2017-03-07 04:03:31

Tags: python numpy

My TensorFlow program is not improving at all. Can someone please help? Here is the code:

import tensorflow as tf
import numpy as np
import pandas as pd

import matplotlib.pyplot as plt

data = pd.read_csv("data.csv", sep='|')
data = data[41320:41335]
inputX = data.drop(['Name', 'md5', 'legitimate'], axis=1).as_matrix()
inputY = data['legitimate'].as_matrix()
inputY = inputY.reshape([-1,1])

This is the data for X. It has 52 features.

inputX = array([[  3.32000000e+02,   2.24000000e+02,   8.45000000e+03,
      8.00000000e+00,   0.00000000e+00,   5.32480000e+04,
      1.63840000e+04,   0.00000000e+00,   5.40480000e+04,
      4.09600000e+03,   5.73440000e+04,   2.08594534e+09,
      4.09600000e+03,   4.09600000e+03,   4.00000000e+00,
      0.00000000e+00,   8.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   7.37280000e+04,
      4.09600000e+03,   1.20607000e+05,   2.00000000e+00,
      3.20000000e+02,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   4.00000000e+00,   2.70373594e+00,
      1.05637876e+00,   6.22819008e+00,   1.63840000e+04,
      4.09600000e+03,   5.32480000e+04,   1.59390000e+04,
      9.92000000e+02,   5.28640000e+04,   6.00000000e+00,
      1.37000000e+02,   8.10000000e+01,   2.50000000e+01,
      1.00000000e+00,   3.52426821e+00,   3.52426821e+00,
      3.52426821e+00,   8.92000000e+02,   8.92000000e+02,
      8.92000000e+02,   7.20000000e+01,   1.60000000e+01], 
   [  3.32000000e+02,   2.24000000e+02,   8.45000000e+03,
      8.00000000e+00,   0.00000000e+00,   5.27360000e+04,
      1.12640000e+04,   0.00000000e+00,   5.35300000e+04,
      4.09600000e+03,   5.73440000e+04,   2.08699392e+09,
      4.09600000e+03,   5.12000000e+02,   4.00000000e+00,
      0.00000000e+00,   8.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   7.37280000e+04,
      1.02400000e+03,   8.92300000e+04,   2.00000000e+00,
      3.20000000e+02,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   4.00000000e+00,   4.31899422e+00,
      3.30769150e+00,   6.15499505e+00,   1.42080000e+04,
      1.02400000e+03,   5.27360000e+04,   1.57382500e+04,
      9.92000000e+02,   5.22730000e+04,   6.00000000e+00,
      1.33000000e+02,   8.10000000e+01,   2.50000000e+01,
      1.00000000e+00,   3.54207119e+00,   3.54207119e+00,
      3.54207119e+00,   8.92000000e+02,   8.92000000e+02,
      8.92000000e+02,   7.20000000e+01,   1.60000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   8.45000000e+03,
      8.00000000e+00,   0.00000000e+00,   4.09600000e+04,
      2.04800000e+04,   0.00000000e+00,   2.66080000e+04,
      4.09600000e+03,   4.50560000e+04,   1.92151552e+09,
      4.09600000e+03,   4.09600000e+03,   4.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   6.55360000e+04,
      4.09600000e+03,   1.21734000e+05,   2.00000000e+00,
      3.20000000e+02,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   3.58061262e+00,
      8.04176679e-02,   6.23193618e+00,   1.22880000e+04,
      4.09600000e+03,   4.09600000e+04,   1.04442000e+04,
      9.64000000e+02,   3.76480000e+04,   2.00000000e+00,
      6.80000000e+01,   0.00000000e+00,   1.12000000e+02,
      6.00000000e+00,   3.00438262e+00,   2.40651198e+00,
      3.59262288e+00,   6.10333333e+02,   1.24000000e+02,
      1.41200000e+03,   7.20000000e+01,   1.60000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      1.10000000e+01,   0.00000000e+00,   3.54816000e+05,
      2.57024000e+05,   0.00000000e+00,   1.83632000e+05,
      4.09600000e+03,   3.60448000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   1.00000000e+00,   6.26688000e+05,
      1.02400000e+03,   0.00000000e+00,   2.00000000e+00,
      3.30880000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   4.59039653e+00,
      2.37894684e+00,   6.29682587e+00,   1.20524800e+05,
      7.68000000e+03,   3.54816000e+05,   1.22148600e+05,
      1.64680000e+04,   3.54799000e+05,   7.00000000e+00,
      1.38000000e+02,   0.00000000e+00,   0.00000000e+00,
      7.00000000e+00,   3.91441476e+00,   1.44168828e+00,
      7.67709054e+00,   7.29842857e+03,   1.60000000e+01,
      2.84380000e+04,   7.20000000e+01,   0.00000000e+00],
   [  3.32000000e+02,   2.24000000e+02,   2.71000000e+02,
      6.00000000e+00,   0.00000000e+00,   2.40640000e+04,
      1.64864000e+05,   1.02400000e+03,   1.25380000e+04,
      4.09600000e+03,   2.86720000e+04,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   4.00000000e+00,
      0.00000000e+00,   6.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   2.41664000e+05,
      1.02400000e+03,   0.00000000e+00,   2.00000000e+00,
      3.27680000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   4.10454072e+00,
      0.00000000e+00,   6.44010555e+00,   6.75840000e+03,
      0.00000000e+00,   2.40640000e+04,   4.62608000e+04,
      3.14400000e+03,   1.54712000e+05,   8.00000000e+00,
      1.55000000e+02,   1.00000000e+00,   0.00000000e+00,
      6.00000000e+00,   3.19910735e+00,   1.97133529e+00,
      5.21481585e+00,   4.52000000e+02,   3.40000000e+01,
      9.58000000e+02,   0.00000000e+00,   1.50000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      1.00000000e+01,   0.00000000e+00,   1.18784000e+05,
      3.81952000e+05,   0.00000000e+00,   5.99140000e+04,
      4.09600000e+03,   1.22880000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   1.00000000e+00,   5.20192000e+05,
      1.02400000e+03,   5.58287000e+05,   2.00000000e+00,
      3.30880000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   5.66240790e+00,
      4.18369159e+00,   7.96187140e+00,   1.00147200e+05,
      9.21600000e+03,   3.34848000e+05,   1.01559800e+05,
      9.36800000e+03,   3.34440000e+05,   7.00000000e+00,
      1.14000000e+02,   0.00000000e+00,   0.00000000e+00,
      1.80000000e+01,   6.53094643e+00,   2.45849223e+00,
      7.99268848e+00,   1.85234444e+04,   4.80000000e+01,
      3.39450000e+04,   7.20000000e+01,   1.40000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      1.00000000e+01,   0.00000000e+00,   1.74592000e+05,
      3.00032000e+05,   0.00000000e+00,   1.17140000e+05,
      4.09600000e+03,   1.80224000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   1.00000000e+00,   4.87424000e+05,
      1.02400000e+03,   5.13173000e+05,   2.00000000e+00,
      3.30880000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   5.73547047e+00,
      4.75826034e+00,   7.36431335e+00,   9.30816000e+04,
      1.53600000e+04,   1.92000000e+05,   9.46988000e+04,
      2.15000000e+04,   1.91664000e+05,   1.10000000e+01,
      2.54000000e+02,   1.50000000e+01,   0.00000000e+00,
      1.50000000e+01,   5.73239307e+00,   2.85236422e+00,
      7.98772639e+00,   1.27061333e+04,   1.18000000e+02,
      6.05000000e+04,   7.20000000e+01,   1.40000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      9.00000000e+00,   0.00000000e+00,   4.75648000e+05,
      3.48672000e+05,   0.00000000e+00,   3.19769000e+05,
      4.09600000e+03,   4.83328000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   0.00000000e+00,   8.56064000e+05,
      1.02400000e+03,   1.82072586e+09,   2.00000000e+00,
      3.30880000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   5.13993423e+00,
      4.48079036e+00,   6.55814891e+00,   1.64864000e+05,
      1.38240000e+04,   4.75648000e+05,   1.68145200e+05,
      3.08400000e+04,   4.75580000e+05,   1.40000000e+01,
      4.21000000e+02,   1.50000000e+01,   0.00000000e+00,
      5.90000000e+01,   2.82782573e+00,   9.60953136e-01,
      7.21232881e+00,   2.63703390e+03,   2.00000000e+01,
      6.76240000e+04,   7.20000000e+01,   0.00000000e+00],
   [  3.32000000e+02,   2.24000000e+02,   2.59000000e+02,
      9.00000000e+00,   0.00000000e+00,   1.57696000e+05,
      6.24640000e+04,   0.00000000e+00,   6.70150000e+04,
      4.09600000e+03,   1.63840000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   0.00000000e+00,   2.33472000e+05,
      1.02400000e+03,   2.72988000e+05,   2.00000000e+00,
      3.30240000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   4.00000000e+00,   4.81988481e+00,
      2.97736539e+00,   6.48512410e+00,   5.50400000e+04,
      3.58400000e+03,   1.57696000e+05,   5.56267500e+04,
      6.70000000e+03,   1.57297000e+05,   2.00000000e+00,
      7.60000000e+01,   0.00000000e+00,   0.00000000e+00,
      1.30000000e+01,   3.94329633e+00,   1.81444345e+00,
      6.12204520e+00,   2.70815385e+03,   1.32000000e+02,
      9.64000000e+03,   7.20000000e+01,   1.40000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.59000000e+02,
      8.30000000e+01,   8.20000000e+01,   7.24992000e+05,
      2.30604800e+06,   0.00000000e+00,   4.24345600e+06,
      3.52256000e+06,   4.30899200e+06,   4.19430400e+06,
      4.09600000e+03,   4.09600000e+03,   5.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   0.00000000e+00,   6.70924800e+06,
      4.09600000e+03,   3.07704700e+06,   2.00000000e+00,
      3.27680000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   9.00000000e+00,   3.78312500e+00,
      0.00000000e+00,   7.99951830e+00,   3.36782222e+05,
      0.00000000e+00,   1.88416000e+06,   7.44182333e+05,
      2.27200000e+03,   3.06129900e+06,   4.00000000e+00,
      2.43000000e+02,   0.00000000e+00,   0.00000000e+00,
      2.10000000e+01,   3.98746295e+00,   2.64215931e+00,
      6.47369968e+00,   1.42880000e+04,   7.60000000e+01,
      2.70376000e+05,   0.00000000e+00,   0.00000000e+00],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      1.00000000e+01,   0.00000000e+00,   1.20320000e+05,
      3.85024000e+05,   0.00000000e+00,   6.15780000e+04,
      4.09600000e+03,   1.26976000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   1.00000000e+00,   5.28384000e+05,
      1.02400000e+03,   5.66330000e+05,   2.00000000e+00,
      3.30880000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   5.00000000e+00,   5.64644365e+00,
      4.11726412e+00,   7.96277585e+00,   1.01068800e+05,
      9.72800000e+03,   3.30752000e+05,   1.02623800e+05,
      9.40400000e+03,   3.39652000e+05,   3.00000000e+00,
      8.90000000e+01,   0.00000000e+00,   0.00000000e+00,
      6.00000000e+00,   3.72982391e+00,   2.45849223e+00,
      5.31755236e+00,   2.73950000e+03,   4.80000000e+01,
      9.64000000e+03,   7.20000000e+01,   1.50000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.59000000e+02,
      1.00000000e+01,   0.00000000e+00,   2.33984000e+05,
      1.37779200e+06,   0.00000000e+00,   9.31200000e+04,
      4.09600000e+03,   2.41664000e+05,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      1.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   1.00000000e+00,   1.63020800e+06,
      1.02400000e+03,   1.66150900e+06,   2.00000000e+00,
      3.32800000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   3.00000000e+00,   5.46068132e+00,
      3.13962777e+00,   7.09009944e+00,   5.37258667e+05,
      5.63200000e+03,   1.37216000e+06,   5.39602667e+05,
      1.33160000e+04,   1.37185600e+06,   1.00000000e+00,
      8.00000000e+01,   0.00000000e+00,   0.00000000e+00,
      1.80000000e+01,   4.32832189e+00,   2.32321967e+00,
      7.06841290e+00,   7.61582778e+04,   9.00000000e+00,
      1.34273500e+06,   7.20000000e+01,   1.90000000e+01],
   [  3.32000000e+02,   2.24000000e+02,   2.71000000e+02,
      6.00000000e+00,   0.00000000e+00,   4.91520000e+04,
      5.61152000e+05,   0.00000000e+00,   3.38800000e+04,
      4.09600000e+03,   5.32480000e+04,   4.19430400e+06,
      4.09600000e+03,   4.09600000e+03,   4.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   6.14400000e+05,
      4.09600000e+03,   0.00000000e+00,   2.00000000e+00,
      0.00000000e+00,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   4.00000000e+00,   3.69925758e+00,
      0.00000000e+00,   6.48297395e+00,   1.94600000e+04,
      1.60000000e+01,   4.91520000e+04,   1.50074000e+05,
      1.60000000e+01,   5.48460000e+05,   4.00000000e+00,
      1.19000000e+02,   1.00000000e+01,   0.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00],
   [  3.32000000e+02,   2.24000000e+02,   2.58000000e+02,
      1.00000000e+01,   0.00000000e+00,   2.91840000e+04,
      4.45952000e+05,   1.68960000e+04,   1.48190000e+04,
      4.09600000e+03,   3.68640000e+04,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   5.00000000e+00,
      0.00000000e+00,   6.00000000e+00,   0.00000000e+00,
      5.00000000e+00,   0.00000000e+00,   1.76537600e+06,
      1.02400000e+03,   5.94294000e+05,   2.00000000e+00,
      3.41120000e+04,   1.04857600e+06,   4.09600000e+03,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   6.00000000e+00,   3.76419176e+00,
      0.00000000e+00,   6.47970818e+00,   7.93600000e+03,
      0.00000000e+00,   2.91840000e+04,   2.92339333e+05,
      2.53600000e+03,   1.28204800e+06,   8.00000000e+00,
      1.71000000e+02,   1.00000000e+00,   0.00000000e+00,
      6.00000000e+00,   3.15203588e+00,   2.16096405e+00,
      5.21367450e+00,   3.54333333e+02,   2.00000000e+01,
      7.44000000e+02,   0.00000000e+00,   0.00000000e+00],
   [  3.32000000e+02,   2.24000000e+02,   3.31670000e+04,
      2.00000000e+00,   2.50000000e+01,   3.78880000e+04,
      1.53600000e+04,   0.00000000e+00,   4.00000000e+04,
      4.09600000e+03,   4.50560000e+04,   4.19430400e+06,
      4.09600000e+03,   5.12000000e+02,   1.00000000e+00,
      0.00000000e+00,   0.00000000e+00,   0.00000000e+00,
      4.00000000e+00,   0.00000000e+00,   8.19200000e+04,
      1.02400000e+03,   6.78554440e+07,   2.00000000e+00,
      0.00000000e+00,   1.04857600e+06,   1.63840000e+04,
      1.04857600e+06,   4.09600000e+03,   0.00000000e+00,
      1.60000000e+01,   8.00000000e+00,   2.33301385e+00,
      0.00000000e+00,   6.63664803e+00,   6.65600000e+03,
      0.00000000e+00,   3.78880000e+04,   7.19800000e+03,
      8.00000000e+00,   3.77320000e+04,   8.00000000e+00,
      9.60000000e+01,   0.00000000e+00,   0.00000000e+00,
      1.40000000e+01,   3.42918455e+00,   2.41356665e+00,
      5.05007355e+00,   7.17142857e+02,   4.40000000e+01,
      2.21600000e+03,   0.00000000e+00,   1.50000000e+01]])

This is the Y data. It shows the expected output for the 15 samples.

inputY = array([ 1.,  1.,  1.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0., 0.,  0.])
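As a quick sanity check on the shapes and the class balance (a sketch using the arrays defined above; note that only 3 of the 15 labels are 1):

import numpy as np

print(inputX.shape)   # (n_samples, n_features)
print(inputY.shape)   # (n_samples, 1) after the reshape above
print(int(inputY.sum()), "positive labels out of", inputY.size)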

n_samples = inputY.size
n_feature = inputX.shape[1]
n_node = 500
display_step = 50
learning_rate=0.0001

# 4 hidden layers
n_nodes_hl1 = n_node
n_nodes_hl2 = n_node
n_nodes_hl3 = n_node
n_nodes_hl4 = n_node

n_output = 1  # number of outputs

x = tf.placeholder(tf.float32,[None,n_feature]) #feature
y = tf.placeholder(tf.float32,[None,n_output]) #label

x1 = tf.placeholder(tf.float32,[None,n_feature]) #feature
y1 = tf.placeholder(tf.float32,[None,n_output]) #label

# 4-layer neural network
hidden_1_layer = {'weights': tf.Variable(tf.random_normal([n_feature, n_nodes_hl1])), 'biases': tf.Variable(tf.random_normal([n_nodes_hl1]))}  # one dictionary of weights/biases per layer
hidden_2_layer = {'weights': tf.Variable(tf.zeros([n_nodes_hl1, n_nodes_hl2])), 'biases': tf.Variable(tf.zeros([n_nodes_hl2]))}
hidden_3_layer = {'weights': tf.Variable(tf.zeros([n_nodes_hl2, n_nodes_hl3])), 'biases': tf.Variable(tf.zeros([n_nodes_hl3]))}
hidden_4_layer = {'weights': tf.Variable(tf.zeros([n_nodes_hl3, n_nodes_hl4])), 'biases': tf.Variable(tf.zeros([n_nodes_hl4]))}
output_layer   = {'weights': tf.Variable(tf.zeros([n_nodes_hl4, n_output])), 'biases': tf.Variable(tf.zeros([n_output]))}

l1 = tf.add(tf.matmul(x, hidden_1_layer['weights']), hidden_1_layer['biases'])
l1 = tf.nn.relu(l1) #activation function

l2 = tf.add(tf.matmul(l1,hidden_2_layer['weights']), hidden_2_layer['biases'])
l2 = tf.nn.relu(l2) #activation function

l3 = tf.add(tf.matmul(l2,hidden_3_layer['weights']), hidden_3_layer['biases'])
l3 = tf.nn.relu(l3) #activation function   

l4 = tf.add(tf.matmul(l3,hidden_4_layer['weights']), hidden_4_layer['biases'])
l4 = tf.nn.softmax(l4) #activation function  

output = tf.matmul(l4,output_layer['weights']) + output_layer['biases']
output = tf.nn.softmax(output) #Classification function

cost = tf.reduce_sum(tf.pow(y - output, 2))/(2*n_samples)  # squared-error cost, averaged over the samples
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)

hm_epochs = 1000

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

for i in range(hm_epochs):
    # Take a gradient descent step using our inputs and labels
    _, c = sess.run([optimizer, cost], feed_dict={x: inputX, y: inputY})
    # sess.run(optimizer1, feed_dict={x1: inputX, y1: inputY})
    epoch_loss = 0
    # epoch_loss1 = 0
    if i % display_step == 0:
        # c1 = sess.run(cost1, feed_dict={x1: inputX, y1: inputY})
        epoch_loss += c
        # epoch_loss1 += c1
        print("Training step:", i, "cost=", c, "loss=", epoch_loss)
print("Optimization Finished!")
correct = tf.equal(tf.argmax(output,1),tf.argmax(y,1)) #compare the max value of prediction and y
accuracy = tf.reduce_mean(tf.cast(correct,'float'))

o1 = sess.run(output, feed_dict={x: inputX})
o1 = o1.reshape([1, -1])
org = inputY.reshape([1, -1])
print("Actual data : ", org, "\n 3 layer : ", o1, "\n")

----------------------------------------------------------------------------
Output
----------------------------------------------------------------------------
Training step: 0 cost= 0.4 loss= 0.40000000596
Training step: 50 cost= 0.4 loss= 0.40000000596
Training step: 100 cost= 0.4 loss= 0.40000000596
Training step: 150 cost= 0.4 loss= 0.40000000596
Training step: 200 cost= 0.4 loss= 0.40000000596
Training step: 250 cost= 0.4 loss= 0.40000000596
Training step: 300 cost= 0.4 loss= 0.40000000596
Training step: 350 cost= 0.4 loss= 0.40000000596
Training step: 400 cost= 0.4 loss= 0.40000000596
Training step: 450 cost= 0.4 loss= 0.40000000596
Training step: 500 cost= 0.4 loss= 0.40000000596
Training step: 550 cost= 0.4 loss= 0.40000000596
Training step: 600 cost= 0.4 loss= 0.40000000596
Training step: 650 cost= 0.4 loss= 0.40000000596
Training step: 700 cost= 0.4 loss= 0.40000000596
Training step: 750 cost= 0.4 loss= 0.40000000596
Training step: 800 cost= 0.4 loss= 0.40000000596
Training step: 850 cost= 0.4 loss= 0.40000000596
Training step: 900 cost= 0.4 loss= 0.40000000596
Training step: 950 cost= 0.4 loss= 0.40000000596
Optimization Finished!
Actual data :  [[ 1.  1.  1.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]] 
 3 layer :  [[ 1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.]] 

Could someone please give me some good advice? Maybe I need to reduce the number of features, but how? Or could I normalize the dataset? Or is there a better approach? Thanks.

1 Answer:

Answer 0 (score: 0)

To reduce the number of features, you can use principal component analysis (PCA):

from sklearn.decomposition import PCA
pca_input = PCA(n_components = 3)
inputX_pca = pca_input.fit_transform(inputX)
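
To see how much signal each component actually carries, you can inspect the explained variance ratios (a quick check, assuming pca_input has been fit as above):

# Fraction of the total variance captured by each principal component;
# near-zero ratios mean the component carries almost no information.
print(pca_input.explained_variance_ratio_)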

Note that you can use however many columns you need instead of 3. Either way the result is an array, and PCA works out how many correlated features there are (if there are 2 correlated features, it will give you 2 meaningful columns even if you ask for 3). To normalize, you can use one of the following two methods:

from sklearn.preprocessing import  StandardScaler, MinMaxScaler

So:

inputX = MinMaxScaler().fit_transform(inputX)

Or:

inputX = StandardScaler().fit_transform(inputX)
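
Putting the two together, a minimal sketch, assuming the same inputX as in the question (the choice of StandardScaler and 3 components here is illustrative only):

from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Scale first so features with large raw magnitudes do not dominate
# the principal components, then project onto a few components.
inputX_scaled = StandardScaler().fit_transform(inputX)
inputX_reduced = PCA(n_components=3).fit_transform(inputX_scaled)

# inputX_reduced replaces inputX, so the placeholder would need
# n_feature = inputX_reduced.shape[1] instead of inputX.shape[1].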