I've recently been trying a series of feedforward neural networks in R, using the neuralnet and nnet packages. Each time I feed them the same datasets, and no matter how I adjust the algorithm, the hidden layers, the number of neurons, the maximum iterations, or the error threshold, both functions converge their predictions to roughly the mean of whatever they are training on.

In terms of fit, a linear regression fits each series better, and both packages seem to fit random data generated with rnorm better than they fit the real data. Mathematically, what could be causing this, and how should I fix it? I have sample code below and can paste a sample dataset if needed. Thanks!
library(neuralnet)

model6 <- neuralnet(
  target ~ 1 + majorholiday + mon + sat + sun + thu + tue + wed +
    tickets + l1_target + l7_target,
  data = data_nn,
  algorithm = "rprop+", hidden = c(8), stepmax = 500000,  # one hidden layer of 8 neurons
  err.fct = "sse", threshold = 0.01, lifesign = "full", lifesign.step = 100,
  linear.output = TRUE)
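To show what I mean by converging to the mean, this is roughly how the collapse shows up (a minimal sketch from memory; compute() is neuralnet's prediction function, and data_nn is assumed to hold the columns named in the formula):

covars <- data_nn[, c("majorholiday", "mon", "sat", "sun", "thu", "tue", "wed",
                      "tickets", "l1_target", "l7_target")]
preds  <- compute(model6, covars)$net.result

range(preds)          # nearly constant across all rows
mean(data_nn$target)  # ...and that constant sits at about the target mean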
Edit
A user asked me to paste some data, so here is a set below. I ran the same code once more just before uploading, and the same thing happened: the predictions converged to roughly 17.45, the mean of target.
row.names target majorholiday mon sat sun thu tue wed backtickets l1_target l7_target
1 8 18.976573088 0 0 0 0 0 0 0 13806 18.114001584 36.521334684
2 9 20.701716096 0 1 0 0 0 0 0 15308 18.976573088 35.477867979
3 10 25.014573616 0 0 1 0 0 0 0 13439 20.701716096 28.173601042
4 11 15.706877377 1 0 0 0 0 0 0 11283 25.014573616 27.602288128
5 12 19.633596721 0 0 0 0 1 0 0 12272 15.706877377 13.801144064
6 13 20.049395337 0 0 0 0 0 1 0 9528 19.633596721 32.777717152
7 14 21.720178282 0 0 0 1 0 0 0 13747 20.049395337 18.114001584
8 15 23.390961226 0 0 0 0 0 0 0 15277 21.720178282 18.976573088
9 16 16.707829447 0 1 0 0 0 0 0 16058 23.390961226 20.701716096
10 17 15.872437975 0 0 1 0 0 0 0 14218 16.707829447 25.014573616
11 18 23.295531996 1 0 0 0 0 0 0 11249 15.872437975 15.706877377
12 19 22.363710716 0 0 0 0 1 0 0 13993 23.295531996 19.633596721
13 20 24.227353276 0 0 0 0 0 1 0 13402 22.363710716 20.049395337
14 21 20.500068156 0 0 0 1 0 0 0 14244 24.227353276 21.720178282
15 22 26.090995836 0 0 0 0 0 0 0 14502 20.500068156 23.390961226
16 23 18.636425597 0 1 0 0 0 0 0 16296 26.090995836 16.707829447
17 24 15.840961757 0 0 1 0 0 0 0 13694 18.636425597 15.872437975
18 25 20.650050308 1 0 0 0 0 0 0 10774 15.840961757 23.295531996
19 26 13.467424114 0 0 0 0 1 0 0 12348 20.650050308 22.363710716
20 27 19.752222033 0 0 0 0 0 1 0 12936 13.467424114 24.227353276
21 28 27.832676502 0 0 0 1 0 0 0 14342 19.752222033 20.500068156
22 29 18.854393759 0 0 0 0 0 0 0 14390 27.832676502 26.090995836
23 30 10.773939291 0 1 0 0 0 0 0 16724 18.854393759 18.636425597
24 31 12.569595839 0 0 1 0 0 0 0 14091 10.773939291 15.840961757
25 32 28.153882107 1 0 0 0 0 0 0 11250 12.569595839 20.650050308
26 33 24.400031160 0 0 0 0 1 0 0 12803 28.153882107 13.467424114
27 34 21.584642949 0 0 0 0 0 1 0 13318 24.400031160 19.752222033
28 35 27.215419370 0 0 0 1 0 0 0 14193 21.584642949 27.832676502
29 36 21.584642949 0 0 0 0 0 0 0 14312 27.215419370 18.854393759
30 37 15.015403791 0 1 0 0 0 0 0 16445 21.584642949 10.773939291
31 38 26.276956633 0 0 1 0 0 0 0 13753 15.015403791 12.569595839
32 39 15.139500902 1 0 0 0 0 0 0 11619 26.276956633 28.153882107
33 40 12.467824272 0 0 0 0 1 0 0 14006 15.139500902 24.400031160
34 41 21.373413039 0 0 0 0 0 1 0 14098 12.467824272 21.584642949
35 42 8.015029889 0 0 0 1 0 0 0 14462 21.373413039 27.215419370
36 43 16.030059779 0 0 0 0 0 0 0 15367 8.015029889 21.584642949
37 44 19.592295285 0 1 0 0 0 0 0 17868 16.030059779 15.015403791
38 45 18.701736409 0 0 1 0 0 0 0 15052 19.592295285 26.276956633
39 46 16.002499062 1 0 0 0 0 0 0 10035 18.701736409 15.139500902
40 47 16.943822536 0 0 0 0 1 0 0 13708 16.002499062 12.467824272
41 48 11.295881691 0 0 0 0 0 1 0 13463 16.943822536 21.373413039
42 49 19.767792959 0 0 0 1 0 0 0 13998 11.295881691 8.015029889
43 50 19.767792959 0 0 0 0 0 0 0 14745 19.767792959 16.030059779
44 51 16.943822536 0 1 0 0 0 0 0 16156 19.767792959 19.592295285
45 52 14.119852113 0 0 1 0 0 0 0 13552 16.943822536 18.701736409
46 53 22.869570079 1 0 0 0 0 0 0 11554 14.119852113 16.002499062
47 54 10.481886286 0 0 0 0 1 0 0 13437 22.869570079 16.943822536
48 55 19.057975066 0 0 0 0 0 1 0 14076 10.481886286 11.295881691
49 56 20.010873819 0 0 0 1 0 0 0 14567 19.057975066 19.767792959
50 57 9.528987533 0 0 0 0 0 0 0 14277 20.010873819 19.767792959
51 58 21.916671326 0 1 0 0 0 0 0 16545 9.528987533 16.943822536
52 59 11.000000000 1 0 0 0 0 0 1 15599 21.916671326 14.119852113
53 60 17.000000000 0 0 0 0 1 0 1 17463 11.000000000 22.869570079
54 61 10.000000000 0 0 0 0 0 1 1 17935 17.000000000 10.481886286
55 62 20.000000000 0 0 0 1 0 0 1 18357 10.000000000 19.057975066
56 63 19.000000000 0 0 0 0 0 0 1 19246 20.000000000 20.010873819
57 64 17.000000000 0 1 0 0 0 0 1 21234 19.000000000 9.528987533
58 65 11.000000000 0 0 1 0 0 0 1 18493 17.000000000 21.916671326
59 66 9.000000000 1 0 0 0 0 0 1 15315 11.000000000 11.000000000
60 67 22.000000000 0 0 0 0 1 0 1 17841 9.000000000 17.000000000
61 68 9.000000000 0 0 0 0 0 1 1 18312 22.000000000 10.000000000
62 69 11.000000000 0 0 0 1 0 0 1 17880 9.000000000 20.000000000
63 70 5.000000000 0 0 0 0 0 0 1 19371 11.000000000 19.000000000
64 71 15.000000000 0 1 0 0 0 0 1 21696 5.000000000 17.000000000
65 72 12.000000000 0 0 1 0 0 0 1 18829 15.000000000 11.000000000
66 73 10.000000000 1 0 0 0 0 0 1 14749 12.000000000 9.000000000
67 74 15.000000000 0 0 0 0 1 0 1 17928 10.000000000 22.000000000
68 75 7.000000000 0 0 0 0 0 1 1 18254 15.000000000 9.000000000
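If anyone wants to reproduce this: saving the block above verbatim to a whitespace-separated text file and reading it back should work as-is, since read.table() treats the extra unlabeled first column as row names when the header has one fewer field than the data rows. A minimal sketch (the file name is a placeholder; note the pasted header says backtickets where the formula above says tickets):

data_nn <- read.table("sample_data.txt", header = TRUE)  # placeholder file name

mean(data_nn$target)  # comes out near 17.45, the value the predictions collapse to

# the linear-regression baseline that out-fits the nets on this series
fit_lm <- lm(target ~ majorholiday + mon + sat + sun + thu + tue + wed +
               backtickets + l1_target + l7_target, data = data_nn)
summary(fit_lm)$r.squared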