I recently implemented a small neural network to classify handwritten digits. Here is how it was done.
Imports
```python
import numpy as np
```
Load and display the data
```python
# Set the path to the data you downloaded
```
loadtxt(fname, dtype=<class 'float'>, comments='#', delimiter=None, converters=None, skiprows=0,
        usecols=None, unpack=False, ndmin=0)
fname: the file to read — a file name, file object, or generator
dtype: data type of the resulting array, float by default
comments: the character(s) that mark a comment line
delimiter: the column separator, whitespace by default
skiprows: number of leading lines to skip, 0 by default; must be an int
usecols: which columns to read, with 0 being the first column — e.g. usecols=(1, 4, 5) extracts the 2nd, 5th, and 6th columns; all columns are read by default
unpack: if True, the columns are returned as separate arrays
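As a quick illustration of these parameters (the data below is made up; an in-memory buffer stands in for the downloaded file):

```python
import numpy as np
from io import StringIO

# A tiny in-memory "file" standing in for the downloaded CSV
data = StringIO("# label,pix0,pix1\n1,0,255\n7,128,64\n")

# The header line starting with '#' is skipped via the comments default;
# delimiter="," splits on commas; dtype defaults to float
arr = np.loadtxt(data, delimiter=",")
print(arr.shape)  # (2, 3)
```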
Reshape the data
```python
# Reshape the images to 3 dimensions (height=28px, width=28px, channels=1)
```
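The reshape itself is a single NumPy call, assuming the usual MNIST-style layout of 784 flattened pixels per row:

```python
import numpy as np

# Two fake flattened images, 784 pixels each (28 * 28)
X = np.zeros((2, 784))

# Reshape to (samples, height, width, channels); -1 infers the sample count
X = X.reshape(-1, 28, 28, 1)
print(X.shape)  # (2, 28, 28, 1)
```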
Build the network
```python
# Batch normalization is a technique that speeds up training. Dropout is a
# regularization method in which units are randomly dropped during training.
```
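The network code itself did not survive, but the two ideas named in the comment can be sketched in plain NumPy (a toy illustration of what the Keras BatchNormalization and Dropout layers compute, not the layers themselves):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=(4, 3))

# Batch normalization: standardize each feature over the batch
bn = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + 1e-5)

# Dropout (rate 0.5): randomly zero units at train time, scale survivors
mask = rng.random(x.shape) >= 0.5
dropped = np.where(mask, x / 0.5, 0.0)

print(bn.mean(axis=0))  # per-feature means are ~0 after normalization
```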
Model training (with data augmentation)
```python
# train the model with data augmentation
```
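The training cell was also lost. For digits, augmentation typically means small shifts and rotations; a minimal NumPy pixel shift gives the flavor (a hypothetical stand-in for whatever the original used, e.g. Keras's ImageDataGenerator):

```python
import numpy as np

def shift_right(img, px=2):
    """Shift a (H, W) image right by px pixels, padding with zeros."""
    out = np.zeros_like(img)
    out[:, px:] = img[:, :-px]
    return out

img = np.arange(16.0).reshape(4, 4)
aug = shift_right(img, px=1)
print(aug[0])  # first row shifted right: [0. 0. 1. 2.]
```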
Model evaluation
```python
# evaluate the model
```
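In Keras this step would be a call like model.evaluate on the validation set; the accuracy it reports is simply the fraction of correct labels, which can be computed directly (labels here are made up):

```python
import numpy as np

y_true = np.array([3, 1, 4, 1, 5])
y_pred = np.array([3, 1, 4, 0, 5])  # one mistake

# Accuracy = fraction of samples where prediction matches the label
accuracy = (y_pred == y_true).mean()
print(accuracy)  # 0.8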
Model prediction
```python
# predict on the test set
```
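For a softmax classifier, turning the network output into a digit is an argmax over the 10 class probabilities per sample (the probabilities below are made up):

```python
import numpy as np

# Fake softmax outputs for two samples over the 10 digit classes
probs = np.array([
    [0.01, 0.01, 0.01, 0.01, 0.91, 0.01, 0.01, 0.01, 0.01, 0.01],
    [0.80, 0.02, 0.02, 0.02, 0.02, 0.02, 0.02, 0.02, 0.03, 0.03],
])

# Take the most probable class for each sample
pred = np.argmax(probs, axis=1)
print(pred)  # [4 0]
```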
Save the results
```python
# write the results to a file
```
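Writing the results file can be done with the standard csv module. The ImageId/Label header follows the Kaggle Digit Recognizer submission format, which this post appears to target (an assumption); an in-memory buffer stands in for the output file:

```python
import csv
import io

preds = [4, 0, 9]  # hypothetical predicted digits

buf = io.StringIO()  # stands in for open("submission.csv", "w", newline="")
writer = csv.writer(buf)
writer.writerow(["ImageId", "Label"])          # header row
for i, p in enumerate(preds, start=1):         # ids are 1-based
    writer.writerow([i, p])

print(buf.getvalue().splitlines()[0])  # ImageId,Label
```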
Results
Epoch 1/50 - 176s - loss: 0.2878 - acc: 0.9149 - val_loss: 0.0615 - val_acc: 0.9875
Epoch 2/50 - 177s - loss: 0.1885 - acc: 0.9473 - val_loss: 0.1000 - val_acc: 0.9800
Epoch 3/50 - 176s - loss: 0.1515 - acc: 0.9583 - val_loss: 0.0388 - val_acc: 0.9875
Epoch 4/50 - 188s - loss: 0.1333 - acc: 0.9637 - val_loss: 0.0239 - val_acc: 0.9925
Epoch 5/50 - 174s - loss: 0.1149 - acc: 0.9689 - val_loss: 0.0104 - val_acc: 0.9950
Epoch 6/50 - 177s - loss: 0.1031 - acc: 0.9706 - val_loss: 0.0245 - val_acc: 0.9925
Epoch 7/50 - 184s - loss: 0.0913 - acc: 0.9748 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 8/50 - 185s - loss: 0.0845 - acc: 0.9761 - val_loss: 0.0198 - val_acc: 0.9900
Epoch 9/50 - 201s - loss: 0.0785 - acc: 0.9775 - val_loss: 0.0103 - val_acc: 0.9975
Epoch 10/50 - 195s - loss: 0.0692 - acc: 0.9806 - val_loss: 0.0043 - val_acc: 1.0000
Epoch 11/50 - 189s - loss: 0.0687 - acc: 0.9803 - val_loss: 0.0285 - val_acc: 0.9925
Epoch 12/50 - 181s - loss: 0.0660 - acc: 0.9817 - val_loss: 0.0194 - val_acc: 0.9900
Epoch 13/50 - 197s - loss: 0.0564 - acc: 0.9839 - val_loss: 0.0119 - val_acc: 0.9950
Epoch 14/50 - 200s - loss: 0.0599 - acc: 0.9836 - val_loss: 0.0141 - val_acc: 0.9975
Epoch 15/50 - 201s - loss: 0.0567 - acc: 0.9840 - val_loss: 0.0111 - val_acc: 0.9975
Epoch 16/50 - 191s - loss: 0.0499 - acc: 0.9857 - val_loss: 0.0061 - val_acc: 0.9975
Epoch 17/50 - 189s - loss: 0.0530 - acc: 0.9848 - val_loss: 0.0129 - val_acc: 0.9925
Epoch 18/50 - 194s - loss: 0.0468 - acc: 0.9858 - val_loss: 0.0085 - val_acc: 0.9950
Epoch 19/50 - 199s - loss: 0.0486 - acc: 0.9865 - val_loss: 0.0065 - val_acc: 0.9975
Epoch 20/50 - 192s - loss: 0.0442 - acc: 0.9866 - val_loss: 0.0071 - val_acc: 0.9975
Epoch 21/50 - 190s - loss: 0.0434 - acc: 0.9883 - val_loss: 0.0038 - val_acc: 1.0000
Epoch 22/50 - 173s - loss: 0.0426 - acc: 0.9876 - val_loss: 0.0088 - val_acc: 0.9950
Epoch 23/50 - 191s - loss: 0.0419 - acc: 0.9879 - val_loss: 0.0094 - val_acc: 0.9950
Epoch 24/50 - 189s - loss: 0.0424 - acc: 0.9877 - val_loss: 0.0060 - val_acc: 0.9975
Epoch 25/50 - 191s - loss: 0.0384 - acc: 0.9886 - val_loss: 0.0060 - val_acc: 0.9975
Epoch 26/50 - 182s - loss: 0.0395 - acc: 0.9890 - val_loss: 0.0055 - val_acc: 0.9975
Epoch 27/50 - 173s - loss: 0.0385 - acc: 0.9889 - val_loss: 0.0072 - val_acc: 0.9950
Epoch 28/50 - 172s - loss: 0.0348 - acc: 0.9892 - val_loss: 0.0057 - val_acc: 0.9950
Epoch 29/50 - 171s - loss: 0.0371 - acc: 0.9891 - val_loss: 0.0047 - val_acc: 0.9975
Epoch 30/50 - 171s - loss: 0.0372 - acc: 0.9897 - val_loss: 0.0059 - val_acc: 0.9950
Epoch 31/50 - 171s - loss: 0.0340 - acc: 0.9898 - val_loss: 0.0092 - val_acc: 0.9950
Epoch 32/50 - 173s - loss: 0.0374 - acc: 0.9892 - val_loss: 0.0071 - val_acc: 0.9975
Epoch 33/50 - 196s - loss: 0.0352 - acc: 0.9894 - val_loss: 0.0063 - val_acc: 0.9950
Epoch 34/50 - 209s - loss: 0.0347 - acc: 0.9895 - val_loss: 0.0035 - val_acc: 0.9975
Epoch 35/50 - 192s - loss: 0.0363 - acc: 0.9900 - val_loss: 0.0040 - val_acc: 0.9975
Epoch 36/50 - 190s - loss: 0.0352 - acc: 0.9899 - val_loss: 0.0043 - val_acc: 0.9975
Epoch 37/50 - 203s - loss: 0.0363 - acc: 0.9896 - val_loss: 0.0051 - val_acc: 0.9950
Epoch 38/50 - 199s - loss: 0.0345 - acc: 0.9900 - val_loss: 0.0043 - val_acc: 0.9975
Epoch 39/50 - 190s - loss: 0.0346 - acc: 0.9897 - val_loss: 0.0044 - val_acc: 0.9975
Epoch 40/50 - 187s - loss: 0.0351 - acc: 0.9899 - val_loss: 0.0034 - val_acc: 0.9975
Epoch 41/50 - 181s - loss: 0.0325 - acc: 0.9906 - val_loss: 0.0043 - val_acc: 0.9975
Epoch 42/50 - 184s - loss: 0.0340 - acc: 0.9905 - val_loss: 0.0049 - val_acc: 0.9950
Epoch 43/50 - 173s - loss: 0.0342 - acc: 0.9897 - val_loss: 0.0053 - val_acc: 0.9975
Epoch 44/50 - 190s - loss: 0.0339 - acc: 0.9898 - val_loss: 0.0059 - val_acc: 0.9950
Epoch 45/50 - 206s - loss: 0.0370 - acc: 0.9900 - val_loss: 0.0048 - val_acc: 0.9975
Epoch 46/50 - 204s - loss: 0.0332 - acc: 0.9905 - val_loss: 0.0047 - val_acc: 0.9950
Epoch 47/50 - 212s - loss: 0.0345 - acc: 0.9903 - val_loss: 0.0048 - val_acc: 0.9975
Epoch 48/50 - 198s - loss: 0.0305 - acc: 0.9910 - val_loss: 0.0053 - val_acc: 0.9975
Epoch 49/50 - 174s - loss: 0.0330 - acc: 0.9905 - val_loss: 0.0050 - val_acc: 0.9975
Epoch 50/50 - 179s - loss: 0.0323 - acc: 0.9905 - val_loss: 0.0049 - val_acc: 0.9950
Final loss: 0.0209, final accuracy: 0.9950