Commit e23d56b

Neural Network
1 parent ab45e10 commit e23d56b

File tree

2 files changed (+8 -2 lines changed)


formula/NeuralNetwork.wmf (7.78 KB): Binary file not shown.

readme.md

Lines changed: 8 additions & 2 deletions
@@ -414,9 +414,15 @@ from sklearn.linear_model import LogisticRegression
 ### 2. Cost Function
 - Suppose the final output is ![{h_\Theta }(x) \in {R^K}](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%7Bh_%5CTheta%20%7D%28x%29%20%5Cin%20%7BR%5EK%7D), i.e. the output layer has K units
 - ![J(\Theta ) = - \frac{1}{m}\sum\limits_{i = 1}^m {\sum\limits_{k = 1}^K {[y_k^{(i)}\log {{({h_\Theta }({x^{(i)}}))}_k}} } + (1 - y_k^{(i)})\log {(1 - {h_\Theta }({x^{(i)}}))_k}]](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5CTheta%20%29%20%3D%20%20-%20%5Cfrac%7B1%7D%7Bm%7D%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5Csum%5Climits_%7Bk%20%3D%201%7D%5EK%20%7B%5By_k%5E%7B%28i%29%7D%5Clog%20%7B%7B%28%7Bh_%5CTheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%29%7D_k%7D%7D%20%7D%20%20%2B%20%281%20-%20y_k%5E%7B%28i%29%7D%29%5Clog%20%7B%281%20-%20%7Bh_%5CTheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%29_k%7D%5D), where ![{({h_\Theta }(x))_i}](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%7B%28%7Bh_%5CTheta%20%7D%28x%29%29_i%7D) denotes the output of the `i`-th unit
-- Much like the cost function of logistic regression, it just additionally sums over each output
-
+- Much like the cost function of logistic regression, ![J(\theta ) = - \frac{1}{m}\sum\limits_{i = 1}^m {[{y^{(i)}}\log ({h_\theta }({x^{(i)}}) + (1 - } {y^{(i)}})\log (1 - {h_\theta }({x^{(i)}})]](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5Ctheta%20%29%20%3D%20%20-%20%5Cfrac%7B1%7D%7Bm%7D%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5B%7By%5E%7B%28i%29%7D%7D%5Clog%20%28%7Bh_%5Ctheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%20%2B%20%281%20-%20%7D%20%7By%5E%7B%28i%29%7D%7D%29%5Clog%20%281%20-%20%7Bh_%5Ctheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%5D), it just additionally sums over each output (K outputs in total)

+### 3. Regularization
+- `L` --> the total number of layers
+- ![{S_l}](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%7BS_l%7D) --> the number of units in layer `l`
+- The regularized **cost function**:
+![J(\Theta ) = - \frac{1}{m}\sum\limits_{i = 1}^m {\sum\limits_{k = 1}^K {[y_k^{(i)}\log {{({h_\Theta }({x^{(i)}}))}_k}} } + (1 - y_k^{(i)})\log {(1 - {h_\Theta }({x^{(i)}}))_k}] + \frac{\lambda }{{2m}}\sum\limits_{l = 1}^{L - 1} {\sum\limits_{i = 1}^{{S_l}} {\sum\limits_{j = 1}^{{S_{l + 1}}} {{{(\theta _{ij}^{(l)})}^2}} } } ](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5CTheta%20%29%20%3D%20%20-%20%5Cfrac%7B1%7D%7Bm%7D%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5Csum%5Climits_%7Bk%20%3D%201%7D%5EK%20%7B%5By_k%5E%7B%28i%29%7D%5Clog%20%7B%7B%28%7Bh_%5CTheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%29%7D_k%7D%7D%20%7D%20%20%2B%20%281%20-%20y_k%5E%7B%28i%29%7D%29%5Clog%20%7B%281%20-%20%7Bh_%5CTheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%29_k%7D%5D%20%2B%20%5Cfrac%7B%5Clambda%20%7D%7B%7B2m%7D%7D%5Csum%5Climits_%7Bl%20%3D%201%7D%5E%7BL%20-%201%7D%20%7B%5Csum%5Climits_%7Bi%20%3D%201%7D%5E%7B%7BS_l%7D%7D%20%7B%5Csum%5Climits_%7Bj%20%3D%201%7D%5E%7B%7BS_%7Bl%20%2B%201%7D%7D%7D%20%7B%7B%7B%28%5Ctheta%20_%7Bij%7D%5E%7B%28l%29%7D%29%7D%5E2%7D%7D%20%7D%20%7D%20)
+- ![\theta ](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%5Ctheta%20) has `L-1` layers in total,
+- and then there is a theta matrix for each layer; note that it does not include the theta(0) entries corresponding to the bias term


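As a companion to the added section, here is a minimal NumPy sketch of the regularized cost described above, assuming a network with a single hidden layer. It is only an illustration, not code from this repository: the names `nn_cost_function`, `sigmoid`, `Theta1`, `Theta2`, and `lam`, and the toy shapes in the sanity check, are assumptions made for this example. Setting `lam = 0` recovers the unregularized cost from section 2.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation used by h_Theta
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost_function(Theta1, Theta2, X, Y, lam):
    """Regularized cost J(Theta) for a network with one hidden layer.

    X      : (m, n) input examples
    Y      : (m, K) one-hot labels for the K output units
    Theta1 : (S_2, n + 1) weights from input to hidden layer, column 0 = bias
    Theta2 : (K, S_2 + 1) weights from hidden to output layer, column 0 = bias
    lam    : regularization strength lambda (0 disables regularization)
    """
    m = X.shape[0]

    # Forward propagation, prepending a bias unit to each layer's activations
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ Theta1.T)])
    h = sigmoid(a2 @ Theta2.T)  # (m, K); h[i, k] = (h_Theta(x^(i)))_k

    # Logistic-regression-style cost, summed over the K output units
    J = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m

    # Regularization: squared weights of every theta matrix (L - 1 of them),
    # skipping column 0 so the bias terms are not penalized
    reg = lam / (2.0 * m) * (np.sum(Theta1[:, 1:] ** 2) + np.sum(Theta2[:, 1:] ** 2))
    return J + reg

# Quick sanity check with random data (shapes are hypothetical)
rng = np.random.default_rng(0)
X = rng.random((5, 3))                       # m = 5 examples, n = 3 features
Y = np.eye(4)[rng.integers(0, 4, size=5)]    # K = 4 one-hot labels
Theta1 = rng.normal(scale=0.1, size=(6, 4))  # S_2 = 6 hidden units
Theta2 = rng.normal(scale=0.1, size=(4, 7))
print(nn_cost_function(Theta1, Theta2, X, Y, lam=1.0))
```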