@@ -650,7 +650,7 @@ def predict(Theta1,Theta2,X):
![enter description here][25]
- The resulting cost function is:
![J(\theta) = C\sum\limits_{i=1}^m [y^{(i)}\text{cost}_1(\theta^T x^{(i)}) + (1 - y^{(i)})\text{cost}_0(\theta^T x^{(i)})] + \frac{1}{2}\sum\limits_{j=1}^n \theta_j^2](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5Ctheta%20%29%20%3D%20C%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5B%7By%5E%7B%28i%29%7D%7D%5Ccos%20%7Bt_1%7D%28%7B%5Ctheta%20%5ET%7D%7Bx%5E%7B%28i%29%7D%7D%29%20%2B%20%281%20-%20%7By%5E%7B%28i%29%7D%7D%29%5Ccos%20%7Bt_0%7D%28%7B%5Ctheta%20%5ET%7D%7Bx%5E%7B%28i%29%7D%7D%29%7D%20%5D%20%2B%20%5Cfrac%7B1%7D%7B2%7D%5Csum%5Climits_%7Bj%20%3D%201%7D%5E%7B%5Ctext%7Bn%7D%7D%20%7B%5Ctheta%20_j%5E2%7D%20)
- Finally, we want ![\mathop{\min}\limits_\theta J(\theta)](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%5Cmathop%20%7B%5Cmin%20%7D%5Climits_%5Ctheta%20%20J%28%5Ctheta%20%29)
+ Finally, we want ![\mathop{\min}\limits_\theta J(\theta)](http://latex.codecogs.com/gif.latex?%5Clarge%20%5Cmathop%20%7B%5Cmin%20%7D%5Climits_%5Ctheta%20J%28%5Ctheta%20%29)
- Recall the cost function we used earlier in logistic regression:
![J(\theta) = -\frac{1}{m}\sum\limits_{i=1}^m [y^{(i)}\log(h_\theta(x^{(i)})) + (1 - y^{(i)})\log(1 - h_\theta(x^{(i)}))] + \frac{\lambda}{2m}\sum\limits_{j=1}^n \theta_j^2](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5Ctheta%20%29%20%3D%20%20-%20%5Cfrac%7B1%7D%7Bm%7D%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5B%7By%5E%7B%28i%29%7D%7D%5Clog%20%28%7Bh_%5Ctheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%20%2B%20%281%20-%20%7D%20%7By%5E%7B%28i%29%7D%7D%29%5Clog%20%281%20-%20%7Bh_%5Ctheta%20%7D%28%7Bx%5E%7B%28i%29%7D%7D%29%5D%20%2B%20%5Cfrac%7B%5Clambda%20%7D%7B%7B2m%7D%7D%5Csum%5Climits_%7Bj%20%3D%201%7D%5En%20%7B%5Ctheta%20_j%5E2%7D%20)
We can regard `C` here as ![C = \frac{m}{\lambda}](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=C%20%3D%20%5Cfrac%7Bm%7D%7B%5Clambda%20%7D); the two objectives differ only in how they are written. The larger the value of `C`, the larger the `margin` of the SVM decision boundary, as explained below.
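To make the objective concrete, here is a minimal Python sketch of the SVM cost above. It assumes the usual piecewise-linear definitions cost1(z) = max(0, 1 - z) and cost0(z) = max(0, 1 + z) from the course these notes follow; the function names and the convention of not regularizing `theta_0` are illustrative assumptions, not taken from the repository's code.

```
import numpy as np

def cost1(z):
    return np.maximum(0, 1 - z)   # assumed hinge piece: penalizes theta^T x < 1 when y = 1

def cost0(z):
    return np.maximum(0, 1 + z)   # assumed hinge piece: penalizes theta^T x > -1 when y = 0

def svm_cost(theta, X, y, C):
    # J(theta) = C * sum_i[y_i*cost1(theta^T x_i) + (1-y_i)*cost0(theta^T x_i)] + 1/2 * sum_j theta_j^2
    z = X @ theta                                 # theta^T x^(i) for all examples at once
    hinge = y * cost1(z) + (1 - y) * cost0(z)     # per-example cost, with y in {0, 1}
    reg = 0.5 * np.sum(theta[1:] ** 2)            # regularize theta_1..theta_n only (assumption)
    return C * np.sum(hinge) + reg
```

For example, `svm_cost(np.zeros(X.shape[1]), X, y, C=1.0)` evaluates the objective at `θ = 0`, giving `C * m`, since every example incurs a unit hinge cost at `z = 0`.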
@@ -699,9 +699,10 @@ def predict(Theta1,Theta2,X):
- For a given `x`, compute `f`, and let ![f_0^{(i)} = 1](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=f_0%5E%7B%28i%29%7D%20%3D%201), so that ![f^{(i)} \in R^{m+1}](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%7Bf%5E%7B%28i%29%7D%7D%20%5Cin%20%7BR%5E%7Bm%20%2B%201%7D%7D)
- Minimize `J` to solve for `θ`:
![J(\theta) = C\sum\limits_{i=1}^m [y^{(i)}\text{cost}_1(\theta^T f^{(i)}) + (1 - y^{(i)})\text{cost}_0(\theta^T f^{(i)})] + \frac{1}{2}\sum\limits_{j=1}^n \theta_j^2](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=J%28%5Ctheta%20%29%20%3D%20C%5Csum%5Climits_%7Bi%20%3D%201%7D%5Em%20%7B%5B%7By%5E%7B%28i%29%7D%7D%5Ccos%20%7Bt_1%7D%28%7B%5Ctheta%20%5ET%7D%7Bf%5E%7B%28i%29%7D%7D%29%20%2B%20%281%20-%20%7By%5E%7B%28i%29%7D%7D%29%5Ccos%20%7Bt_0%7D%28%7B%5Ctheta%20%5ET%7D%7Bf%5E%7B%28i%29%7D%7D%29%7D%20%5D%20%2B%20%5Cfrac%7B1%7D%7B2%7D%5Csum%5Climits_%7Bj%20%3D%201%7D%5E%7B%5Ctext%7Bn%7D%7D%20%7B%5Ctheta%20_j%5E2%7D%20)
- - If ![\theta^T f \geqslant 0](http://chart.apis.google.com/chart?cht=tx&chs=1x0&chf=bg,s,FFFFFF00&chco=000000&chl=%7B%5Ctheta%20%5ET%7Df%20%5Cgeqslant%200), then predict `y=1`
+ - If ![\theta^T f \geqslant 0](http://latex.codecogs.com/gif.latex?%5Clarge%20%7B%5Ctheta%20%5ET%7Df%20%5Cgeqslant%200), then predict `y=1`, as in the sketch below
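Here is a minimal sketch of this kernelized prediction step, assuming the Gaussian similarity used to build `f` earlier in these notes, with every training example acting as a landmark (which is why f ∈ R^{m+1}). The names `gaussian_features`, `predict_svm`, and the parameter `sigma` are illustrative, not from the repository's code.

```
import numpy as np

def gaussian_features(x, landmarks, sigma):
    # f_l = exp(-||x - l^(l)||^2 / (2*sigma^2)) for each landmark l^(l)
    sq_dists = np.sum((landmarks - x) ** 2, axis=1)
    f = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return np.concatenate(([1.0], f))             # prepend f_0 = 1, so f lies in R^(m+1)

def predict_svm(theta, x, landmarks, sigma):
    f = gaussian_features(x, landmarks, sigma)
    return 1 if theta @ f >= 0 else 0             # predict y=1 exactly when theta^T f >= 0
```

Here `landmarks` would be the `m x n` matrix of training examples, so `theta` has `m + 1` components.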
### 4. SVM model code using `scikit-learn`
+ - [Full code](/SVM/SVM_scikit-learn.py)
- For linearly separable data, specify the kernel function as `linear`:
```
'''data1: linear classification'''
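# A minimal sketch of the linear-kernel case (illustrative, not the repository's
# original code): it assumes a MATLAB file 'data1.mat' with keys 'X' and 'y',
# matching the data-loading convention used earlier in these notes.
import numpy as np
from scipy import io as spio
from sklearn import svm

data1 = spio.loadmat('data1.mat')          # assumed file name and keys
X = data1['X']
y = data1['y'].ravel()                     # flatten (m,1) labels to shape (m,) for scikit-learn

model = svm.SVC(C=1.0, kernel='linear')    # linear kernel for linearly separable data
model.fit(X, y)
print(model.score(X, y))                   # mean training accuracy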