Add L1-L2 (ElasticNet) regularization
nizhib committed May 10, 2015
1 parent 9390d63 commit 9a08390
Showing 2 changed files with 8 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/sources/regularizers.md
@@ -15,3 +15,4 @@ model.add(Dense(64, 64, W_regularizer = l2(.01)))

 - __l1__(l=0.01): L1 regularization penalty, also known as LASSO
 - __l2__(l=0.01): L2 regularization penalty, also known as weight decay, or Ridge
+- __l1l2__(l1=0.01, l2=0.01): L1-L2 regularization penalty, also known as ElasticNet
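For reference, a minimal usage sketch of the new option, mirroring the existing `l2` example shown in the hunk context above. The `Sequential` model, the 2015-era `Dense(input_dim, output_dim, ...)` signature, and the import paths are assumptions about Keras of that period, not part of this diff:

```python
# Hypothetical usage sketch; import paths and the Dense signature reflect
# 2015-era Keras and are assumptions, not part of this commit.
from keras.models import Sequential
from keras.layers.core import Dense
from keras.regularizers import l1l2

model = Sequential()
# Apply both L1 and L2 penalties (ElasticNet) to the layer weights,
# following the W_regularizer pattern used by the l2 example above.
model.add(Dense(64, 64, W_regularizer=l1l2(l1=0.01, l2=0.01)))
```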
7 changes: 7 additions & 0 deletions keras/regularizers.py
@@ -15,5 +15,12 @@ def l2wrap(g, p):
         return g
     return l2wrap

+def l1l2(l1=.01, l2=.01):
+    def l1l2wrap(g, p):
+        g += T.sgn(p) * l1
+        g += p * l2
+        return g
+    return l1l2wrap
+
 def identity(g, p):
     return g
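The new `l1l2` folds the ElasticNet penalty into the incoming gradient `g`: `T.sgn(p) * l1` is the (sub)gradient of the L1 term `l1 * |p|`, and `p * l2` is the gradient of the L2 term `(l2 / 2) * p**2`. A self-contained sketch of the same arithmetic, with NumPy standing in for Theano (`np.sign` in place of `T.sgn`) and illustrative values:

```python
# Standalone sketch of the added regularizer's arithmetic; NumPy stands in
# for Theano tensors here, and the sample values are illustrative only.
import numpy as np

def l1l2(l1=0.01, l2=0.01):
    def l1l2wrap(g, p):
        g = g + np.sign(p) * l1  # (sub)gradient of l1 * |p|
        g = g + p * l2           # gradient of (l2 / 2) * p**2
        return g
    return l1l2wrap

p = np.array([-2.0, 0.5, 3.0])  # parameter values
g = np.zeros_like(p)            # upstream gradient, zero for clarity
print(l1l2(0.01, 0.01)(g, p))   # -> [-0.03  0.015  0.04]
```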
