Regularized linear regression cost function
This commit is contained in:
parent d93d111106
commit 6530916642
1 changed file with 7 additions and 15 deletions
@@ -19,16 +19,8 @@ grad = zeros(size(theta));
% You should set J to the cost and grad to the gradient.
%

J = 1/(2*m) * sum(((X*theta)-y).^2) ...
    + lambda/(2*m) * sum(theta(2:end).^2);

% =========================================================================
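The added Octave line computes the regularized linear regression cost: the usual squared-error term plus an L2 penalty that skips the bias weight (`theta(2:end)` excludes `theta(1)`). A small NumPy sketch can sanity-check that formula; the function name and the toy data below are illustrative, not part of the commit.

```python
import numpy as np

def linear_reg_cost(theta, X, y, lam):
    """Regularized linear regression cost, mirroring the Octave line:
    J = 1/(2m) * sum((X*theta - y)^2) + lambda/(2m) * sum(theta[1:]^2).
    The bias term theta[0] is not regularized."""
    m = y.shape[0]
    residuals = X @ theta - y
    J = residuals @ residuals / (2 * m)          # squared-error term
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)  # penalty, bias excluded
    return J

# Toy example: 3 samples, bias column plus one feature.
X = np.array([[1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([7.0, 6.0, 5.0])
theta = np.array([0.1, 0.2])
print(linear_reg_cost(theta, X, y, lam=1.0))
```

With `lam=0` this reduces to the unregularized cost, which is a quick way to confirm the penalty term is wired up correctly.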