
Regularized linear regression cost function

master
neingeist 10 years ago
parent d93d111106
commit 6530916642

@@ -19,16 +19,8 @@ grad = zeros(size(theta));
% You should set J to the cost and grad to the gradient.
%
J = 1/(2*m) * sum(((X*theta)-y).^2) ...
+ lambda/(2*m) * sum(theta(2:end).^2);
% =========================================================================
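For reference, the added expression is the standard regularized linear regression cost: the squared-error term plus an L2 penalty in which the intercept is not regularized, which is why the Octave code sums over theta(2:end) only. A sketch of the formula, written with 1-based indices to match the code (here n = length(theta), so \theta_1 is the unpenalized intercept):

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \big( (X\theta)_i - y_i \big)^2 + \frac{\lambda}{2m} \sum_{j=2}^{n} \theta_j^2

The diff only sets J; grad is still initialized to zeros in the surrounding context and is not computed in this commit.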