Commit | Message | Date
f8c0087ff3 | Find closest centroids | 2014-11-17 23:01:44 +01:00
229023b69c | Add exercise 7 | 2014-11-17 21:58:10 +01:00
348d6325cb | Email feature extraction | 2014-11-13 23:50:52 +01:00
f0d4b4d208 | Preprocess email | 2014-11-13 23:47:04 +01:00
203cbc997c | Implement grid search and determine best parameters for C and sigma | 2014-11-13 23:34:04 +01:00
e67166bc8e | Implement Gaussian kernel | 2014-11-13 22:59:48 +01:00
7ab47a4d35 | Add exercise 6 | 2014-11-10 23:44:39 +01:00
2ab445f9a8 | Compute the test set error | 2014-11-06 13:49:17 +01:00
78830aaea7 | Validation curve function | 2014-11-06 13:31:44 +01:00
3751214442 | Use lambda=1 for regularizing the polynomial fit | 2014-11-06 12:23:09 +01:00
717ea8c788 | Polynomial feature mapping | 2014-11-06 12:13:13 +01:00
1cc58802eb | Learning curve function | 2014-11-06 12:04:53 +01:00
90f2928cee | Move .gitignore to top-level directory | 2014-11-06 01:19:29 +01:00
2d6da3e3d4 | Regularized linear regression gradient | 2014-11-06 01:12:38 +01:00
6530916642 | Regularized linear regression cost function | 2014-11-06 00:53:49 +01:00
d93d111106 | Add programming exercise 5 | 2014-11-05 11:20:50 +01:00
eccdcc0d81 | Regularized NN gradient | 2014-11-02 13:48:34 +01:00
bdecab8cf8 | Implement back propagation | 2014-11-02 13:27:11 +01:00
052f0625c3 | Random initialization | 2014-11-01 21:22:34 +01:00
863f1d7157 | Compute the sigmoid gradient | 2014-11-01 21:18:32 +01:00
395c5676dc | Add regularization to the cost function | 2014-11-01 20:42:58 +01:00
f2154a8cc1 | Compute cost function for the neural network | 2014-11-01 20:30:58 +01:00
be6f3cbdef | Move PDFs to top-level directory | 2014-11-01 14:54:47 +01:00
a17f47e396 | Add programming exercise 4 | 2014-11-01 14:54:22 +01:00
073fbf0204 | Add neural network prediction function | 2014-10-23 23:36:15 +02:00
3bf3d9fdc3 | Add prediction function for one-vs-all classification | 2014-10-23 22:24:55 +02:00
cf0d25440c | Train num_labels one-vs-all logistic regression classifiers | 2014-10-23 21:17:20 +02:00
9117809537 | Vectorized regularized logistic regression, again | 2014-10-21 21:20:26 +02:00
326a924044 | Add exercise 3 | 2014-10-21 20:59:55 +02:00
f9243ef593 | Simplify regularization term | 2014-10-16 01:03:58 +02:00
9e9b9990bb | Compute the gradient for regularized logistic regression | 2014-10-15 19:54:07 +02:00
f391ac661e | Add cost function for regularized logistic regression | 2014-10-15 10:10:30 +02:00
224a17e4d3 | Plot negative samples in red | 2014-10-14 10:19:09 +02:00
c0b4d95f75 | Predict | 2014-10-14 10:14:29 +02:00
31c4ac1967 | Compute the gradient for logistic regression | 2014-10-14 07:46:10 +02:00
8c100a3f49 | Compute the cost function for logistic regression | 2014-10-14 07:40:42 +02:00
b863a3863e | .gitignore ml_login_data.mat | 2014-10-13 23:15:15 +02:00
efe94282b4 | Compute the sigmoid function | 2014-10-13 23:14:44 +02:00
d52fa95328 | Plot the data | 2014-10-13 23:03:23 +02:00
580e52ddc5 | Initial commit of ex2 | 2014-10-13 22:56:53 +02:00
ba377e29ba | Linear regression using the normal equation | 2014-10-02 22:48:53 +02:00
ad9ab582de | Gradient descent for multiple features | 2014-10-02 22:33:18 +02:00
3dc8897634 | Compute cost for multiple variables | 2014-10-02 22:32:56 +02:00
e38033c00d | Clean up whitespace | 2014-10-02 22:32:37 +02:00
8559c243c5 | Normalize features | 2014-10-02 22:20:44 +02:00
613220bb3e | Clean up whitespace | 2014-10-02 22:20:31 +02:00
38064db8b3 | Print out the cost function J | 2014-10-02 22:08:08 +02:00
2bf076c2fc | Gradient descent for one variable | 2014-10-02 21:32:24 +02:00
514431e135 | .gitignore login data | 2014-10-01 22:06:15 +02:00
14c85aa85c | Compute cost (for one variable) | 2014-10-01 22:05:13 +02:00