Author | Commit | Message | Date
neingeist | beb652a5be | Recover data | 10 years ago
neingeist | 798f82ecc1 | Project data | 10 years ago
neingeist | 2b98bd80f0 | Implement PCA | 10 years ago
neingeist | 5f3f65c69c | Random initialization | 10 years ago
neingeist | 6c51a29ca2 | Compute centroid means (vectorized) | 10 years ago
neingeist | 39b09f144a | Compute centroid means (unvectorized) | 10 years ago
neingeist | f8c0087ff3 | Find closest centroids | 10 years ago
neingeist | 229023b69c | Add exercise 7 | 10 years ago
neingeist | 348d6325cb | Email feature extraction | 10 years ago
neingeist | f0d4b4d208 | Preprocess email | 10 years ago
neingeist | 203cbc997c | Implement grid search and determine best parameters for C and sigma | 10 years ago
neingeist | e67166bc8e | Implement Gaussian kernel | 10 years ago
neingeist | 7ab47a4d35 | Add exercise 6 | 10 years ago
neingeist | 2ab445f9a8 | Compute the test set error | 10 years ago
neingeist | 78830aaea7 | Validation curve function | 10 years ago
neingeist | 3751214442 | Use lambda=1 for regularizing the polynomial fit | 10 years ago
neingeist | 717ea8c788 | Polynomial feature mapping | 10 years ago
neingeist | 1cc58802eb | Learning curve function | 10 years ago
neingeist | 90f2928cee | Move .gitignore to top-level directory | 10 years ago
neingeist | 2d6da3e3d4 | Regularized linear regression gradient | 10 years ago
neingeist | 6530916642 | Regularized linear regression cost function | 10 years ago
neingeist | d93d111106 | Add programming exercise 5 | 10 years ago
neingeist | eccdcc0d81 | Regularized NN gradient | 10 years ago
neingeist | bdecab8cf8 | Implement back propagation | 10 years ago
neingeist | 052f0625c3 | Random initialization | 10 years ago
neingeist | 863f1d7157 | Compute the sigmoid gradient | 10 years ago
neingeist | 395c5676dc | Add regularization to the cost function | 10 years ago
neingeist | f2154a8cc1 | Compute cost function for the neural network | 10 years ago
neingeist | be6f3cbdef | Move PDFs in top directory | 10 years ago
neingeist | a17f47e396 | Add programming exercise 4 | 10 years ago
neingeist | 073fbf0204 | Add neural network prediction function | 10 years ago
neingeist | 3bf3d9fdc3 | Add prediction function for one-vs-all classification | 10 years ago
neingeist | cf0d25440c | Train num_labels one-vs-all logistic regression classifiers | 10 years ago
neingeist | 9117809537 | Vectorized regularized logistic regression, again | 10 years ago
neingeist | 326a924044 | Add exercise 3 | 10 years ago
neingeist | f9243ef593 | Simplify regularization term | 10 years ago
neingeist | 9e9b9990bb | Compute the gradient for regularized logistic regression | 10 years ago
neingeist | f391ac661e | Add cost function for regularized logistic regression | 10 years ago
neingeist | 224a17e4d3 | Plot negative samples in red | 10 years ago
neingeist | c0b4d95f75 | Predict | 10 years ago
neingeist | 31c4ac1967 | Compute the gradient for logistic regression | 10 years ago
neingeist | 8c100a3f49 | Compute the cost function for logistic regression | 10 years ago
neingeist | b863a3863e | .gitignore ml_login_data.mat | 10 years ago
neingeist | efe94282b4 | Compute the sigmoid function | 10 years ago
neingeist | d52fa95328 | Plot the data | 10 years ago
neingeist | 580e52ddc5 | Initial commit of ex2 | 10 years ago
neingeist | ba377e29ba | Linear regressing using the normal equation | 10 years ago
neingeist | ad9ab582de | Gradient descent for multiple features | 10 years ago
neingeist | 3dc8897634 | Compute cost for multiple variables | 10 years ago
neingeist | e38033c00d | Clean up whitespace | 10 years ago