Ridge Regression (February 13, 2017)


Suppose that for a known matrix $A$ and vector $b$, we wish to find a vector $x$ such that

$$A x = b.$$

The ridge regression approach seeks to minimize the sum of squared residuals together with a regularization term:

$$\min_x \; \|A x - b\|^2 + \|\Gamma x\|^2,$$

where $\Gamma$ is the Tikhonov matrix (here $\Gamma = \alpha I$).

An explicit solution is given by

$$\hat{x} = (A^\top A + \Gamma^\top \Gamma)^{-1} A^\top b.$$
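This closed form follows from setting the gradient of the objective to zero:

$$\nabla_x \left( \|A x - b\|^2 + \|\Gamma x\|^2 \right) = 2 A^\top (A x - b) + 2 \Gamma^\top \Gamma x = 0 \quad\Longrightarrow\quad (A^\top A + \Gamma^\top \Gamma)\,\hat{x} = A^\top b.$$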

import numpy as np
import ml_metrics as mtr

# prepare data: random design matrix, standardized target (zero mean, unit variance)
n_samples, n_features = 10, 5
np.random.seed(0)
X = np.random.randn(n_samples, n_features)
y = np.random.randn(n_samples)
y = (y - np.mean(y)) / np.std(y)

# ridge regression implementation
def ridge_regression(X, y, alpha):
    # Tikhonov matrix: Gamma = alpha * I, so Gamma^T Gamma = alpha^2 * I
    tik_mat = alpha * np.identity(X.shape[1])
    # closed-form solution: coef = (X^T X + Gamma^T Gamma)^{-1} X^T y
    coef = np.linalg.inv(X.T.dot(X) + tik_mat.T.dot(tik_mat))
    coef = coef.dot(X.T).dot(y)
    return coef

# train: alpha = 0 reduces to ordinary least squares
coef = ridge_regression(X, y, 0)
print(mtr.mse(np.dot(X, coef), y))
print(coef)

#0.677751350808
#[-0.30898281  0.02387927 -0.04666003 -0.2501281   0.16215742]

# scikit-learn implementation for comparison
from sklearn.linear_model import Ridge
r = Ridge(alpha=0, fit_intercept=False)  # no intercept needed: y is standardized
r.fit(X, y)
print(mtr.mse(r.predict(X), y))
print(r.coef_)

#0.677751350808
#[-0.30898281  0.02387927 -0.04666003 -0.2501281   0.16215742]
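
One caveat: with $\Gamma = \alpha I$, the hand-rolled solver penalizes $\alpha^2 \|x\|^2$, whereas scikit-learn's Ridge penalizes $\alpha \|w\|^2$, so the two only coincide at $\alpha = 0$ as tested above. For a nonzero penalty they can be reconciled by passing $\sqrt{\alpha}$ to ridge_regression. A minimal check (the penalty value 1.0 is an arbitrary choice):

alpha = 1.0
coef = ridge_regression(X, y, np.sqrt(alpha))  # Gamma^T Gamma = alpha * I
r = Ridge(alpha=alpha, fit_intercept=False)
r.fit(X, y)
print(np.allclose(coef, r.coef_))  # expected: True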

 
