Preliminary study on ridge regression and LASSO regression (implemented in python)

As a rookie who has studied artificial neural networks, I only heard about ridge regression estimation and LASSO estimation two days ago, when the statistics teacher assigned homework on them. I didn't understand them very well, but wrote an implementation in Python anyway. I don't know whether it is correct, so please don't hesitate to point out anything inappropriate.

The task is as follows:

Given x and y, estimate the coefficients with ridge estimation and LASSO estimation respectively, and use MSE to evaluate the estimation results.

Personal understanding:

When there is no data at all, some data obviously needs to be generated randomly. And if the coefficients b and a are known, then for any given x a corresponding y can be generated.

Therefore:

Step 1: generate a batch of true coefficients b and use them to compute y; these serve as the ground truth to compare against the estimates later.

Step 2: given x and y, use ridge estimation and LASSO estimation to produce estimates of b.

(well, it seems like nonsense)

Ridge estimation and LASSO estimation correspond to the ridge regression and LASSO methods, respectively.

For the theoretical knowledge of ridge regression and LASSO, please refer to the following links:

Machine learning - week 3, ridge regression and LASSO (personally, I think it's very good)

Next, after grasping at least the surface of the theory, we can start implementing the code.

(Actually, this is not a good habit: you should understand the theory thoroughly before writing any code, because a wrong idea at the start can force the whole implementation to be thrown away.)

The overall idea of the implementation:

# Problem description: model y = x*b + a. Given x and y, estimate b.
# Overall idea:
      # 0. Initialize and set global parameters: n is the total amount of data, dim is the dimension
      # 1. Randomly generate the n*dim matrix x and the dim*1 coefficients b (plus noise a), and compute y
      # 2. Randomly generate candidate model parameters lambda, fit the regression model, and plot the model coefficients against lambda
      # 3. Determine the lambda value of the optimal regression model by cross-validation
      # 4. Build the model with the optimal lambda and, given x and y, obtain the estimated coefficients ridge_B / lasso_B
      # 5. Evaluate the MSE between the true coefficients b and the model coefficients

The code snippets are as follows (a link to the full code is provided at the end):

1. Initialization: randomly generate x of dimension n*dim, coefficients b of dimension dim*1, and noise a of dimension n*1; compute y and return x, y, b

import numpy as np
from sklearn.linear_model import Ridge, RidgeCV, Lasso, LassoCV
from sklearn.metrics import mean_squared_error

def ini_data(n, dim):
    '''
    Initialize the data
    :param n: total number of samples
    :param dim: parameter dimension
    :return: x, y, b (true coefficients)
    '''
    x = np.random.uniform(-100, 100, (n, dim))
    b = np.random.rand(dim)
    a = np.random.rand(n)       # note: a is a length-n noise vector, not a scalar intercept
    y = np.matmul(x, b) + a
    return x, y, b
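One detail worth flagging in the data generation: `a` is drawn as a length-n vector, so it acts as per-sample uniform noise rather than a single intercept term. A self-contained shape check of the same generation logic:

```python
import numpy as np

n, dim = 500, 10
x = np.random.uniform(-100, 100, (n, dim))  # n samples, dim features
b = np.random.rand(dim)                     # true coefficients, one per feature
a = np.random.rand(n)                       # per-sample noise in [0, 1), NOT a scalar intercept
y = np.matmul(x, b) + a                     # broadcasting: (n,dim)@(dim,) + (n,) -> (n,)
print(x.shape, y.shape, b.shape)
```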

2. Ridge regression model

def reg_model_Ridge(x, y, alphas, dim):
    '''
    Ridge regression estimation
    :param x: design matrix
    :param y: response vector
    :param alphas: candidate model parameters lambda
    :param dim: dimension
    :return: ridge_B, coefficients of the optimal model
    '''
    model_coff = []
    for alpha in alphas:
        # note: normalize=True was removed in scikit-learn 1.2;
        # on newer versions, standardize x beforehand instead
        ridge = Ridge(alpha=alpha, normalize=True)
        ridge.fit(x, y)
        model_coff.append(ridge.coef_)
    # if dim <= 10:
    #     plot_data(alphas, model_coff, 'log alpha', 'coefficients',
    #               'ridge coefficients vs. alpha, dim=' + str(dim))
    # Cross-validation to find the optimal lambda value of the model
    ridge_cv = RidgeCV(alphas=alphas, normalize=True, scoring="neg_mean_absolute_error", cv=5)
    ridge_cv.fit(x, y)
    ridge_best_lambda = ridge_cv.alpha_
    # Build the optimal model
    ridge = Ridge(alpha=ridge_best_lambda, normalize=True)
    ridge.fit(x, y)
    # Coefficients of the optimal model
    ridge_B = ridge.coef_
    return ridge_B
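For intuition (and as a rough cross-check of the library result), ridge regression also has a closed-form solution, beta = (X^T X + lambda*I)^(-1) X^T y. A minimal sketch without an intercept and without scikit-learn's normalization, so its output will only approximately match what `Ridge` returns:

```python
import numpy as np

np.random.seed(0)
n, dim, lam = 200, 5, 1.0
x = np.random.randn(n, dim)
b_true = np.random.rand(dim)
y = x @ b_true + 0.01 * np.random.randn(n)

# Closed-form ridge estimate: beta = (X^T X + lam*I)^{-1} X^T y.
# With lam small relative to n, beta should be close to b_true.
beta = np.linalg.solve(x.T @ x + lam * np.eye(dim), x.T @ y)
print(np.max(np.abs(beta - b_true)))
```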

3. LASSO model

def reg_model_LASSO(x, y, alphas, dim):
    '''
    LASSO regression estimation
    :param x: design matrix
    :param y: response vector
    :param alphas: candidate model parameters lambda
    :param dim: dimension
    :return: lasso_B, coefficients of the optimal model
    '''
    model_coff = []
    for alpha in alphas:
        # note: normalize=True was removed in scikit-learn 1.2;
        # on newer versions, standardize x beforehand instead
        lasso = Lasso(alpha=alpha, normalize=True)
        lasso.fit(x, y)
        model_coff.append(lasso.coef_)
    # if dim <= 10:
    #     plot_data(alphas, model_coff, 'log alpha', 'coefficients',
    #               'LASSO coefficients vs. alpha, dim=' + str(dim))
    # Cross-validation to find the optimal lambda value of the model
    lasso_cv = LassoCV(alphas=alphas, normalize=True, max_iter=1000, cv=5)
    lasso_cv.fit(x, y)
    lasso_best_lambda = lasso_cv.alpha_
    # Build the optimal model
    lasso = Lasso(alpha=lasso_best_lambda, normalize=True)
    lasso.fit(x, y)
    # Coefficients of the optimal model
    lasso_B = lasso.coef_
    return lasso_B
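The qualitative difference between the two methods shows up in the soft-thresholding operator that sits inside LASSO's coordinate-descent updates: it sets small coefficients exactly to zero (sparsity), whereas ridge only shrinks them toward zero. A minimal sketch of that operator:

```python
import numpy as np

def soft_threshold(z, lam):
    # Shrinks z toward zero by lam; any value with |z| <= lam becomes exactly 0.
    # This is why LASSO yields sparse coefficient vectors while ridge does not.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

out = soft_threshold(np.array([-3.0, -0.5, 0.2, 2.0]), 1.0)
print(out)
```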

explain:

Ridge regression and LASSO regression are already implemented in the Python package sklearn.linear_model, together with the corresponding cross-validation models.

The two functions follow the same structure. First, a batch of alpha values is used to plot the relationship between alpha and the model coefficients (why? I don't understand this part well yet; I hope an expert can give some advice). Then the alpha value of the optimal model is determined by cross-validation. Finally, the model is fitted with the optimal alpha, and the coefficients of that model (i.e., the estimated b values) are obtained.

Note:

(1) We need to generate a batch of candidate alpha values for the model, because in many cases the best alpha has to be chosen experimentally. Each specific alpha value yields a different model.

(2) The models are used in the same way as when training a neural network: the fit() function feeds data into the model, and fitted parameter values are read from model attributes with ".", e.g. ridge.coef_.

(3) The ridge cross-validation class (RidgeCV) has a scoring parameter, but LassoCV does not.

(4) At run time, scoring="mean_absolute_error" causes an error; it should be changed to scoring="neg_mean_absolute_error".

5. Run the code (the assignment requires comparing several data dimensions, hence the for loop)

def run_fun():
    n = 500  # number of samples
    dims = [10, 50, 100, 200]
    for dim in dims:
        # alphas = 10 ** np.random.uniform(-5, 5, dim)
        alphas = 10 ** np.linspace(-5, 5, dim)
        x, y, b = ini_data(n, dim)
        ridge_B = reg_model_Ridge(x, y, alphas, dim)
        # np.sqrt of the MSE is the RMSE, so label it accordingly
        ridge_rmse = np.sqrt(mean_squared_error(ridge_B, b))
        print("----------Dimension:", dim, "---------")
        print("Ridge regression RMSE:", ridge_rmse)
        lasso_B = reg_model_LASSO(x, y, alphas, dim)
        lasso_rmse = np.sqrt(mean_squared_error(lasso_B, b))
        print("LASSO RMSE:", lasso_rmse)

run_fun()

6. It's over, that's it.

Code link: https://download.csdn.net/download/u014769320/12890422

The result of the operation is as follows:

 

Tags: Python Machine Learning

Posted by LostKID on Sat, 14 May 2022 08:13:32 +0300