ee.Reducer.ridgeRegression
Creates a reducer that computes a ridge regression with numX independent variables (not including a constant) followed by numY dependent variables. Ridge regression is a form of Tikhonov regularization that shrinks the regression coefficients by imposing a penalty on their size. With this implementation of ridge regression there is no need to include a constant value for the bias term.
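For reference, a common textbook formulation of the ridge objective for a single dependent variable is shown below. This formula is not stated on this page; details such as how `lambda` is scaled internally and whether the intercept is included in the penalty are not specified here.

$$
\hat{\beta} = \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2 + \lambda\,\lVert \beta \rVert_2^2
$$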
The first output is a coefficients array with dimensions (numX + 1, numY); each column contains the coefficients for the corresponding dependent variable, followed by that variable's intercept in the last row. Additional outputs are a vector of the root mean square of the residuals of each dependent variable and a vector of p-values for each dependent variable. Outputs are null if the system is underdetermined, for example when the number of inputs is less than numX + 1.
| Usage | Returns |
|---|---|
| `ee.Reducer.ridgeRegression(numX, numY, lambda)` | Reducer |
| Argument | Type | Details |
|---|---|---|
| `numX` | Integer | The number of independent variables being regressed. |
| `numY` | Integer, default: 1 | The number of dependent variables. |
| `lambda` | Float, default: 0.1 | The regularization parameter. |
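The following Python sketch illustrates one way to apply the reducer over an image collection. It is not taken from this page: the collection, dates, band names, and the assumption that the output bands are named like those of the other linear-regression reducers (`coefficients`, `residuals`, `pValue`) are illustrative and should be verified.

```python
# Illustrative only: per-pixel ridge regression of NDVI (dependent)
# against time in years (independent). Collection, dates, and band
# names are assumptions, not part of the reference above.
import ee

ee.Initialize()

collection = (
    ee.ImageCollection('MODIS/061/MOD13A1')
    .filterDate('2020-01-01', '2021-01-01')
    .select('NDVI')
)

def add_time_band(image):
    # Order bands as numX independent variables followed by numY dependent
    # variables. No constant band is added; the reducer fits the intercept.
    t = image.date().difference(ee.Date('2020-01-01'), 'year')
    return ee.Image.constant(t).float().rename('t').addBands(image.select('NDVI'))

prepared = collection.map(add_time_band)

# numX = 1, numY = 1, lambda = 0.1.
fit = prepared.reduce(ee.Reducer.ridgeRegression(1, 1, 0.1))

# 'coefficients' is a (numX + 1) x numY array image; per the description
# above, the slope comes first and the intercept last in each column.
# (Output band names are assumed to match the other regression reducers.)
coefficients = (
    fit.select('coefficients')
    .arrayProject([0])
    .arrayFlatten([['slope', 'intercept']])
)
print(coefficients.bandNames().getInfo())
```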
[null,null,["Last updated 2024-07-13 UTC."],[[["\u003cp\u003eCreates a reducer for ridge regression, a regularization method that shrinks regression coefficients to prevent overfitting.\u003c/p\u003e\n"],["\u003cp\u003eOutputs include regression coefficients, root mean square of residuals, and p-values for each dependent variable.\u003c/p\u003e\n"],["\u003cp\u003eRequires specifying the number of independent and dependent variables, along with an optional regularization parameter (lambda).\u003c/p\u003e\n"],["\u003cp\u003eThe reducer automatically handles the intercept term, so there's no need to add a constant value for bias.\u003c/p\u003e\n"],["\u003cp\u003eOutputs will be null if the system is underdetermined, meaning there are fewer input data points than independent variables plus one.\u003c/p\u003e\n"]]],[],null,["# ee.Reducer.ridgeRegression\n\nCreates a reducer that computes a ridge regression with numX independent variables (not including constant) followed by numY dependent variables. Ridge regression is a form of Tikhonov regularization which shrinks the regression coefficients by imposing a penalty on their size. With this implementation of ridge regression there NO NEED to include a constant value for bias.\n\n\u003cbr /\u003e\n\nThe first output is a coefficients array with dimensions (numX + 1, numY); each column contains the coefficients for the corresponding dependent variable plus the intercept for the dependent variable in the last column. Additional outputs are a vector of the root mean square of the residuals of each dependent variable and a vector of p-values for each dependent variable. Outputs are null if the system is underdetermined, e.g., the number of inputs is less than numX + 1.\n\n| Usage | Returns |\n|-----------------------------------------------------------|---------|\n| `ee.Reducer.ridgeRegression(numX, `*numY* `, `*lambda*`)` | Reducer |\n\n| Argument | Type | Details |\n|----------|---------------------|------------------------------------------------------|\n| `numX` | Integer | the number of independent variables being regressed. |\n| `numY` | Integer, default: 1 | the number of dependent variables. |\n| `lambda` | Float, default: 0.1 | Regularization parameter. |"]]