:: Experimental :: Fits a parametric survival regression model, the accelerated failure time (AFT) model (see Accelerated failure time model (Wikipedia)), based on the Weibull distribution of the survival time.
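As a sketch of the objective involved, the Weibull AFT negative log-likelihood with right-censoring can be written out directly. This is an illustration, not Spark's implementation: the function name and parameterization below are assumptions, and the real estimator minimizes an equivalent objective over (coefficients, intercept, log-scale) with a quasi-Newton solver.

```python
import math

def aft_weibull_nll(features, labels, censors, beta, intercept, log_sigma):
    """Negative log-likelihood of a Weibull AFT model with right-censoring.

    censors[i] is 1.0 for an observed event, 0.0 for a right-censored point.
    Illustrative sketch only (hypothetical helper, not the Spark API).
    """
    sigma = math.exp(log_sigma)  # scale parameter, kept positive via log_sigma
    total = 0.0
    for x, t, delta in zip(features, labels, censors):
        # linear predictor for the log of the survival time
        mu = intercept + sum(b * xi for b, xi in zip(beta, x))
        eps = (math.log(t) - mu) / sigma  # standardized log-time residual
        # events contribute log f0(eps) - log sigma; censored points log S0(eps),
        # where f0/S0 are the standard extreme-value density and survival function
        total += delta * (eps - log_sigma) - math.exp(eps)
    return -total
```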
:: Experimental :: Model produced by AFTSurvivalRegression.
Decision tree (Wikipedia) model for regression. It supports both continuous and categorical features.
Decision tree learning algorithm for regression. It supports both continuous and categorical features.
Gradient-Boosted Trees (GBTs) model for regression. It supports both continuous and categorical features.
Gradient-Boosted Trees (GBTs) learning algorithm for regression. It supports both continuous and categorical features.
The implementation is based upon: J.H. Friedman. "Stochastic Gradient Boosting." 1999.
Notes on Gradient Boosting vs. TreeBoost:
- This implementation is for Stochastic Gradient Boosting, not for TreeBoost.
- Both algorithms learn tree ensembles by minimizing loss functions.
- TreeBoost (Friedman, 1999) additionally modifies the outputs at tree leaf nodes based on the loss function, whereas the original gradient boosting method does not.
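To make the residual-fitting loop of stochastic gradient boosting concrete, here is a minimal sketch for squared-error regression using single-feature stumps as base learners. All names and simplifications are mine, not Spark's; each round fits a stump to the current residuals on a random subsample and adds it with a learning rate.

```python
import random

def fit_stump(xs, ys):
    """Best single-threshold regression stump on one feature, by squared error."""
    best_err, best = float("inf"), None
    for thr in sorted(set(xs))[:-1]:  # last value would leave the right side empty
        left = [y for x, y in zip(xs, ys) if x <= thr]
        right = [y for x, y in zip(xs, ys) if x > thr]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - lm) ** 2 for y in left) + sum((y - rm) ** 2 for y in right)
        if err < best_err:
            best_err, best = err, (thr, lm, rm)
    if best is None:  # all feature values identical: fall back to a constant
        m = sum(ys) / len(ys)
        best = (xs[0], m, m)
    return best

def boost(xs, ys, rounds=50, lr=0.5, subsample=1.0, seed=0):
    """Stochastic gradient boosting for squared error: each stump fits the
    current residuals on a random subsample of the training points."""
    rng = random.Random(seed)
    stumps, preds = [], [0.0] * len(xs)
    for _ in range(rounds):
        idx = [i for i in range(len(xs)) if rng.random() < subsample]
        if not idx:
            continue
        resid = [ys[i] - preds[i] for i in idx]
        thr, lm, rm = fit_stump([xs[i] for i in idx], resid)
        stumps.append((lr, thr, lm, rm))
        preds = [p + lr * (lm if x <= thr else rm) for x, p in zip(xs, preds)]
    return stumps

def predict(stumps, x):
    return sum(lr * (lm if x <= thr else rm) for lr, thr, lm, rm in stumps)
```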
:: Experimental ::
Fits a Generalized Linear Model (see Generalized linear model (Wikipedia)) specified by a symbolic description of the linear predictor (link function) and a description of the error distribution (family). Supported families are "gaussian", "binomial", "poisson", and "gamma". The valid link functions for each family are listed below; the first link function of each family is the default.
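The family/link pairing can be sketched as a small table plus the corresponding mean functions. The table below is transcribed from Spark's documented families and links; the helper names and the dictionary layout are my own illustration, not the library's API.

```python
import math

# Family -> valid link functions; the first entry of each list is the default.
FAMILY_LINKS = {
    "gaussian": ["identity", "log", "inverse"],
    "binomial": ["logit", "probit", "cloglog"],
    "poisson":  ["log", "identity", "sqrt"],
    "gamma":    ["inverse", "identity", "log"],
}

# Mean function (inverse link): maps the linear predictor eta to the mean
# of the response. (probit/cloglog omitted from this sketch for brevity.)
INVERSE_LINK = {
    "identity": lambda eta: eta,
    "log":      math.exp,
    "inverse":  lambda eta: 1.0 / eta,
    "logit":    lambda eta: 1.0 / (1.0 + math.exp(-eta)),
    "sqrt":     lambda eta: eta * eta,
}

def default_link(family):
    """The default link for a family, i.e. the first one listed."""
    return FAMILY_LINKS[family][0]
```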
:: Experimental :: Model produced by GeneralizedLinearRegression.
:: Experimental :: Summary of GeneralizedLinearRegression model and predictions.
:: Experimental :: Summary of GeneralizedLinearRegression fitting and model.
Isotonic regression.
Currently implemented using a parallelized pool adjacent violators algorithm. Only the univariate (single-feature) algorithm is supported.
Model fitted by IsotonicRegression. Predicts using a piecewise linear function.
For detailed rules, see org.apache.spark.mllib.regression.IsotonicRegressionModel.predict().
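A minimal, sequential sketch of the pool adjacent violators step and of piecewise-linear prediction between fitted points. Spark's version is parallelized and exposes its own API for weights and ties; the helper names below are illustrative assumptions.

```python
from bisect import bisect_left

def isotonic_fit(ys, weights=None):
    """Pool Adjacent Violators: the non-decreasing sequence closest to ys
    in weighted least squares. Returns one fitted value per input."""
    if weights is None:
        weights = [1.0] * len(ys)
    blocks = []  # each block: [weighted sum, total weight, count]
    for y, w in zip(ys, weights):
        blocks.append([y * w, w, 1])
        # Merge backwards while a block's mean exceeds its successor's mean.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, w2, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += w2
            blocks[-1][2] += c
    out = []
    for s, w, c in blocks:
        out.extend([s / w] * c)
    return out

def isotonic_predict(xs, fitted, x):
    """Piecewise-linear interpolation between fitted points; constant
    extrapolation outside the training range."""
    if x <= xs[0]:
        return fitted[0]
    if x >= xs[-1]:
        return fitted[-1]
    i = bisect_left(xs, x)
    x0, x1, y0, y1 = xs[i - 1], xs[i], fitted[i - 1], fitted[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```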
Linear regression.
The learning objective is to minimize the squared error, with regularization. The specific squared error loss function used is:
$$ L = \frac{1}{2n} \left\| A \cdot \mathit{coefficients} - y \right\|_2^2 $$
This supports multiple types of regularization:
- none (a.k.a. ordinary least squares)
- L2 (ridge regression)
- L1 (Lasso)
- L2 + L1 (elastic net)
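The squared-error objective with an L2 penalty added can be minimized by plain gradient descent. This is a didactic sketch under assumptions of my own (no intercept, fixed step size, hypothetical function name); the actual optimizer differs and may use closed-form or quasi-Newton solvers depending on settings.

```python
def ridge_gd(X, y, lam=0.1, lr=0.1, iters=2000):
    """Gradient descent on (1/2n) * ||X w - y||^2 + (lam/2) * ||w||^2.

    Didactic sketch only: no intercept, no feature scaling, fixed step size.
    """
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        # residuals r = X w - y
        r = [sum(wj * xj for wj, xj in zip(w, row)) - yi for row, yi in zip(X, y)]
        # gradient g = (1/n) X^T r + lam * w
        g = [sum(r[i] * X[i][j] for i in range(n)) / n + lam * w[j] for j in range(d)]
        w = [wj - lr * gj for wj, gj in zip(w, g)]
    return w
```

With `lam=0.0` this reduces to ordinary least squares; a positive `lam` shrinks the coefficients toward zero.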
Model produced by LinearRegression.
:: Experimental :: Linear regression results evaluated on a dataset.
:: Experimental :: Linear regression training results. Currently, the training summary ignores the training weights except for the objective trace.
Random Forest model for regression. It supports both continuous and categorical features.
Random Forest learning algorithm for regression. It supports both continuous and categorical features.
:: DeveloperApi ::
Model produced by a Regressor.
Type of input features. E.g., org.apache.spark.mllib.linalg.Vector
Concrete Model type.