`BaseMixer` class.
To learn more about Lightwood philosophy, follow this link.
The supported accuracy functions include `mean_absolute_error`, `mean_squared_error`, `precision_score`, `recall_score`, and `f1_score`. You can define them in the `USING` clause of the `CREATE MODEL` statement.
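As a minimal sketch (assuming the `accuracy_functions` key is accepted in the `USING` clause; the integration, table, column, and model names below are placeholders):

```sql
CREATE MODEL mindsdb.churn_predictor              -- hypothetical model name
FROM example_db                                   -- hypothetical data source
    (SELECT * FROM demo_data.customer_churn)
PREDICT churn
USING
    engine = 'lightwood',                         -- optional: Lightwood is the default engine
    accuracy_functions = ['f1_score'];            -- evaluate the model with f1_score
```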
By default, the following accuracy functions are used:

- the `r2_score` value for regression predictions,
- the `balanced_accuracy_score` value for classification predictions,
- the `complementary_smape_array_accuracy` value for time series predictions.

You can check which accuracy function a model used with the `DESCRIBE` statement.
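For example (the model name is a placeholder):

```sql
-- General model overview: status, accuracy, and training options.
DESCRIBE mindsdb.churn_predictor;

-- Lightwood models may expose additional attributes, for example:
-- DESCRIBE mindsdb.churn_predictor.model;   -- candidate mixers and which one was selected
-- DESCRIBE mindsdb.churn_predictor.jsonai;  -- the full generated JsonAI configuration
```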
The `USING` statement provides an option to configure specific parameters of the training process.
In the upcoming version of MindsDB, it will be possible to choose from more ML frameworks. Please note that the Lightwood engine is used by default.
`encoders` Key

To learn more about the `encoders` and their options, visit the Lightwood documentation page on encoders.
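For illustration, here is a minimal sketch of assigning an encoder to a specific column via dot notation in the `USING` clause (the integration, table, column, and model names are placeholders):

```sql
CREATE MODEL mindsdb.my_model
FROM my_integration
    (SELECT * FROM my_table)
PREDICT my_target
USING
    -- use a categorical autoencoder for one of the input columns
    encoders.my_category_column.module = 'CategoricalAutoEncoder';
```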
`model` Key

| Model | Description |
| --- | --- |
| BaseMixer | It is a base class for all mixers. |
| LightGBM | This mixer configures and uses LightGBM for regression or classification tasks depending on the problem definition. |
| LightGBMArray | This mixer consists of several LightGBM mixers in regression mode, aimed at time series forecasting tasks. |
| NHitsMixer | This mixer is a wrapper around an MQN-HITS deep learning model. |
| Neural | This mixer trains a fully connected dense network from concatenated encoded outputs of each feature in the dataset to predict the encoded output. |
| NeuralTs | This mixer inherits from the Neural mixer and should be used for time series forecasts. |
| ProphetMixer | This mixer is a wrapper around the popular time series library Prophet. |
| RandomForest | This mixer supports both regression and classification tasks. It inherits from sklearn.ensemble.RandomForestRegressor and sklearn.ensemble.RandomForestClassifier. |
| Regression | This mixer inherits from scikit-learn’s Ridge class. |
| SkTime | This mixer is a wrapper around the popular time series library sktime. |
| Unit | This is a special mixer that passes along whatever prediction is made by the target encoder without modifications. It is used for single-column predictive scenarios that may involve complex and/or expensive encoders (e.g. free-form text classification with transformers). |
| XGBoostMixer | This mixer is a good all-rounder, due to the generally great performance of tree-based ML algorithms for supervised learning tasks with tabular data. |
To learn more about the `model` options, visit the Lightwood documentation page on mixers.
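Similarly, here is a sketch of selecting a particular mixer through the `model.args` key (the names and JSON value below are illustrative):

```sql
CREATE MODEL mindsdb.my_model
FROM my_integration
    (SELECT * FROM my_table)
PREDICT my_target
USING
    -- train a single LightGBM submodel instead of the default set of candidate mixers
    model.args = '{"submodels": [{"module": "LightGBM",
                                  "args": {"stop_after": 12, "fit_on_dev": true}}]}';
```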
Lightwood supports other keys besides the `encoders` and `model` keys explained above. To see all the available keys, check out the Lightwood documentation page on JsonAI.
In the example below, we use the `home_rentals` dataset and specify particular `encoders` for some columns and a LightGBM `model`.
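A sketch of such a statement is shown below; it assumes a demo `home_rentals` table with `location` and `rental_price` columns, and the data source name is a placeholder:

```sql
CREATE MODEL mindsdb.home_rentals_model
FROM example_db
    (SELECT * FROM demo_data.home_rentals)
PREDICT rental_price
USING
    -- column-specific encoders
    encoders.location.module = 'CategoricalAutoEncoder',
    encoders.rental_price.module = 'NumericEncoder',
    -- train a LightGBM submodel
    model.args = '{"submodels": [{"module": "LightGBM",
                                  "args": {"stop_after": 12, "fit_on_dev": true}}]}';
```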