Module connectome.models.lgb
Wrapper for gradient boosting classification and regression models.
Classes
class GB (features: Union[numpy.ndarray, pandas.core.frame.DataFrame], target: numpy.ndarray, feature_names: list = None, classification: bool = True, fit_directly: bool = True, dart: bool = False, **kwargs)

Wrapper class for gradient boosting classification and regression models.
Examples:
>>> from connectome.models.lgb import GB
>>> from sklearn.datasets import make_classification, make_regression
>>> import pandas as pd
>>> import numpy as np
>>>
>>> # create synthetic data
>>> X, y = make_classification(n_informative=15)
>>> X = pd.DataFrame(
...     X,
...     columns=["feature_" + str(i)
...              for i in range(X.shape[1])])
>>> X_regr, y_regr = make_regression(n_features=20, n_informative=15)
>>> X_regr = pd.DataFrame(
...     X_regr,
...     columns=["feature_" + str(i)
...              for i in range(X_regr.shape[1])])
>>>
>>> # initialize some models
>>> gb_class = GB(X, y, classification=True)
>>> gb_regr = GB(X_regr, y_regr, classification=False)
>>>
>>> # get feature importances
>>> print(gb_class.get_feature_importances())
>>> print(gb_regr.get_feature_importances())

Methods
def fit(self, **kwargs)

def get_feature_importances(self)

Method to get the ordered feature importances in the form of a DataFrame.

Returns:
    pandas.DataFrame of feature importances, ordered from most to least important.
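A short sketch of deferred fitting, assuming that fit_directly=False skips fitting at construction time and that keyword arguments passed to fit are forwarded to the underlying booster's training routine; both points are inferences from the signatures above, not confirmed by this documentation.

>>> # build the wrapper without fitting immediately (assumes fit_directly controls this)
>>> gb_lazy = GB(X, y, classification=True, fit_directly=False)
>>> # fit later; extra keyword arguments are assumed to be forwarded to the underlying model
>>> gb_lazy.fit()
>>> print(gb_lazy.get_feature_importances().head())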
def predict(self, inputs: Union[numpy.ndarray, pandas.core.frame.DataFrame]) ‑> numpy.ndarray

def predict_proba(self, inputs) ‑> numpy.ndarray

def save_model(self, name: str = 'lgb')
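A minimal usage sketch for the prediction and persistence methods, continuing from the classification example above; the exact on-disk output of save_model (file name and format) is an assumption based on its name argument, not confirmed by this documentation.

>>> # continue from the fitted classification model above
>>> preds = gb_class.predict(X)        # predicted class labels, returned as a numpy array
>>> probs = gb_class.predict_proba(X)  # per-class probabilities, returned as a numpy array
>>> print(preds.shape, probs.shape)
>>> # persist the fitted model; 'name' is assumed to control the saved file name
>>> gb_class.save_model(name="lgb_classifier")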