

python-scikit-learn: Feature importance example


For tree-based models (decision trees, random forests, gradient boosting), the feature_importances_ attribute is available after training:

from sklearn import datasets, ensemble, model_selection

X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y)

model = ensemble.RandomForestClassifier()
model.fit(X_train, y_train)

fi = model.feature_importances_
from sklearn import

imports the required modules from scikit-learn

load_iris

loads the Iris dataset
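A quick sketch of what load_iris returns with return_X_y=True (the shapes below are a fixed property of the Iris dataset):

```python
from sklearn import datasets

# return_X_y=True gives the data matrix X and the label vector y directly
X, y = datasets.load_iris(return_X_y=True)

print(X.shape)  # (150, 4) -- 150 samples, 4 features
print(y.shape)  # (150,)   -- one class label per sample
```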

model_selection.train_test_split

splits the given X and y datasets into train (75% of samples by default) and test (25% by default) subsets
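A minimal sketch of the split sizes; test_size=0.25 reproduces the default, and random_state is optional, added here only to make the split reproducible:

```python
from sklearn import datasets, model_selection

X, y = datasets.load_iris(return_X_y=True)

# test_size=0.25 matches the default 25% test share;
# 25% of 150 samples rounds up to 38, leaving 112 for training
X_train, X_test, y_train, y_test = model_selection.train_test_split(
    X, y, test_size=0.25, random_state=0
)

print(len(X_train), len(X_test))  # 112 38
```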

.RandomForestClassifier(

creates a random forest classification model

.fit(

trains the model on the given features and target values

.feature_importances_

returns the feature importances of the trained model


Usage example

from sklearn import datasets, ensemble, model_selection

X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = model_selection.train_test_split(X, y)

model = ensemble.RandomForestClassifier()
model.fit(X_train, y_train)

fi = model.feature_importances_
print(fi)
Output:
[0.11663411 0.03627566 0.39945207 0.44763817]
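The importances are returned in the same order as the input columns, so they can be paired with the dataset's feature names (a sketch; random_state is an optional addition here to make the fit reproducible):

```python
from sklearn import datasets, ensemble

data = datasets.load_iris()
model = ensemble.RandomForestClassifier(random_state=0)
model.fit(data.data, data.target)

# pair each importance score with its column name, highest first
ranked = sorted(
    zip(data.feature_names, model.feature_importances_),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, score in ranked:
    print(f"{name}: {score:.3f}")
```

For the Iris data the petal measurements typically dominate, matching the output above where the last two values are the largest.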