We can use linear regression for binary classification, but its predictions are unbounded: it can output values greater than 1 or less than 0 even when the only possible classes are 1 and 0.
For two classes, linear regression on a 0/1 target produces the same decision boundary as LDA.
Logistic regression avoids this by passing the linear score through the logistic (sigmoid) function, which squashes any real input into (0, 1):

$$\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad z = \beta_0 + \beta^\top x$$

The model is fit by minimizing the log-loss (cross-entropy) cost over the $m$ training examples:

$$J(\beta) = -\frac{1}{m} \sum_{i=1}^{m} \left[\, y_i \log h_\beta(x_i) + (1 - y_i) \log\bigl(1 - h_\beta(x_i)\bigr) \,\right], \qquad h_\beta(x) = \sigma(\beta_0 + \beta^\top x)$$
Logistic regression can also handle multi-class classification (via the multinomial/softmax formulation), but in practice it is most often used for binary classification.
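The out-of-range behavior is easy to demonstrate. A minimal sketch on a made-up 1-D binary dataset (the data below is illustrative, not from this notebook): linear regression extrapolates past the 0/1 labels, while logistic regression's probabilities stay inside (0, 1).

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy 1-D binary problem: class 0 for small x, class 1 for large x
X = np.array([[0.0], [1.0], [2.0], [8.0], [9.0], [10.0]])
y = np.array([0, 0, 0, 1, 1, 1])

lin = LinearRegression().fit(X, y)
log = LogisticRegression().fit(X, y)

# Linear regression extrapolates outside [0, 1] for extreme inputs ...
lin_pred = lin.predict(np.array([[-5.0], [15.0]]))
# ... while logistic probabilities are always in (0, 1)
log_prob = log.predict_proba(np.array([[-5.0], [15.0]]))[:, 1]
```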
Imports¶
In [1]:
import pandas as pd
from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report
Data Prep¶
In [2]:
# using sklearn default dataset
iris = datasets.load_iris()
iris.keys()
Out[2]:
In [3]:
# Prepare data for training and testing
X = pd.DataFrame(iris['data'], columns=iris['feature_names'])
y = (pd.Series(iris['target'], name='species')
       .apply(lambda x: iris['target_names'][x]))
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=42)
Model Fitting¶
In [4]:
# Initialize the model; multinomial is the default multi-class scheme
# (the multi_class parameter is deprecated in recent scikit-learn versions)
model = LogisticRegression(solver='newton-cg')
# Model Fitting
model.fit(X_train, y_train)
# Predictions
y_predict = model.predict(X_test)
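Since the model's output comes from the logistic/softmax function, per-class probabilities can be inspected with `predict_proba`. A self-contained sketch repeating the data prep above (the raised `max_iter` is an assumption added here to guarantee convergence, not part of the original cell):

```python
import pandas as pd
from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

iris = datasets.load_iris()
X = pd.DataFrame(iris['data'], columns=iris['feature_names'])
y = (pd.Series(iris['target'], name='species')
       .apply(lambda i: iris['target_names'][i]))
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# One probability per class for each sample; each row sums to 1
proba = model.predict_proba(X_test.iloc[:3])
```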
Performance Check¶
In [5]:
accuracy_score(y_test, y_predict)
Out[5]:
In [6]:
print(classification_report(y_test, y_predict))
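Beyond accuracy and the classification report, a confusion matrix shows which species get mistaken for which. A self-contained sketch using the same split parameters as above (`max_iter` raised here as an assumption to ensure convergence):

```python
from sklearn import datasets
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

iris = datasets.load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris['data'], iris['target'], test_size=0.5, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# Rows are true classes, columns are predicted classes
cm = confusion_matrix(y_test, model.predict(X_test))
```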