Just like we use the Normal Equation to find the optimal theta in Linear Regression, can we use a similar closed-form formula for Logistic Regression?

## 1 answer to this question.

Not really. Linear regression is essentially the only discriminative method with a closed-form solution (linear discriminant analysis and the Fisher discriminant are generative, and even they admit closed forms only because of the extreme simplicity of the distributions they fit).

So what makes the Normal Equation work for linear regression? When you differentiate the squared-error cost and set the derivatives to zero, the result is a system of linear equations, one equation per parameter, which can be solved directly via matrix inversion (or, better, a direct linear solve). When the logistic regression cost is differentiated, the resulting conditions are no longer linear: the cost is convex (so there is a global optimum), but the sigmoid makes the stationarity equations nonlinear in theta, and no closed-form solution is known. That is why iterative methods such as gradient descent or Newton's method are used instead.
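To make the contrast concrete, here is a minimal NumPy sketch (with synthetic data and a hand-picked learning rate and iteration count, both illustrative assumptions, not tuned values). The linear model is solved in one shot via the Normal Equation; the logistic model has to be fit iteratively because its gradient is nonlinear in theta.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 100  # number of training examples
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, 3))])  # bias column + 3 features

# --- Linear regression: closed form ---
# Differentiating the squared-error cost gives linear equations in theta,
# solved directly: theta = (X^T X)^{-1} X^T y
true_theta = np.array([1.0, 2.0, -1.0, 0.5])
y_lin = X @ true_theta + 0.1 * rng.normal(size=m)
theta_closed = np.linalg.solve(X.T @ X, X.T @ y_lin)  # Normal Equation

# --- Logistic regression: no closed form ---
# The gradient involves sigmoid(X @ theta), nonlinear in theta,
# so we must iterate (plain batch gradient descent here).
y_log = (y_lin > y_lin.mean()).astype(float)  # synthetic binary labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta = np.zeros(X.shape[1])
for _ in range(5000):
    grad = X.T @ (sigmoid(X @ theta) - y_log) / m
    theta -= 0.5 * grad  # learning rate 0.5 (assumed, not tuned)
```

Note that `np.linalg.solve(X.T @ X, X.T @ y)` is preferred over explicitly computing the inverse: it is cheaper and numerically more stable, while the logistic loop has no analogous one-step shortcut.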

## Can someone explain to me the difference between a cost function and the gradient descent equation in logistic regression?

when we train a model with data, ...READ MORE

## Can you give an example for isotonic regression?

Have a look at this one, It's ...READ MORE

## Why do we use gradient descent in linear regression?

An example you gave is one-dimensional, which ...READ MORE

## L2 regularization in Logistic regression vs NN

L2 Regularization is known as Ridge Regression. ...READ MORE

## Python code for basic linear regression

Hi @Dipti, you could try something like ...READ MORE

## Example of Logistic regression with python code

Have a look at this: import csv import ...READ MORE