I'm not able to understand the difference between the cost function and gradient descent. There are examples on the net where people compute the cost function, and there are other places where they skip it and just go with gradient descent.

What is the difference between the two, if any?
Feb 22, 2022 · 900 views

## 1 answer to this question.

The cost function is a way to evaluate the performance of a model or algorithm: if the predicted values differ a lot from the actual values, the cost is high, which indicates that the model is not performing well or not learning well from the data.
Gradient descent, on the other hand, is an optimization algorithm used to train machine learning models and neural networks by finding a local minimum of the cost function.
In short, the cost function measures how wrong the model is, while gradient descent is the procedure that searches for the parameters that minimize that cost.
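The relationship between the two can be sketched in a few lines. This is a minimal illustrative example (not from the answer above), assuming a one-parameter linear model `y = w * x` with a mean-squared-error cost: the cost function scores a candidate `w`, and gradient descent repeatedly steps `w` against the cost's gradient.

```python
# Minimal sketch: gradient descent minimizing an MSE cost function
# for a hypothetical one-parameter model y = w * x.

def cost(w, xs, ys):
    # Mean squared error: large when predictions differ a lot from targets.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient(w, xs, ys):
    # Derivative of the MSE cost with respect to w.
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def gradient_descent(xs, ys, w=0.0, lr=0.1, steps=100):
    # Repeatedly step opposite the gradient to reduce the cost.
    for _ in range(steps):
        w -= lr * gradient(w, xs, ys)
    return w

xs, ys = [1, 2, 3, 4], [2, 4, 6, 8]   # underlying relationship: y = 2x
w = gradient_descent(xs, ys)
print(round(w, 3))   # converges near 2.0, where the cost is near 0
```

Notice that `cost` is only ever *evaluated* (to monitor progress), while `gradient_descent` is what actually *changes* the parameter. That is why some tutorials skip computing the cost explicitly: the update rule only needs the gradient.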
