What is the difference between loss and cost function? #400
Replies: 3 comments
-
The loss function measures the difference between the actual and predicted values for a single training example. In contrast, the cost function aggregates these differences over the entire dataset and is what is typically minimized during model optimization.
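For illustration, here is a minimal NumPy sketch of that distinction using squared error; the values are made up and the variable names are only illustrative:

```python
import numpy as np

# Illustrative targets and predictions (not from any specific dataset).
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

# Loss: error of each individual training example (squared error here).
per_example_loss = (y_true - y_pred) ** 2   # one value per example

# Cost: aggregate of the per-example losses over the whole dataset (MSE).
cost = per_example_loss.mean()

print(per_example_loss)  # [0.25 0.25 0.   1.  ]
print(cost)              # 0.375
```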
-
In other words, the loss function captures the difference between the actual and predicted values for a single record, whereas the cost function aggregates that difference over the entire training dataset. Commonly used loss functions include mean squared error and hinge loss.
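For concreteness, a small sketch of the hinge loss per example and its mean as the cost; the labels and scores below are assumed purely for illustration:

```python
import numpy as np

# Illustrative binary classification with labels in {-1, +1}.
y_true = np.array([1, -1, 1, -1])
scores = np.array([0.8, -0.3, -0.2, 0.9])   # raw model scores (assumed)

# Hinge loss per example: max(0, 1 - y * score).
per_example_hinge = np.maximum(0.0, 1.0 - y_true * scores)

# Cost: mean hinge loss over the dataset.
hinge_cost = per_example_hinge.mean()

print(per_example_hinge)  # [0.2 0.7 1.2 1.9]
print(hinge_cost)         # 1.0
```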
-
The loss function measures the error for a single data point.