Deep Learning Essentials

Backpropagation

Neural networks learn by measuring their error with a cost function and then updating the weights (parameters) to reduce that error. The gradient is the slope of the cost function with respect to a weight: it tells us how much the error changes when that weight changes. Backpropagation computes these gradients efficiently by applying the chain rule layer by layer, from the output back toward the input, so that every weight can be nudged in the direction that lowers the cost.
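As a minimal sketch (not code from the book), the chain-rule computation can be shown for a single sigmoid neuron trained with a mean-squared-error cost; the data, initial weight, and learning rate below are all illustrative assumptions:

```python
import numpy as np

# Toy data: map input 0 -> target 0 and input 1 -> target 1.
# These values are illustrative, chosen only to demonstrate the update rule.
X = np.array([[0.0], [1.0]])   # inputs, shape (2, 1)
y = np.array([[0.0], [1.0]])   # targets, shape (2, 1)

w = np.array([[0.1]])          # initial weight (arbitrary small value)
b = np.zeros((1,))             # initial bias
lr = 1.0                       # learning rate (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    # Forward pass: compute the prediction and the cost.
    z = X @ w + b
    a = sigmoid(z)
    cost = np.mean((a - y) ** 2)

    # Backward pass (chain rule): dC/dw = dC/da * da/dz * dz/dw.
    dC_da = 2 * (a - y) / len(X)   # derivative of mean squared error
    da_dz = a * (1 - a)            # derivative of the sigmoid
    delta = dC_da * da_dz
    grad_w = X.T @ delta           # dz/dw = x, summed over samples
    grad_b = delta.sum(axis=0)     # dz/db = 1

    # Gradient descent: step each parameter against its slope.
    w -= lr * grad_w
    b -= lr * grad_b
```

After training, the cost has dropped well below its initial value of 0.25, because each update moves the weights down the slope that the gradient describes. Real networks repeat exactly this pattern, just with many layers of such chain-rule factors multiplied together.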