Summary

Well done for getting through all of that! Let's recap what we covered:

  • Supervised learning is a kind of learning by example. A model makes predictions, the predictions are compared to expected labels, and the model is then updated to produce better results.
  • A cost function is a mathematical way to describe what we want a model to learn. Cost functions return large values when a model isn't making good predictions, and small values when it's performing well.
  • Gradient descent is an optimization algorithm. It's a way of calculating how to improve a model, given a cost function and some data.
  • Step size (learning rate) controls how large each update is, and so affects both how quickly and how reliably gradient descent converges: too small and training is slow, too large and it can overshoot the minimum.
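
The short sketch below ties these ideas together. It's a minimal illustration, assuming a simple linear model, a mean squared error cost, and made-up toy data; none of these specifics come from the module itself, they just show how the cost function, gradient descent, and learning rate interact.

```python
import numpy as np

# Toy data (assumed for illustration): inputs x and expected labels y.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])  # roughly follows y = 2x + 1

def predict(weight, bias, x):
    """A simple linear model: prediction = weight * x + bias."""
    return weight * x + bias

def cost(weight, bias, x, y):
    """Mean squared error: large when predictions are poor, small when they're good."""
    errors = predict(weight, bias, x) - y
    return np.mean(errors ** 2)

# Gradient descent: repeatedly nudge the parameters in the direction that
# reduces the cost, scaled by the step size (learning rate).
weight, bias = 0.0, 0.0
learning_rate = 0.05

for step in range(500):
    errors = predict(weight, bias, x) - y
    # Gradients of the MSE cost with respect to weight and bias.
    grad_weight = 2 * np.mean(errors * x)
    grad_bias = 2 * np.mean(errors)
    weight -= learning_rate * grad_weight
    bias -= learning_rate * grad_bias

print(f"weight={weight:.2f}, bias={bias:.2f}, cost={cost(weight, bias, x, y):.4f}")
```

Running this drives the cost toward zero, with the weight and bias approaching 2 and 1. Try making `learning_rate` much smaller or larger to see training slow down or become unstable.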