Applied Deep Learning with Keras

Summary

In this chapter, we covered the linear algebra components and operations that pertain to machine learning. The components include scalars, vectors, matrices, and tensors. The operations applied to these components included addition, transposition, and multiplication, all of which are fundamental for understanding the underlying mathematics of ANNs.
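These components and operations can be sketched briefly with NumPy, where the objects differ only in rank (the number of dimensions); the specific values below are illustrative only:

```python
import numpy as np

# Scalars, vectors, matrices, and tensors differ only in rank
scalar = np.array(5.0)                       # rank 0
vector = np.array([1.0, 2.0, 3.0])           # rank 1
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])  # rank 2
tensor = np.ones((2, 2, 2))                  # rank 3

# Addition is element-wise between tensors of the same shape
print(matrix + matrix)  # [[2. 4.] [6. 8.]]

# Transposition swaps rows and columns
print(matrix.T)         # [[1. 3.] [2. 4.]]

# Matrix multiplication requires the inner dimensions to match
print(matrix @ matrix.T)
```

Matrix multiplication in particular underlies the weighted sums computed at each node of an ANN.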

We also learned some basics of the Keras package, including the mathematics that occurs at each node. We then revisited the model from the first chapter, in which we built a logistic regression model with scikit-learn to predict a target from the bank data; this time, we recreated it as an ANN using the Keras library and achieved a similar level of accuracy.
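The idea can be sketched as follows: a Keras model with a single sigmoid node is mathematically equivalent to logistic regression. The random features and target below are placeholders standing in for the bank data, which is not reproduced here:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Placeholder features and binary target standing in for the bank data
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 10)).astype("float32")
y = (X[:, 0] > 0).astype("float32")

# A single node with a sigmoid activation: the ANN equivalent
# of a logistic regression model
model = Sequential([Dense(1, activation="sigmoid", input_shape=(10,))])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=16, verbose=0)

loss, acc = model.evaluate(X, y, verbose=0)
print(f"accuracy: {acc:.2f}")
```

With 10 input features, the layer learns 10 weights plus 1 bias, exactly the parameters a logistic regression model would fit.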

The next chapters of this book build on the concepts learned here as we continue constructing ANNs with the Keras package. We will extend our ANNs beyond a single layer, creating models with multiple hidden layers and putting the "deep" into "deep learning". We will also tackle the issues of underfitting and overfitting as they relate to training ANN models.