Cover Page
Title Page
Copyright and Credits
Hands-On Generative Adversarial Networks with Keras
About Packt
Why subscribe?
Packt.com
Foreword
Contributors
About the author
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Section 1: Introduction and Environment Setup
Deep Learning Basics and Environment Setup
Deep learning basics
Artificial Neural Networks (ANNs)
Parameter estimation
Backpropagation
Loss functions
L1 loss
L2 loss
Categorical crossentropy loss
Non-linearities
Sigmoid
Tanh
ReLU
A fully connected layer
The convolution layer
The max pooling layer
Deep learning environment setup
Installing Anaconda and Python
Setting up a virtual environment in Anaconda
Installing TensorFlow
Installing Keras
Installing data visualization and machine learning libraries
The matplotlib library
The Jupyter library
The scikit-learn library
NVIDIA's CUDA Toolkit and cuDNN
The deep learning environment test
Summary
Introduction to Generative Models
Discriminative and generative models compared
Comparing discriminative and generative models
Generative models
Autoregressive models
Variational autoencoders
Reversible flows
Generative adversarial networks
GANs – building blocks
The discriminator
The generator
Real and fake data
Random noise
Discriminator and generator loss
GANs – strengths and weaknesses
Section 2: Training GANs
Implementing Your First GAN
Technical requirements
Imports
Implementing a generator and discriminator
Generator
Discriminator
Auxiliary functions
Training your GAN
Further reading
Evaluating Your First GAN
The evaluation of GANs
Image quality
Image variety
Domain specifications
Qualitative methods
k-nearest neighbors
Mode analysis
Other methods
Quantitative methods
The Inception score
The Fréchet Inception Distance
Precision, recall, and the F1 score
GANs and the birthday paradox
Improving Your First GAN
Challenges in training GANs
Mode collapse and mode drop
Training instability
Sensitivity to hyperparameter initialization
Vanishing gradients
Tricks of the trade