Cover
Copyright Information
Dedication
Packt Upsell
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewers
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Code in Action
Conventions used
Get in touch
Reviews
Become an Adaptive Thinker
Technical requirements
How to be an adaptive thinker
Addressing real-life issues before coding a solution
Step 1 – MDP in natural language
Step 2 – the mathematical representation of the Bellman equation and MDP
From MDP to the Bellman equation
Step 3 – implementing the solution in Python
The lessons of reinforcement learning
How to use the outputs
Machine learning versus traditional applications
Summary
Questions
Further reading
Think like a Machine
Designing datasets – where the dream stops and the hard work begins
Designing datasets in natural language meetings
Using the McCulloch-Pitts neuron
The McCulloch-Pitts neuron
The architecture of Python TensorFlow
Logistic activation functions and classifiers
Overall architecture
Logistic classifier
Logistic function
Softmax
Apply Machine Thinking to a Human Problem
Determining what and how to measure
Convergence
Implicit convergence
Numerically controlled convergence
Applying machine thinking to a human problem
Evaluating a position in a chess game
Applying the evaluation and convergence process to a business problem
Using supervised learning to evaluate result quality
Become an Unconventional Innovator
The XOR limit of the original perceptron
XOR and linearly separable models
Linearly separable models
The XOR limit of a linear model such as the original perceptron
Building a feedforward neural network from scratch
Step 1 – defining a feedforward neural network
Step 2 – how two children solve the XOR problem every day
Implementing a vintage XOR solution in Python with an FNN and backpropagation
A simplified version of a cost function and gradient descent
Linear separability was achieved
Applying the FNN XOR solution to a case study to optimize subsets of data
Manage the Power of Machine Learning and Deep Learning
Building the architecture of an FNN with TensorFlow
Writing code using the data flow graph as an architectural roadmap
A data flow graph translated into source code
The input data layer
The hidden layer
The output layer
The cost or loss function
Gradient descent and backpropagation
Running the session
Checking linear separability
Using TensorBoard to design the architecture of your machine learning and deep learning solutions
Designing the architecture of the data flow graph
Displaying the data flow graph in TensorBoard
The final source code with TensorFlow and TensorBoard
Using TensorBoard in a corporate environment
Using TensorBoard to explain the concept of classifying customer products to a CEO
Will your views on the project survive this meeting?