Advanced Deep Learning with R
by Bharatendra Rai
Title Page
Copyright and Credits
Advanced Deep Learning with R
About Packt
Why subscribe?
Contributors
About the author
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Section 1: Revisiting Deep Learning Basics
Revisiting Deep Learning Architecture and Techniques
Deep learning with R
Deep learning trend
Versions of key R packages used
Process of developing a deep network model
Preparing the data for a deep network model
Developing a deep learning model architecture
Compiling the model
Fitting the model
Assessing the model performance
Deep learning techniques with R and RStudio
Multi-class classification
Regression problems
Image classification
Convolutional neural networks
Autoencoders
Transfer learning
Generative adversarial networks
Deep network for text classification 
Recurrent neural networks 
Long short-term memory network
Convolutional recurrent networks
Tips, tricks, and best practices
Summary
Section 2: Deep Learning for Prediction and Classification
Deep Neural Networks for Multi-Class Classification
Cardiotocogram dataset
Dataset (medical)
Preparing the data for model building
Normalizing numeric variables
Partitioning the data
One-hot encoding
Creating and fitting a deep neural network model
Developing model architecture
Compiling the model
Fitting the model
Model evaluation and predictions
Loss and accuracy calculation
Confusion matrix
Performance optimization tips and best practices
Experimenting with an additional hidden layer
Experimenting with a higher number of units in the hidden layer
Experimenting using a deeper network with more units in the hidden layer
Experimenting by addressing the class imbalance problem
Saving and reloading a model
Summary
Deep Neural Networks for Regression
Understanding the Boston Housing dataset
Preparing the data
Visualizing the neural network
Data partitioning
Normalization
Creating and fitting a deep neural network model for regression
Calculating the total number of parameters
Compiling the model
Fitting the model
Model evaluation and prediction
Evaluation
Prediction
Improvements
Deeper network architecture
Results
Performance optimization tips and best practices
Log transformation on the output variable
Model performance
Summary
Section 3: Deep Learning for Computer Vision
Image Classification and Recognition
Handling image data
Data preparation
Resizing and reshaping
Training, validation, and test data
One-hot encoding
Creating and fitting the model
Developing the model architecture
Compiling the model
Fitting the model
Model evaluation and prediction
Loss, accuracy, and confusion matrices for training data
Prediction probabilities for training data
Loss, accuracy, and confusion matrices for test data
Prediction probabilities for test data
Performance optimization tips and best practices
Deeper networks
Results
Summary
Image Classification Using Convolutional Neural Networks
Data preparation
Fashion-MNIST data
Train and test data
Reshaping and resizing
One-hot encoding
Layers in the convolutional neural networks
Model architecture and related calculations
Compiling the model
Fitting the model
Accuracy and loss
Model evaluation and prediction
Training data
Test data
20 fashion items from the internet
Performance optimization tips and best practices
Image modification
Changes to the architecture
Summary
Applying Autoencoder Neural Networks Using Keras
Types of autoencoders
Dimension reduction autoencoders
MNIST fashion data
Encoder model
Decoder model
Autoencoder model
Compiling and fitting the model
Reconstructed images
Denoising autoencoders
MNIST data
Data preparation
Adding noise
Encoder model
Decoder model
Autoencoder model
Fitting the model
Image reconstruction
Image correction
Images that need correction
Clean images
Encoder model
Decoder model
Compiling and fitting the model
Reconstructing images from training data
Reconstructing images from new data
Summary
Image Classification for Small Data Using Transfer Learning
Using a pretrained model to identify an image
Reading an image
Preprocessing the input
Top five categories
Working with the CIFAR10 dataset
Sample images
Preprocessing and prediction
Image classification with CNN
Data preparation
CNN model
Model performance
Performance assessment with training data
Performance assessment with test data
Classifying images using the pretrained ResNet50 model
Model architecture
Freezing pretrained network weights
Fitting the model
Model evaluation and prediction
Loss, accuracy, and confusion matrix with the training data
Loss, accuracy, and confusion matrix with the test data
Performance optimization tips and best practices
Experimenting with the Adam optimizer
Hyperparameter tuning
Experimenting with VGG16 as a pretrained network
Summary
Creating New Images Using Generative Adversarial Networks
Generative adversarial network overview
Processing MNIST image data
Digit five from the training data
Data processing
Developing the generator network
Network architecture
Summary of the generator network
Developing the discriminator network
Architecture
Summary of the discriminator network
Training the network
Initial setup for saving fake images and loss values
Training process
Reviewing results
Discriminator and GAN losses
Fake images
Performance optimization tips and best practices
Changes in the generator and discriminator network
Impact of these changes on the results
Generating a handwritten image of digit eight
Summary
Section 4: Deep Learning for Natural Language Processing
Deep Networks for Text Classification
Text datasets
The UCI Machine Learning Repository
Text data within Keras
Preparing the data for model building
Tokenization
Converting text into sequences of integers
Padding and truncation
Developing a tweet sentiment classification model
Developing deep neural networks
Obtaining IMDb movie review data
Building a classification model
Compiling the model
Fitting the model
Model evaluation and prediction
Evaluation using training data
Evaluation using test data
Performance optimization tips and best practices
Experimenting with the maximum sequence length and the optimizer
Summary
Text Classification Using Recurrent Neural Networks
Preparing data for model building
Padding sequences
Developing a recurrent neural network model
Calculation of parameters
Compiling the model
Fitting the model
Accuracy and loss
Model evaluation and prediction
Training the data
Testing the data
Performance optimization tips and best practices
Number of units in the simple RNN layer
Using different activation functions in the simple RNN layer
Adding more recurrent layers 
The maximum length for padding sequences
Summary
Text Classification Using Long Short-Term Memory Networks
Why do we use LSTM networks?
Preparing text data for model building
Creating a long short-term memory network model
LSTM network architecture
Compiling the LSTM network model
Fitting the LSTM model
Loss and accuracy plot
Evaluating model performance 
Model evaluation with train data
Model evaluation with test data
Performance optimization tips and best practices
Experimenting with the Adam optimizer
Experimenting with the LSTM network having an additional layer
Experimenting with a bidirectional LSTM layer
Summary
Text Classification Using Convolutional Recurrent Neural Networks
Working with the reuter_50_50 dataset
Reading the training data
Reading the test data
Preparing the data for model building
Tokenization and converting text into a sequence of integers
Changing labels into integers
Padding and truncation of sequences
Data partitioning
One-hot encoding the labels
Developing the model architecture
Compiling and fitting the model
Compiling the model
Fitting the model
Evaluating the model and predicting classes
Model evaluation with training data
Model evaluation with test data
Performance optimization tips and best practices
Experimenting with reduced batch size
Experimenting with batch size, kernel size, and filters in CNNs
Summary
Section 5: The Road Ahead
Tips, Tricks, and the Road Ahead
TensorBoard for training performance visualization
Visualizing deep network models with LIME
Visualizing model training with tfruns
Early stopping of network training
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think