
Book Description

Grasp the fundamental concepts of deep learning using TensorFlow in a hands-on manner

About This Book

  • Get first-hand experience of deep learning concepts and techniques with this easy-to-follow guide
  • Train different types of neural networks using TensorFlow for real-world problems in language processing, computer vision, transfer learning, and more
  • Designed for those who believe in the concept of 'learn by doing', this book is a perfect blend of theory and code examples

Who This Book Is For

This book targets data scientists and machine learning developers who wish to get started with deep learning. If you know what deep learning is but are not quite sure how to use it, this book will help you as well. An understanding of statistics and data science concepts is required, and some familiarity with Python programming will also be beneficial.

What You Will Learn

  • Understand the fundamentals of deep learning and how it is different from machine learning
  • Get familiar with TensorFlow, one of the most popular libraries for advanced machine learning
  • Increase the predictive power of your model using feature engineering
  • Understand the basics of deep learning by solving the MNIST digit classification problem
  • Demonstrate face generation based on the CelebA database, a promising application of generative models
  • Apply deep learning to other domains like language modeling, sentiment analysis, and machine translation

In Detail

Deep learning is a popular subset of machine learning that allows you to build complex models which deliver faster, more accurate predictions. This book is your companion as you take your first steps into the world of deep learning, with hands-on examples to boost your understanding of the topic.

This book starts with a quick overview of the essential data science and machine learning concepts required to get started with deep learning. It then introduces you to TensorFlow, one of the most widely used libraries for training deep learning models. You will work on your first deep learning problem by training a deep feed-forward neural network for digit classification, and then move on to tackle other real-world problems in computer vision, language processing, sentiment analysis, and more. Advanced deep learning models, such as generative adversarial networks, and their applications are also covered.

By the end of this book, you will have a solid understanding of all the essential concepts in deep learning. With the help of the examples and code provided in this book, you will be equipped to train your own deep learning models with more confidence.

Style and approach

A step-by-step guide filled with multiple examples to help you get started with data science and deep learning.

Table of Contents

  1. Title Page
  2. Copyright and Credits
    1. Deep Learning By Example
  3. Packt Upsell
    1. Why subscribe?
    2. PacktPub.com
  4. Contributors
    1. About the author
    2. About the reviewers
    3. Packt is searching for authors like you
  5. Preface
    1. Who this book is for
    2. What this book covers
    3. To get the most out of this book
      1. Download the example code files
      2. Download the color images
      3. Conventions used
    4. Get in touch
      1. Reviews
  6. Data Science - A Bird's Eye View
    1. Understanding data science by an example
    2. Design procedure of data science algorithms
      1. Data pre-processing
        1. Data cleaning
        2. Data pre-processing
      2. Feature selection
      3. Model selection
      4. Learning process
      5. Evaluating your model
    3. Getting to learn
      1. Challenges of learning
        1. Feature extraction – feature engineering
        2. Noise
        3. Overfitting
        4. Selection of a machine learning algorithm
        5. Prior knowledge
        6. Missing values
    4. Implementing the fish recognition/detection model
      1. Knowledge base/dataset
      2. Data analysis pre-processing
      3. Model building
        1. Model training and testing
        2. Fish recognition – all together
    5. Different learning types
      1. Supervised learning
      2. Unsupervised learning
      3. Semi-supervised learning
      4. Reinforcement learning
    6. Data size and industry needs
    7. Summary
  7. Data Modeling in Action - The Titanic Example
    1. Linear models for regression
      1. Motivation
      2. Advertising – a financial example
        1. Dependencies
        2. Importing data with pandas
        3. Understanding the advertising data
        4. Data analysis and visualization
        5. Simple regression model
          1. Learning model coefficients
          2. Interpreting model coefficients
          3. Using the model for prediction
    2. Linear models for classification
      1. Classification and logistic regression
    3. Titanic example – model building and training
      1. Data handling and visualization
      2. Data analysis – supervised machine learning
    4. Different types of errors
    5. Apparent (training set) error
    6. Generalization/true error
    7. Summary
  8. Feature Engineering and Model Complexity – The Titanic Example Revisited
    1. Feature engineering
      1. Types of feature engineering
        1. Feature selection
        2. Dimensionality reduction
        3. Feature construction
      2. Titanic example revisited
        1. Missing values
          1. Removing any sample with missing values in it
          2. Missing value imputing
          3. Assigning an average value
          4. Using a regression or another simple model to predict the values of missing variables
        2. Feature transformations
          1. Dummy features
          2. Factorizing
          3. Scaling
          4. Binning
        3. Derived features
          1. Name
          2. Cabin
          3. Ticket
        4. Interaction features
    2. The curse of dimensionality
      1. Avoiding the curse of dimensionality
    3. Titanic example revisited – all together
    4. Bias-variance decomposition
    5. Learning visibility
      1. Breaking the rule of thumb
    6. Summary
  9. Get Up and Running with TensorFlow
    1. TensorFlow installation
      1. TensorFlow GPU installation for Ubuntu 16.04
        1. Installing NVIDIA drivers and CUDA 8
        2. Installing TensorFlow
      2. TensorFlow CPU installation for Ubuntu 16.04
      3. TensorFlow CPU installation for macOS
      4. TensorFlow GPU/CPU installation for Windows
    2. The TensorFlow environment
    3. Computational graphs
    4. TensorFlow data types, variables, and placeholders
      1. Variables
      2. Placeholders
      3. Mathematical operations
    5. Getting output from TensorFlow
    6. TensorBoard – visualizing learning
    7. Summary
  10. TensorFlow in Action - Some Basic Examples
    1. Capacity of a single neuron
      1. Biological motivation and connections
    2. Activation functions
      1. Sigmoid
      2. Tanh
      3. ReLU
    3. Feed-forward neural network
    4. The need for multilayer networks
      1. Training our MLP – the backpropagation algorithm
      2. Step 1 – forward propagation
      3. Step 2 – backpropagation and weight update
    5. TensorFlow terminologies – recap
      1. Defining multidimensional arrays using TensorFlow
      2. Why tensors?
      3. Variables
      4. Placeholders
      5. Operations
    6. Linear regression model – building and training
      1. Linear regression with TensorFlow
    7. Logistic regression model – building and training
      1. Utilizing logistic regression in TensorFlow
        1. Why use placeholders?
        2. Set model weights and bias
        3. Logistic regression model
        4. Training
        5. Cost function
    8. Summary
  11. Deep Feed-forward Neural Networks - Implementing Digit Classification
    1. Hidden units and architecture design
    2. MNIST dataset analysis
      1. The MNIST data
    3. Digit classification – model building and training
      1. Data analysis
      2. Building the model
      3. Model training
    4. Summary
  12. Introduction to Convolutional Neural Networks
    1. The convolution operation
    2. Motivation
      1. Applications of CNNs
    3. Different layers of CNNs
      1. Input layer
      2. Convolution step
      3. Introducing non-linearity
      4. The pooling step
      5. Fully connected layer
        1. Logits layer
    4. CNN basic example – MNIST digit classification
      1. Building the model
        1. Cost function
        2. Performance measures
      2. Model training
    5. Summary
  13. Object Detection – CIFAR-10 Example
    1. Object detection
    2. CIFAR-10 – modeling, building, and training
      1. Used packages
      2. Loading the CIFAR-10 dataset
      3. Data analysis and preprocessing
      4. Building the network
      5. Model training
      6. Testing the model
    3. Summary
  14. Object Detection – Transfer Learning with CNNs
    1. Transfer learning
      1. The intuition behind TL
      2. Differences between traditional machine learning and TL
    2. CIFAR-10 object detection – revisited
      1. Solution outline
      2. Loading and exploring CIFAR-10
      3. Inception model transfer values
      4. Analysis of transfer values
      5. Model building and training
    3. Summary
  15. Recurrent-Type Neural Networks - Language Modeling
    1. The intuition behind RNNs
      1. Recurrent neural networks architectures
      2. Examples of RNNs
        1. Character-level language models
          1. Language model using Shakespeare data
      3. The vanishing gradient problem
      4. The problem of long-term dependencies
    2. LSTM networks
      1. Why does LSTM work?
    3. Implementation of the language model
      1. Mini-batch generation for training
      2. Building the model
        1. Stacked LSTMs
        2. Model architecture
        3. Inputs
        4. Building an LSTM cell
        5. RNN output
        6. Training loss
        7. Optimizer
        8. Building the network
        9. Model hyperparameters
      3. Training the model
        1. Saving checkpoints
        2. Generating text
    4. Summary
  16. Representation Learning - Implementing Word Embeddings
    1. Introduction to representation learning
    2. Word2Vec
      1. Building the Word2Vec model
    3. A practical example of the skip-gram architecture
    4. Skip-gram Word2Vec implementation
      1. Data analysis and pre-processing
      2. Building the model
      3. Training
    5. Summary
  17. Neural Sentiment Analysis
    1. General sentiment analysis architecture
      1. RNNs – sentiment analysis context
      2. Exploding and vanishing gradients - recap
    2. Sentiment analysis – model implementation
      1. Keras
      2. Data analysis and preprocessing
      3. Building the model
      4. Model training and results analysis
    3. Summary
  18. Autoencoders – Feature Extraction and Denoising
    1. Introduction to autoencoders
    2. Examples of autoencoders
    3. Autoencoder architectures
    4. Compressing the MNIST dataset
      1. The MNIST dataset
      2. Building the model
      3. Model training
    5. Convolutional autoencoder
      1. Dataset
      2. Building the model
      3. Model training
    6. Denoising autoencoders
      1. Building the model
      2. Model training
    7. Applications of autoencoders
      1. Image colorization
      2. More applications
    8. Summary
  19. Generative Adversarial Networks
    1. An intuitive introduction
    2. Simple implementation of GANs
      1. Model inputs
      2. Variable scope
      3. Leaky ReLU
      4. Generator
      5. Discriminator
      6. Building the GAN network
        1. Model hyperparameters
        2. Defining the generator and discriminator
        3. Discriminator and generator losses
        4. Optimizers
      7. Model training
        1. Generator samples from training
      8. Sampling from the generator
    3. Summary
  20. Face Generation and Handling Missing Labels
    1. Face generation
      1. Getting the data
      2. Exploring the data
      3. Building the model
        1. Model inputs
        2. Discriminator
        3. Generator
        4. Model losses
        5. Model optimizer
        6. Training the model
    2. Semi-supervised learning with Generative Adversarial Networks (GANs)
      1. Intuition
      2. Data analysis and preprocessing
      3. Building the model
        1. Model inputs
        2. Generator
        3. Discriminator
        4. Model losses
        5. Model optimizer
      4. Model training
    3. Summary
  21. Implementing Fish Recognition
    1. Code for fish recognition
  22. Other Books You May Enjoy
    1. Leave a review - let other readers know what you think