
EbookNice.com

Most ebook files are in PDF format, so you can easily read them with common software such as Foxit Reader, or directly in the Google Chrome browser.
Some ebook files are released by publishers in other formats such as .azw, .mobi, .epub, .fb2, etc. You may need to install specific software, such as Calibre, to read these formats on mobile or PC.

Please read the tutorial at this link: https://ebooknice.com/page/post?id=faq


We offer FREE conversion to the popular formats you request; however, this may take some time. Therefore, right after payment, please email us, and we will try to provide the service as quickly as possible.


For some exceptional file formats or broken links (if any), please refrain from opening any disputes. Instead, email us first, and we will try to assist within a maximum of 6 hours.

EbookNice Team

(Ebook) Deep Learning with Python 2nd Edition by François Chollet ISBN 9781617294433 1617294438

  • SKU: EBN-48506484
$32 (list price $40, 20% off)

Status: Available

Instant download of (Ebook) Deep Learning with Python after payment.
Authors: François Chollet
Pages: 432
Year: 2018
Edition: 1
Publisher: Manning Publications
Language: English
File Size: 12.71 MB
Format: PDF
ISBNs: 9781617294433, 1617294438
Categories: Ebooks

Product description

(Ebook) Deep Learning with Python 2nd Edition by François Chollet ISBN 9781617294433 1617294438

(Ebook) Deep Learning with Python 2nd Edition by François Chollet - Ebook PDF, instant download/delivery. ISBNs: 9781617294433, 1617294438
Full download of (Ebook) Deep Learning with Python 2nd Edition is available after payment.

Product details:

ISBN 10: 1617294438
ISBN 13: 9781617294433
Author: François Chollet

Unlock the groundbreaking advances of deep learning with this extensively revised edition of the bestselling original. Learn directly from the creator of Keras and master practical Python deep learning techniques that are easy to apply in the real world.

In Deep Learning with Python, Second Edition you will learn:
• Deep learning from first principles
• Image classification and image segmentation
• Timeseries forecasting
• Text classification and machine translation
• Text generation, neural style transfer, and image generation

Deep Learning with Python has taught thousands of readers how to put the full capabilities of deep learning into action. This extensively revised second edition introduces deep learning using Python and Keras, and is loaded with insights for both novice and experienced ML practitioners. You'll learn practical techniques that are easy to apply in the real world, and important theory for perfecting neural networks. Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
Recent innovations in deep learning unlock exciting new software capabilities like automated language translation, image recognition, and more. Deep learning is becoming essential knowledge for every software developer, and modern tools like Keras and TensorFlow put it within your reach, even if you have no background in mathematics or data science.

About the book
Deep Learning with Python, Second Edition introduces the field of deep learning using Python and the powerful Keras library. In this new edition, Keras creator François Chollet offers insights for both novice and experienced machine learning practitioners. As you move through this book, you'll build your understanding through intuitive explanations, crisp illustrations, and clear examples. You'll pick up the skills to start developing deep-learning applications.

What's inside
• Deep learning from first principles
• Image classification and image segmentation
• Time series forecasting
• Text classification and machine translation
• Text generation, neural style transfer, and image generation

About the reader
For readers with intermediate Python skills. No previous experience with Keras, TensorFlow, or machine learning is required.

About the author
François Chollet is a software engineer at Google and creator of the Keras deep-learning library.

Table of Contents
1 What is deep learning?
2 The mathematical building blocks of neural networks
3 Introduction to Keras and TensorFlow
4 Getting started with neural networks: Classification and regression
5 Fundamentals of machine learning
6 The universal workflow of machine learning
7 Working with Keras: A deep dive
8 Introduction to deep learning for computer vision
9 Advanced deep learning for computer vision
10 Deep learning for timeseries
11 Deep learning for text
12 Generative deep learning
13 Best practices for the real world
14 Conclusions
 

(Ebook) Deep Learning with Python 2nd Edition Table of contents:

1 What is deep learning?

1.1 Artificial intelligence, machine learning, and deep learning

1.1.1 Artificial intelligence

1.1.2 Machine learning

1.1.3 Learning rules and representations from data

1.1.4 The “deep” in “deep learning”

1.1.5 Understanding how deep learning works, in three figures

1.1.6 What deep learning has achieved so far

1.1.7 Don’t believe the short-term hype

1.1.8 The promise of AI

1.2 Before deep learning: A brief history of machine learning

1.2.1 Probabilistic modeling

1.2.2 Early neural networks

1.2.3 Kernel methods

1.2.4 Decision trees, random forests, and gradient boosting machines

1.2.5 Back to neural networks

1.2.6 What makes deep learning different

1.2.7 The modern machine learning landscape

1.3 Why deep learning? Why now?

1.3.1 Hardware

1.3.2 Data

1.3.3 Algorithms

1.3.4 A new wave of investment

1.3.5 The democratization of deep learning

1.3.6 Will it last?

2 The mathematical building blocks of neural networks

2.1 A first look at a neural network

2.2 Data representations for neural networks

2.2.1 Scalars (rank-0 tensors)

2.2.2 Vectors (rank-1 tensors)

2.2.3 Matrices (rank-2 tensors)

2.2.4 Rank-3 and higher-rank tensors

2.2.5 Key attributes

2.2.6 Manipulating tensors in NumPy

2.2.7 The notion of data batches

2.2.8 Real-world examples of data tensors

2.2.9 Vector data

2.2.10 Timeseries data or sequence data

2.2.11 Image data

2.2.12 Video data

2.3 The gears of neural networks: Tensor operations

2.3.1 Element-wise operations

2.3.2 Broadcasting

2.3.3 Tensor product

2.3.4 Tensor reshaping

2.3.5 Geometric interpretation of tensor operations

2.3.6 A geometric interpretation of deep learning

2.4 The engine of neural networks: Gradient-based optimization

2.4.1 What’s a derivative?

2.4.2 Derivative of a tensor operation: The gradient

2.4.3 Stochastic gradient descent

2.4.4 Chaining derivatives: The Backpropagation algorithm

2.5 Looking back at our first example

2.5.1 Reimplementing our first example from scratch in TensorFlow

2.5.2 Running one training step

2.5.3 The full training loop

2.5.4 Evaluating the model

Summary

3 Introduction to Keras and TensorFlow

3.1 What’s TensorFlow?

3.2 What’s Keras?

3.3 Keras and TensorFlow: A brief history

3.4 Setting up a deep learning workspace

3.4.1 Jupyter notebooks: The preferred way to run deep learning experiments

3.4.2 Using Colaboratory

3.5 First steps with TensorFlow

3.5.1 Constant tensors and variables

3.5.2 Tensor operations: Doing math in TensorFlow

3.5.3 A second look at the GradientTape API

3.5.4 An end-to-end example: A linear classifier in pure TensorFlow

3.6 Anatomy of a neural network: Understanding core Keras APIs

3.6.1 Layers: The building blocks of deep learning

3.6.2 From layers to models

3.6.3 The “compile” step: Configuring the learning process

3.6.4 Picking a loss function

3.6.5 Understanding the fit() method

3.6.6 Monitoring loss and metrics on validation data

3.6.7 Inference: Using a model after training

Summary

4 Getting started with neural networks: Classification and regression

4.1 Classifying movie reviews: A binary classification example

4.1.1 The IMDB dataset

4.1.2 Preparing the data

4.1.3 Building your model

4.1.4 Validating your approach

4.1.5 Using a trained model to generate predictions on new data

4.1.6 Further experiments

4.1.7 Wrapping up

4.2 Classifying newswires: A multiclass classification example

4.2.1 The Reuters dataset

4.2.2 Preparing the data

4.2.3 Building your model

4.2.4 Validating your approach

4.2.5 Generating predictions on new data

4.2.6 A different way to handle the labels and the loss

4.2.7 The importance of having sufficiently large intermediate layers

4.2.8 Further experiments

4.2.9 Wrapping up

4.3 Predicting house prices: A regression example

4.3.1 The Boston housing price dataset

4.3.2 Preparing the data

4.3.3 Building your model

4.3.4 Validating your approach using K-fold validation

4.3.5 Generating predictions on new data

4.3.6 Wrapping up

Summary

5 Fundamentals of machine learning

5.1 Generalization: The goal of machine learning

5.1.1 Underfitting and overfitting

5.1.2 The nature of generalization in deep learning

5.2 Evaluating machine learning models

5.2.1 Training, validation, and test sets

5.2.2 Beating a common-sense baseline

5.2.3 Things to keep in mind about model evaluation

5.3 Improving model fit

5.3.1 Tuning key gradient descent parameters

5.3.2 Leveraging better architecture priors

5.3.3 Increasing model capacity

5.4 Improving generalization

5.4.1 Dataset curation

5.4.2 Feature engineering

5.4.3 Using early stopping

5.4.4 Regularizing your model

Summary

6 The universal workflow of machine learning

6.1 Define the task

6.1.1 Frame the problem

6.1.2 Collect a dataset

6.1.3 Understand your data

6.1.4 Choose a measure of success

6.2 Develop a model

6.2.1 Prepare the data

6.2.2 Choose an evaluation protocol

6.2.3 Beat a baseline

6.2.4 Scale up: Develop a model that overfits

6.2.5 Regularize and tune your model

6.3 Deploy the model

6.3.1 Explain your work to stakeholders and set expectations

6.3.2 Ship an inference model

6.3.3 Monitor your model in the wild

6.3.4 Maintain your model

Summary

7 Working with Keras: A deep dive

7.1 A spectrum of workflows

7.2 Different ways to build Keras models

7.2.1 The Sequential model

7.2.2 The Functional API

7.2.3 Subclassing the Model class

7.2.4 Mixing and matching different components

7.2.5 Remember: Use the right tool for the job

7.3 Using built-in training and evaluation loops

7.3.1 Writing your own metrics

7.3.2 Using callbacks

7.3.3 Writing your own callbacks

7.3.4 Monitoring and visualization with TensorBoard

7.4 Writing your own training and evaluation loops

7.4.1 Training versus inference

7.4.2 Low-level usage of metrics

7.4.3 A complete training and evaluation loop

7.4.4 Make it fast with tf.function

7.4.5 Leveraging fit() with a custom training loop

Summary

8 Introduction to deep learning for computer vision

8.1 Introduction to convnets

8.1.1 The convolution operation

8.1.2 The max-pooling operation

8.2 Training a convnet from scratch on a small dataset

8.2.1 The relevance of deep learning for small-data problems

8.2.2 Downloading the data

8.2.3 Building the model

8.2.4 Data preprocessing

8.2.5 Using data augmentation

8.3 Leveraging a pretrained model

8.3.1 Feature extraction with a pretrained model

8.3.2 Fine-tuning a pretrained model

Summary

9 Advanced deep learning for computer vision

9.1 Three essential computer vision tasks

9.2 An image segmentation example

9.3 Modern convnet architecture patterns

9.3.1 Modularity, hierarchy, and reuse

9.3.2 Residual connections

9.3.3 Batch normalization

9.3.4 Depthwise separable convolutions

9.3.5 Putting it together: A mini Xception-like model

9.4 Interpreting what convnets learn

9.4.1 Visualizing intermediate activations

9.4.2 Visualizing convnet filters

9.4.3 Visualizing heatmaps of class activation

Summary

10 Deep learning for timeseries

10.1 Different kinds of timeseries tasks

10.2 A temperature-forecasting example

10.2.1 Preparing the data

10.2.2 A common-sense, non-machine learning baseline

10.2.3 Let’s try a basic machine learning model

10.2.4 Let’s try a 1D convolutional model

10.2.5 A first recurrent baseline

10.3 Understanding recurrent neural networks

10.3.1 A recurrent layer in Keras

10.4 Advanced use of recurrent neural networks

10.4.1 Using recurrent dropout to fight overfitting

10.4.2 Stacking recurrent layers

10.4.3 Using bidirectional RNNs

10.4.4 Going even further

Summary

11 Deep learning for text

11.1 Natural language processing: The bird’s eye view

11.2 Preparing text data

11.2.1 Text standardization

11.2.2 Text splitting (tokenization)

11.2.3 Vocabulary indexing

11.2.4 Using the TextVectorization layer

11.3 Two approaches for representing groups of words: Sets and sequences

11.3.1 Preparing the IMDB movie reviews data

11.3.2 Processing words as a set: The bag-of-words approach

11.3.3 Processing words as a sequence: The sequence model approach

11.4 The Transformer architecture

11.4.1 Understanding self-attention

11.4.2 Multi-head attention

11.4.3 The Transformer encoder

11.4.4 When to use sequence models over bag-of-words models

11.5 Beyond text classification: Sequence-to-sequence learning

11.5.1 A machine translation example

11.5.2 Sequence-to-sequence learning with RNNs

11.5.3 Sequence-to-sequence learning with Transformer

Summary

12 Generative deep learning

12.1 Text generation

12.1.1 A brief history of generative deep learning for sequence generation

12.1.2 How do you generate sequence data?

12.1.3 The importance of the sampling strategy

12.1.4 Implementing text generation with Keras

12.1.5 A text-generation callback with variable-temperature sampling

12.1.6 Wrapping up

12.2 DeepDream

12.2.1 Implementing DeepDream in Keras

12.2.2 Wrapping up

12.3 Neural style transfer

12.3.1 The content loss

12.3.2 The style loss

12.3.3 Neural style transfer in Keras

12.3.4 Wrapping up

12.4 Generating images with variational autoencoders

12.4.1 Sampling from latent spaces of images

12.4.2 Concept vectors for image editing

12.4.3 Variational autoencoders

12.4.4 Implementing a VAE with Keras

12.4.5 Wrapping up

12.5 Introduction to generative adversarial networks

12.5.1 A schematic GAN implementation

12.5.2 A bag of tricks

12.5.3 Getting our hands on the CelebA dataset

12.5.4 The discriminator

12.5.5 The generator

12.5.6 The adversarial network

12.5.7 Wrapping up

Summary

13 Best practices for the real world

13.1 Getting the most out of your models

13.1.1 Hyperparameter optimization

13.1.2 Model ensembling

13.2 Scaling up model training

13.2.1 Speeding up training on GPU with mixed precision

13.2.2 Multi-GPU training

13.2.3 TPU training

Summary

14 Conclusions

14.1 Key concepts in review

14.1.1 Various approaches to AI

14.1.2 What makes deep learning special within the field of machine learning

14.1.3 How to think about deep learning

14.1.4 Key enabling technologies

14.1.5 The universal machine learning workflow

14.1.6 Key network architectures

14.1.7 The space of possibilities

14.2 The limitations of deep learning

14.2.1 The risk of anthropomorphizing machine learning models

14.2.2 Automatons vs. intelligent agents

14.2.3 Local generalization vs. extreme generalization

14.2.4 The purpose of intelligence

14.2.5 Climbing the spectrum of generalization

14.3 Setting the course toward greater generality in AI

14.3.1 On the importance of setting the right objective: The shortcut rule

14.3.2 A new target

14.4 Implementing intelligence: The missing ingredients

14.4.1 Intelligence as sensitivity to abstract analogies

14.4.2 The two poles of abstraction

14.4.3 The missing half of the picture

14.5 The future of deep learning

14.5.1 Models as programs

14.5.2 Blending together deep learning and program synthesis

14.5.3 Lifelong learning and modular subroutine reuse

14.5.4 The long-term vision

14.6 Staying up to date in a fast-moving field

14.6.1 Practice on real-world problems using Kaggle

14.6.2 Read about the latest developments on arXiv

14.6.3 Explore the Keras ecosystem
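
As a taste of the hands-on workflow covered throughout these chapters, below is a minimal Keras training sketch. It is an illustration only, not code reproduced from the book, and it assumes TensorFlow 2.x is installed: it builds a small dense network for the MNIST digits, configures it with compile(), and trains it with fit().

# Illustrative sketch only, not excerpted from the book: a minimal Keras
# classifier for the MNIST digits, showing the build / compile / fit workflow.
# Assumes TensorFlow 2.x is installed (pip install tensorflow).
from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST, flatten the 28x28 images, and scale pixel values to [0, 1].
(train_images, train_labels), (test_images, test_labels) = keras.datasets.mnist.load_data()
train_images = train_images.reshape((60000, 28 * 28)).astype("float32") / 255
test_images = test_images.reshape((10000, 28 * 28)).astype("float32") / 255

# A small fully connected network with a softmax output over the 10 digit classes.
model = keras.Sequential([
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# The compile step configures the optimizer, loss, and metrics.
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# fit() runs the training loop; evaluate() reports accuracy on held-out data.
model.fit(train_images, train_labels, epochs=5, batch_size=128)
test_loss, test_acc = model.evaluate(test_images, test_labels)
print(f"Test accuracy: {test_acc:.3f}")

The same Sequential / compile / fit pattern extends to the convolutional, recurrent, and Transformer models treated in the later chapters.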

People also search for (Ebook) Deep Learning with Python 2nd Edition:

deep learning with python github
deep learning with python 3rd
deep learning with python amazon
deep learning with python and pytorch
deep learning with python audiobook

Tags: François Chollet, Deep Learning, Python

*Free conversion into popular formats such as PDF, DOCX, DOC, AZW, EPUB, and MOBI after payment.
