Understanding machine learning : from theory to algorithms / Shai Shalev-Shwartz and Shai Ben-David

By: Shalev-Shwartz, Shai
Contributor(s): Ben-David, Shai
Publication details: New York, NY, USA ; India : Cambridge University Press, 2014 [reprinted 2022]
Edition: First South Asia edition, 2015
Description: xvi, 397 pages : illustrations ; 26 cm
ISBN:
  • 9781107057135 (hardback)
  • 1107057132 (hardback)
  • 9781107512825
Subject(s):
DDC classification:
  • 006.31 (DDC 23rd ed.)
Contents:
  • 1. Introduction
  • Part I. Foundations: 2. A gentle start; 3. A formal learning model; 4. Learning via uniform convergence; 5. The bias-complexity tradeoff; 6. The VC-dimension; 7. Non-uniform learnability; 8. The runtime of learning
  • Part II. From Theory to Algorithms: 9. Linear predictors; 10. Boosting; 11. Model selection and validation; 12. Convex learning problems; 13. Regularization and stability; 14. Stochastic gradient descent; 15. Support vector machines; 16. Kernel methods; 17. Multiclass, ranking, and complex prediction problems; 18. Decision trees; 19. Nearest neighbor; 20. Neural networks
  • Part III. Additional Learning Models: 21. Online learning; 22. Clustering; 23. Dimensionality reduction; 24. Generative models; 25. Feature selection and generation
  • Part IV. Advanced Theory: 26. Rademacher complexities; 27. Covering numbers; 28. Proof of the fundamental theorem of learning theory; 29. Multiclass learnability; 30. Compression bounds; 31. PAC-Bayes
  • Appendices: A. Technical lemmas; B. Measure concentration; C. Linear algebra
Summary: "Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics of the field, the book covers a wide array of central topics that have not been addressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for an advanced undergraduate or beginning graduate course, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics, and engineering"--
Average rating: 3.7 (3 votes)
Holdings
Item type | Current library                    | Home library                       | Call number | Copy no. | Status      | Date due   | Barcode
Book      | Ayesha Abed Library General Stacks | Ayesha Abed Library General Stacks | 006.31 SHA  | 1        | Available   |            | 3010040939
Book      | Ayesha Abed Library General Stacks | Ayesha Abed Library General Stacks | 006.31 SHA  | 2        | Checked out | 08/07/2024 | 3010040940
Book      | Ayesha Abed Library General Stacks | Ayesha Abed Library General Stacks | 006.31 SHA  | 3        | Checked out | 07/07/2024 | 3010040941
Book      | Ayesha Abed Library General Stacks | Ayesha Abed Library General Stacks | 006.31 SHA  | 4        | Checked out | 07/07/2024 | 3010040942
Book      | Ayesha Abed Library General Stacks | Ayesha Abed Library General Stacks | 006.31 SHA  | 5        | Checked out | 08/07/2024 | 3010040943
Total holds: 0

Includes bibliographical references (pages 385-393) and index.
