Machine Learning, 1st Edition

A Theoretical Approach

 
Balas Natarajan

Morgan Kaufmann

Print ISBN: 9781558601482
eBook ISBN: 9780080510538
Pages: 217

Formats: Hardcover; eBook (DRM-free PDF)

Description

This is the first comprehensive introduction to computational learning theory. The author's uniform presentation of fundamental results and their applications offers AI researchers a theoretical perspective on the problems they study. The book presents tools for the analysis of probabilistic models of learning, tools that crisply classify what is and is not efficiently learnable. After a general introduction to Valiant's PAC paradigm and the important notion of the Vapnik-Chervonenkis dimension, the author explores specific topics such as finite automata and neural networks. The presentation is intended for a broad audience; the author's ability to motivate and pace discussions for beginners has been praised by reviewers. Each chapter contains numerous examples and exercises, as well as a useful summary of important results. The book is an excellent introduction to the area, suitable either for a first course or as a component in general machine learning and advanced AI courses, and an important reference for AI researchers.
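To give a flavor of the kind of result the book surveys, the classical PAC sample-complexity bound of Blumer, Ehrenfeucht, Haussler, and Warmuth (1989) states that a sample size sufficient to learn any concept class of VC dimension d to accuracy ε with confidence 1 − δ is max((4/ε) log₂(2/δ), (8d/ε) log₂(13/ε)). The small calculator below is an illustrative sketch of that standard bound; the constants follow one common presentation and are not taken from the book itself:

```python
import math

def pac_sample_bound(d, epsilon, delta):
    """Sufficient sample size for PAC learning a concept class of
    VC dimension d to accuracy epsilon with confidence 1 - delta.

    Classical bound of Blumer, Ehrenfeucht, Haussler, and Warmuth
    (1989); exact constants vary across presentations and may differ
    from those in the book."""
    term1 = (4.0 / epsilon) * math.log2(2.0 / delta)
    term2 = (8.0 * d / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(term1, term2))

# More expressive classes (larger d) and tighter accuracy demands
# (smaller epsilon) both drive the required sample size up.
print(pac_sample_bound(d=3, epsilon=0.1, delta=0.05))
```

The key qualitative message, which the book develops rigorously, is that sample complexity grows only linearly in the VC dimension and in 1/ε (up to logarithmic factors), so finite VC dimension suffices for learnability from a polynomial-size sample.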

Machine Learning: A Theoretical Approach
Balas K. Natarajan

  • Chapter 1 Introduction
    • 1.1 Bibliographic Notes

  • Chapter 2 Learning Concepts on Countable Domains
    • 2.1 Preliminaries

    • 2.2 Sample Complexity

    • 2.3 Dimension and Learnability

    • 2.4 Learning Concepts with One-Sided Error

    • 2.5 Summary

    • 2.6 Appendix

    • 2.7 Exercises

    • 2.8 Bibliographic Notes

  • Chapter 3 Time Complexity of Concept Learning
    • 3.1 Preliminaries

    • 3.2 Polynomial-Time Learnability

    • 3.3 Occam's Razor

    • 3.4 One-Sided Error

    • 3.5 Hardness Results

    • 3.6 Summary

    • 3.7 Appendix
      • 3.7.1 Randomized Algorithms

      • 3.7.2 Chebyshev's Inequality

    • 3.8 Exercises

    • 3.9 Bibliographic Notes

  • Chapter 4 Learning Concepts on Uncountable Domains
    • 4.1 Preliminaries

    • 4.2 Uniform Convergence and Learnability

    • 4.3 Summary

    • 4.4 Appendix
      • 4.4.1 Measurability and Probability Distributions

      • 4.4.2 Bounds for the Binomial Distribution

    • 4.5 Exercises

  • Chapter 5 Learning Functions
    • 5.1 Learning Functions on Countable Domains
      • 5.1.1 Dimension and Learnability

      • 5.1.2 Time Complexity of Function Learning

    • 5.2 Learning Functions on Uncountable Domains

    • 5.3 Summary

    • 5.4 Exercises

    • 5.5 Bibliographic Notes

  • Chapter 6 Finite Automata
    • 6.1 Preliminaries

    • 6.2 A Modified Framework

    • 6.3 Summary

    • 6.4 Exercises

    • 6.5 Bibliographic Notes

  • Chapter 7 Neural Networks
    • 7.1 Preliminaries

    • 7.2 Bounded-Precision Networks

    • 7.3 Efficiency Issues

    • 7.4 Summary

    • 7.5 Appendix
      • 7.5.1 Hyperplanes and Half-Spaces

    • 7.6 Exercises

    • 7.7 Bibliographic Notes

  • Chapter 8 Generalizing the Learning Model
    • 8.1 Preliminaries

    • 8.2 Sample Complexity

    • 8.3 Time Complexity

    • 8.4 Prediction
      • 8.4.1 Hardness Results

    • 8.5 Boosting
      • 8.5.1 Confidence Boosting

      • 8.5.2 Precision Boosting

    • 8.6 Summary

    • 8.7 Exercises

    • 8.8 Bibliographic Notes

  • Chapter 9 Conclusion
    • 9.1 The Paradigm

    • 9.2 Recent and Future Directions

    • 9.3 An AI Perspective

  • Index
 
 