Computational Learning Theory
Computational learning theory is a subject that has advanced rapidly in recent years. The authors concentrate on the probably approximately correct (PAC) model of learning and gradually develop the ideas of efficiency that are central to the subject. Finally, applications of the theory to artificial neural networks are considered. Many exercises are included throughout, and the list of references is extensive. The volume is relatively self-contained, as the necessary background material from logic, probability and complexity theory is included. It will therefore serve as an introduction to computational learning theory suitable for a broad spectrum of graduate students in theoretical computer science and mathematics.
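As a point of reference, the PAC criterion on which the book is built can be sketched as follows; this is the standard formulation, and the symbols below are chosen for illustration rather than taken from the text.

```latex
% PAC learning, stated informally (standard formulation; the symbols
% \varepsilon, \delta, D, c, h are illustrative, not the book's notation).
% An algorithm PAC-learns a concept class $C$ if, for every target concept
% $c \in C$, every distribution $D$ on the examples, and all accuracy and
% confidence parameters $\varepsilon, \delta \in (0,1)$, it outputs, from a
% sufficiently large sample of labelled examples drawn from $D$, a
% hypothesis $h$ satisfying
\[
  \Pr\!\left[\operatorname{er}_D(h) \le \varepsilon\right] \ge 1 - \delta,
  \qquad\text{where}\quad
  \operatorname{er}_D(h) = \Pr_{x \sim D}\!\left[h(x) \ne c(x)\right].
\]
```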
- Biggs is a world leader in the field
- All mathematical concepts are developed in an economical, self-contained way
Product details
February 1997
Paperback
ISBN: 9780521599221
172 pages
244 × 170 × 9 mm
0.29 kg
Available
Table of Contents
- 1. Concepts, hypotheses, learning algorithms
- 2. Boolean formulae and representations
- 3. Probabilistic learning
- 4. Consistent algorithms and learnability
- 5. Efficient learning I
- 6. Efficient learning II
- 7. The VC dimension
- 8. Learning and the VC dimension
- 9. VC dimension and efficient learning
- 10. Linear threshold networks.