Neural Networks: A Comprehensive Foundation, 2nd Edition (Paperback)

ISBN-10: 8120323734

ISBN-13: 9788120323735


Format: Paperback

Condition: Good

$31.19

Book Overview

Fluid and authoritative, this well-organized book represents the first comprehensive treatment of neural networks and learning machines from an engineering perspective, providing extensive,...

Customer Reviews

5 ratings

I wish all books were like this.

Extremely concise, extremely complete. Every new page has a new concept or method. In the first chapter, I knew more than I did after reading two other books I bought on the subject. I would suggest, however, not using this as an introduction. It's a bit more rigorous mathematically than some others, so use it if you understand the concepts first. It will shed new light on the concepts you already know, but it will probably fail at teaching them to you from the ground up. I suggest this for the experienced Artificial Intelligence experimenter. And for the love of god, use Perl for your test programs! Writing C++ classes for artificial intelligence is wholly impractical!

Well suited for teachers and undergraduates...

There isn't much more to say about the book: if you have a strong background in mathematical analysis and you love neural networks, then you have found your book. It was hard at the beginning, so I had to brush up on my mathematical analysis to let the "puzzle" slowly take shape. All the algorithms are introduced with clear and rigorous mathematical theory. I think it's well suited for teachers and undergraduates.

Good detail with rigorous mathematics

This book, excellent for self-study and for use as a textbook, covers a subject that has had enormous impact in science and technology. One can say with confidence that neural networks will increase in importance in the decades ahead, especially in the field of artificial intelligence. The book is a comprehensive overview, and it does take some time to read and digest, but it is worth the effort, as there are many applications of neural networks and the author discusses them in detail.

In the first part of the book, the author introduces neural networks and the modeling of brain function. A good overview of the modeling of neural networks and knowledge representation is given, along with a discussion of how they are used in artificial intelligence. Ideas from computational learning theory are introduced, as well as the important concept of the Vapnik-Chervonenkis (VC) dimension. The VC dimension is defined in this book in terms of the maximum number of training examples that a machine can learn without errors. The author shows it to be a useful parameter, one that allows the reader to avoid the difficult problem of finding an exact formula for the growth function of a hypothesis space.

In the next part of the book, the author discusses learning machines that have a teacher. The single-layer perceptron is introduced and shown to have an error-correction learning algorithm that is convergent. There is a fine discussion of optimization techniques and Bayes classifiers in this part. The least-mean-square algorithm is generalized to the back-propagation algorithm in order to train multilayer perceptrons, along with a discussion of how to optimize its performance using heuristics. The author gives a detailed discussion of the limitations of back-propagation learning. In addition, radial-basis function networks are introduced. Supervised learning is viewed as an ill-posed hypersurface reconstruction problem, which is then solved using regularization methods. Support vector machines are introduced as neural networks that arise from statistical learning theory considerations via the VC dimension. A summary is given of the differences between the various approaches to neural network learning machines. Committee machines, based on the divide-and-conquer strategy, are also treated. Here the strategy is to divide the learning process among a number of experts, with the expectation that the collective efforts of these experts will arrive at the solution more efficiently.

The next part of the book introduces unsupervised learning machines. The ability of machines to discover useful information, such as patterns or features, in the input data is taken as an acid test for real intelligence. Hebbian learning via principal components analysis is discussed, along with competitive learning via self-organizing maps. The author uses computer simulations to illustrate the behavior of systems of neurons. Vector quantization is brought in as another supervised learning technique to fine
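The review above mentions the single-layer perceptron's error-correction learning rule and its convergence on separable data. As a minimal illustration, and not code taken from the book, the following Python sketch applies that rule with a hard-limiter output; the function name train_perceptron, the learning rate, and the toy dataset are hypothetical choices made only for this example.

# Minimal sketch of perceptron error-correction learning:
# w <- w + eta * (d - y) * x, with a hard-limiter (sign) output.
# The toy dataset, learning rate, and epoch count are illustrative assumptions.
import numpy as np

def train_perceptron(X, d, eta=0.1, epochs=100):
    """Train a single-layer perceptron with a hard-limiter output."""
    X = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend bias input x0 = 1
    w = np.zeros(X.shape[1])                      # weights, bias first
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, d):
            y = 1.0 if w @ x >= 0 else -1.0       # hard-limited output
            if y != target:
                w += eta * (target - y) * x       # error-correction update
                errors += 1
        if errors == 0:                           # converged on separable data
            break
    return w

# Linearly separable toy problem: class +1 above the line x1 + x2 = 1, else -1.
X = np.array([[0.0, 0.0], [0.2, 0.3], [1.0, 1.0], [0.9, 0.8]])
d = np.array([-1.0, -1.0, 1.0, 1.0])
w = train_perceptron(X, d)
print("learned weights (bias, w1, w2):", w)

On linearly separable data such as this toy set, the perceptron convergence theorem referred to in the review guarantees that the training loop stops with zero errors after a finite number of updates.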

Theoretically Great

I found this book to be an excellent "research" reference. Its mathematical presentation is rigorous and provides a good, up-to-date theoretical foundation for the experienced scientist or engineer. That said, it is not a good book for the beginner, especially one who only wants to know the general physical meaning of neural networks and where they are best applied.

Informative and masterfully written.

A wonderfully well-written, insightful treatment of artificial neural networks. Beginning from the basics, the author sets forth both a technological and a historical perspective for understanding this multidisciplinary subject area. The book is written from a practical engineering perspective and comprehensively spans the entire discipline of modern neural network theory. A+