
Theoretical Advances in Neural Computation and Learning (Paperback, Softcover reprint of the original 1st ed. 1994)

Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky

Loot Price: R4,501


Expected to ship within 10 - 15 working days

For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient in visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them to the solution of engineering problems.

Due to recent advances in both device technology and computational science, we are currently witnessing explosive growth in the study of neural networks and their applications. It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions such as: (a) what can neural networks do that traditional computing techniques cannot; (b) how does the complexity of the network for an application relate to the complexity of that problem; and (c) how much training data are required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in studying specific neural models has been made by researchers from various disciplines.

General

Imprint: Springer-Verlag New York
Country of origin: United States
Release date: December 2012
First published: 1994
Editors: Vwani Roychowdhury • Kai-Yeung Siu • Alon Orlitsky
Dimensions: 235 x 155 x 25mm (L x W x T)
Format: Paperback
Pages: 468
Edition: Softcover reprint of the original 1st ed. 1994
ISBN-13: 978-1-4613-6160-2
Categories: Books > Computing & IT > General theory of computing > Data structures
Books > Computing & IT > Computer programming > Algorithms & procedures
Books > Science & Mathematics > Physics > Thermodynamics & statistical physics > Statistical physics
Books > Professional & Technical > Energy technology & engineering > Electrical engineering > General
Books > Computing & IT > Applications of computing > Artificial intelligence > General
LSN: 1-4613-6160-5
Barcode: 9781461361602
