Showing 1 - 4 of 4 matches in All Departments

Theoretical Advances in Neural Computation and Learning (Hardcover, 1994 ed.)
Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky
R4,577 Discovery Miles 45 770 Ships in 12 - 17 working days

Theoretical Advances in Neural Computation and Learning brings together in one volume some of the recent advances in the development of a theoretical framework for studying neural networks. A variety of novel techniques from disciplines such as computer science, electrical engineering, statistics, and mathematics have been integrated and applied to develop ground-breaking analytical tools for such studies. This volume emphasizes the computational issues in artificial neural networks and compiles a set of pioneering research works, which together establish a general framework for studying the complexity of neural networks and their learning capabilities. This book represents one of the first efforts to highlight these fundamental results, and provides a unified platform for a theoretical exploration of neural computation. Each chapter is authored by a leading researcher and/or scholar who has made significant contributions in this area.

Part 1 provides a complexity-theoretic study of different models of neural computation. Complexity measures for neural models are introduced, and techniques for the efficient design of networks for performing basic computations, as well as analytical tools for understanding the capabilities and limitations of neural computation, are discussed. The results describe how the computational cost of a neural network increases with the problem size. Equally important, these results go beyond the study of single neural elements and establish the computational power of multilayer networks.

Part 2 discusses concepts and results concerning learning using models of neural computation. Basic concepts such as VC-dimension and PAC-learning are introduced, and recent results relating neural networks to learning theory are derived. In addition, a number of the chapters address fundamental issues concerning learning algorithms, such as accuracy and rate of convergence, selection of training data, and efficient algorithms for learning useful classes of mappings.
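As illustrative background only, and not drawn from the book itself: the VC-dimension and PAC-learning parameters mentioned above combine into sufficient-sample-size estimates of the form m = O((1/eps)(d log(1/eps) + log(1/delta))). The minimal Python sketch below uses one commonly quoted set of constants (different texts state slightly different ones), and the helper name pac_sample_bound is hypothetical.

import math

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    # Sufficient number of training examples for PAC learning a concept
    # class of VC dimension vc_dim to error epsilon with confidence
    # 1 - delta. Constants follow one commonly quoted form of the
    # classical bound and are illustrative only.
    confidence_term = (4.0 / epsilon) * math.log2(2.0 / delta)
    dimension_term = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(confidence_term, dimension_term))

# Example: VC dimension 10, target error 5%, failure probability 5%
print(pac_sample_bound(vc_dim=10, epsilon=0.05, delta=0.05))

Under these example settings the bound works out to roughly 13,000 examples, illustrating how the required amount of training data grows with the VC dimension of the network class.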

Communications, Computation, Control, and Signal Processing - a tribute to Thomas Kailath (Hardcover, 1997 ed.)
Arogyaswami Paulraj, Vwani Roychowdhury, Charles D. Schaper
R6,068 Discovery Miles 60 680 Ships in 10 - 15 working days

The traditional systems disciplines of communications, computation, control and signal processing are becoming increasingly important in addressing major technological challenges of the coming century, in fields such as materials processing, manufacturing automation, speech recognition and ubiquitous personal communications, among many others. Moreover, the boundaries between these separate disciplines are being rapidly blurred by the many demands of these applications. This Tribute, dedicated to Thomas Kailath for his many seminal contributions to these areas, highlights several recent trends and results, described by leading scientists and engineers from around the world. The thirty-six papers in this volume present important results on, among others, interference cancellation in multipath channels, decision feedback equalization for packet transmission, blind equalization and smart antennas for mobile communications, displacement structure, fast and stable algorithms in numerical linear algebra, nonconvex optimization problems, issues in nanoelectronic computation, fundamental limits of control system performance, LQG control with communication constraints, nonlinear H∞ control, adaptive nonlinear control, model identification, tomographic deconvolution, and higher-order statistics. The applications discussed herein include packet radio, robotics, very flexible mechanical systems, power systems and power electronics, moving object detection, complexity management and several others. The volume starts out with a survey by Professor Kailath entitled 'Norbert Wiener and the Development of Mathematical Engineering', a term suggested by Wiener that can serve as a compact description of the variety of fields described herein.

Communications, Computation, Control, and Signal Processing - a tribute to Thomas Kailath (Paperback, Softcover reprint of the original 1st ed. 1997)
Arogyaswami Paulraj, Vwani Roychowdhury, Charles D. Schaper
R5,796 Discovery Miles 57 960 Ships in 10 - 15 working days

A. Paulraj*, V. Roychowdhury**, and C. Schaper* (* Dept. of Electrical Engineering, Stanford University; ** Dept. of Electrical Engineering, UCLA). Innumerable conferences are held around the world on the subjects of communications, computation, control and signal processing, and on their numerous subdisciplines. Therefore one might not envision a coherent conference encompassing all these areas. However, such an event did take place June 22-26, 1995, at an international symposium held at Stanford University to celebrate Professor Thomas Kailath's sixtieth birthday and to honor the notable contributions made by him and his students and associates. The depth of these contributions was evident from the participation of so many leading figures in each of these fields. Over the five days of the meeting, there were about 200 attendees, from eighteen countries, more than twenty government and industrial organizations, and various engineering, mathematics and statistics faculties at nearly 50 different academic institutions. They came not only to celebrate but also to learn and to ponder the threads and the connections that Professor Kailath has discovered and woven among so many apparently disparate areas. The organizers received many comments about the richness of the occasion. A distinguished academic wrote of the conference being "the single most rewarding professional event of my life." The program is summarized in Table 1.1; a letter of reflections by Dr. C. Rohrs appears a little later.

Theoretical Advances in Neural Computation and Learning (Paperback, Softcover reprint of the original 1st ed. 1994)
Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky
R4,476 Discovery Miles 44 760 Ships in 10 - 15 working days

For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient in visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them in the solutions of engineering problems. Due to the recent advances in both device technology and computational science, we are currently witnessing an explosive growth in the studies of neural networks and their applications. It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions such as (a) what can neural networks do that traditional computing techniques cannot, (b) how does the complexity of the network for an application relate to the complexity of that problem, and (c) how much training data are required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in studying specific neural models has been made by researchers from various disciplines.
