Theoretical Advances in Neural Computation and Learning (eBook, PDF)
-19%
€121.95 (instead of €149.99**)
incl. VAT
**Price of the printed edition (hardcover)
Available immediately via download
  • Format: PDF



  • Devices: PC
  • No copy protection (DRM-free)
  • Size: 35.53 MB
Product Description
For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient in visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them in the solutions of engineering problems. Due to the recent advances in both device technology and computational science, we are currently witnessing an explosive growth in the studies of neural networks and their applications.

It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions such as (a) what can neural networks do that traditional computing techniques cannot, (b) how does the complexity of the network for an application relate to the complexity of that problem, and (c) how much training data are required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in studying specific neural models has been made by researchers from various disciplines.

For legal reasons, this download can only be delivered to a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

  • Product Details
  • Publisher: Springer US
  • Number of pages: 468
  • Publication date: December 6, 2012
  • Language: English
  • ISBN-13: 9781461526964
  • Item no.: 43992157
Table of Contents
Foreword; B. Widrow. Foreword; D.E. Rumelhart. Preface.

Part I: Computational Complexity of Neural Networks.
1. Neural Models and Spectral Methods; V. Roychowdhury, Kai-Yeung Siu, A. Orlitsky.
2. Depth-Efficient Threshold Circuits for Arithmetic Functions; T. Hofmeister.
3. Communication Complexity and Lower Bounds for Threshold Circuits; M. Goldmann.
4. A Comparison of the Computational Power of Sigmoid and Boolean Threshold Circuits; W. Maass, G. Schnitger, E.D. Sontag.
5. Computing on Analog Neural Nets with Arbitrary Real Weights; W. Maass.
6. Connectivity versus Capacity in the Hebb Rule; S.S. Venkatesh.

Part II: Learning and Neural Networks.
7. Computational Learning Theory and Neural Networks: a Survey of Selected Topics; G. Turán.
8. Perspectives of Current Research about the Complexity of Learning on Neural Nets; W. Maass.
9. Learning an Intersection of K Halfspaces over a Uniform Distribution; A.L. Blum, R. Kannan.
10. On the Intractability of Loading Neural Networks; B. DasGupta, H.T. Siegelmann, E. Sontag.
11. Learning Boolean Functions via the Fourier Transform; Y. Mansour.
12. LMS and Backpropagation are Minimax Filters; B. Hassibi, A.H. Sayed, T. Kailath.
13. Supervised Learning: Can it Escape its Local Minimum? P.J. Werbos.

Index.