NIPS 1992: Denver, CO, USA
- Stephen Jose Hanson, Jack D. Cowan, C. Lee Giles:
Advances in Neural Information Processing Systems 5, [NIPS Conference, Denver, Colorado, USA, November 30 - December 3, 1992]. Morgan Kaufmann 1993, ISBN 1-55860-274-7
Part 1: Learning and Generalization
- Nathan Intrator:
On the Use of Projection Pursuit Constraints for Training Neural Networks. 3-10 - Andreas Stolcke, Stephen M. Omohundro:
Hidden Markov Model Induction by Bayesian Model Merging. 11-18 - Kai-Yeung Siu, Vwani P. Roychowdhury, Thomas Kailath:
Computing with Almost Optimal Size Neural Networks. 19-26 - Janet Wiles, Mark Ollila:
Intersecting Regions: The Key to Combinatorial Structure in Hidden Unit Space. 27-33 - Tony Plate:
Holographic Recurrent Networks. 34-41 - Harris Drucker, Robert E. Schapire, Patrice Y. Simard:
Improving Performance in Neural Networks Using a Boosting Algorithm. 42-49 - Patrice Y. Simard, Yann LeCun, John S. Denker:
Efficient Pattern Recognition Using a New Transformation Distance. 50-58 - Kai-Yeung Siu, Vwani P. Roychowdhury:
Optimal Depth Neural Networks for Multiplication and Related Problems. 59-64 - Sreerupa Das, C. Lee Giles, Guo-Zheng Sun:
Using Prior Knowledge in a NNPDA to Learn Context-Free Languages. 65-72 - Yaser S. Abu-Mostafa:
A Method for Learning From Hints. 73-80 - Charles W. Anderson:
Q-Learning with Hidden-Unit Restarting. 81-88 - J. Stephen Judd, Paul W. Munro:
Nets with Unreliable Hidden Nodes Learn Error-Correcting Codes. 89-96
Part 2: Architectures and Algorithms
- Richard K. Belew:
Interposing an Ontogenetic Model Between Genetic Algorithms and Neural Networks. 99-106 - J. Jeffrey Mahoney, Raymond J. Mooney:
Combining Neural and Symbolic Learning to Revise Probabilistic Rule Bases. 107-114 - Mark B. Ring:
Learning Sequential Tasks by Incrementally Adding Higher Orders. 115-122 - Bernd Fritzke:
Kohonen Feature Maps and Growing Cell Structures - a Performance Comparison. 123-130 - Brian V. Bonnlander, Michael Mozer:
Metamorphosis Networks: An Alternative to Constructive Models. 131-138 - Eric I. Chang, Richard Lippmann:
A Boundary Hunting Radial Basis Function Classifier which Allocates Centers Constructively. 139-146 - Isabelle Guyon, Bernhard E. Boser, Vladimir Vapnik:
Automatic Capacity Tuning of Very Large VC-Dimension Classifiers. 147-155 - Yann LeCun, Patrice Y. Simard, Barak A. Pearlmutter:
Automatic Learning Rate Maximization in Large Adaptive Machines. 156-163 - Babak Hassibi, David G. Stork:
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon. 164-171 - Richard S. Zemel, Christopher K. I. Williams, Michael Mozer:
Directional-Unit Boltzmann Machines. 172-179 - Guo-Zheng Sun, Hsing-Hen Chen, Yee-Chun Lee:
Time Warping Invariant Neural Networks. 180-187 - Enno Littmann, Helge J. Ritter:
Generalization Abilities of Cascade Network Architecture. 188-195 - Gerhard Paass:
Assessing and Improving Neural Network Predictions by the Bootstrap Algorithm. 196-203 - Lorien Y. Pratt:
Discriminability-Based Transfer between Neural Networks. 204-211 - Barry Flower, Marwan A. Jabri:
Summed Weight Neuron Perturbation: An O(N) Improvement Over Weight Perturbation. 212-219 - Virginia R. de Sa, Dana H. Ballard:
A Note on Learning Vector Quantization. 220-227 - William Finnoff, Ferdinand Hergert, Hans-Georg Zimmermann:
Extended Regularization Methods for Nonconvergent Model Selection. 228-235 - Bill Baird, Todd Troyer, Frank H. Eeckman:
Synchronization and Grammatical Inference in an Oscillating Elman Net. 236-243 - Gert Cauwenberghs:
A Fast Stochastic Error-Descent Algorithm for Supervised Learning and Optimization. 244-251
Part 3: Control, Navigation, and Planning
- David DeMers, Kenneth Kreutz-Delgado:
Global Regularization of Inverse Kinematics for Redundant Manipulators. 255-262 - Andrew W. Moore, Christopher G. Atkeson:
Memory-Based Reinforcement Learning: Efficient Computation with Prioritized Sweeping. 263-270 - Peter Dayan, Geoffrey E. Hinton:
Feudal Reinforcement Learning. 271-278 - Dean Pomerleau:
Input Reconstruction Reliability Estimation. 279-286 - Tom M. Mitchell, Sebastian Thrun:
Explanation-Based Neural Network Learning for Robot Control. 287-294 - Steven J. Bradtke:
Reinforcement Learning Applied to Linear Quadratic Regulation. 295-302 - Christopher Bowman:
Neural Network On-Line Learning Control of Spacecraft Smart Structures. 303-310 - Yoji Uno, Naohiro Fukumura, Ryoji Suzuki, Mitsuo Kawato:
Integration of Visual and Somatosensory Information for Preshaping Hand in Grasping Movements. 311-318 - James K. Peterson:
On Line Estimation of Optimal Control Sequences: HJB Estimators. 319-326 - Vijaykumar Gullapalli:
Learning Control Under Extreme Uncertainty. 327-334 - Terence D. Sanger:
A Practice Strategy for Robot Learning Control. 335-341 - Gerald Fahner, Rolf Eckmiller:
Learning Spatio-Temporal Planning from a Dynamic Programming Teacher: Feed-Forward Neurocontrol for Moving Obstacle Avoidance. 342-349 - Charles M. Higgins, Rodney M. Goodman:
Learning Fuzzy Rule-Based Neural Networks for Control. 350-357
Part 4: Visual Processing
- Suzanna Becker:
Learning to Categorize Objects Using Temporal Coherence. 361-368 - Steven J. Nowlan, Terrence J. Sejnowski:
Filter Selection Model for Generating Visual Motion Signals. 369-376 - Edward Stern, Ad Aertsen, Eilon Vaadia, Shaul Hochstein:
Stimulus Encoding by Multidimensional Receptive Fields in Single Cells and Cell Populations in V1 of Awake Monkey. 377-384 - Suthep Madarasmi, Daniel J. Kersten, Ting-Chuen Pong:
The Computation of Stereo Disparity for Transparent and for Opaque Surfaces. 385-392 - Subutai Ahmad, Volker Tresp:
Some Solutions to the Missing Feature Problem in Vision. 393-400 - Joachim Utans, Gene Gindi:
Improving Convergence in Hierarchical Matching Networks for Object Recognition. 401-408 - Carlos D. Brody:
A Model of Feedback to the Lateral Geniculate Nucleus. 409-416 - Kevin E. Martin, Jonathan A. Marshall:
Unsmearing Visual Motion: Development of Long-Range Horizontal Intrinsic Connections. 417-424 - Hayit Greenspan, Rodney M. Goodman:
Remote Sensing Image Analysis via a Texture Classification Neural Network. 425-432 - Markus Lappe, Josef P. Rauschecker:
Computation of Heading Direction from Optic Flow in Visual Cortex. 433-440 - Gale Martin, Mosfeq Rashid, David Chapman, James A. Pittman:
Learning to See Where and What: Training a Net to Make Saccades and Recognize Handwritten Characters. 441-447
Part 5: Stochastic Learning and Analysis
- Todd K. Leen, John E. Moody:
Weight Space Probability Densities in Stochastic Learning: I. Dynamics and Equilibria. 451-458 - William Finnoff:
Diffusion Approximations for the Constant Step Size Backpropagation Algorithm and Resistance to Local Minima. 459-466 - Lei Xu, Alan L. Yuille:
Self-Organizing Rules for Robust Principal Component Analysis. 467-474 - Radford M. Neal:
Bayesian Learning via Stochastic Dynamics. 475-482 - Yoav Freund, H. Sebastian Seung, Eli Shamir, Naftali Tishby:
Information, Prediction, and Query by Committee. 483-490 - Alan F. Murray, Peter J. Edwards:
Synaptic Weight Noise During MLP Learning Enhances Fault-Tolerance, Generalization and Learning Trajectory. 491-498 - Nicol N. Schraudolph, Terrence J. Sejnowski:
Unsupervised Discrimination of Clustered Data via Optimization of Binary Information Gain. 499-506 - Genevieve B. Orr, Todd K. Leen:
Weight Space Probability Densities in Stochastic Learning: II. Transients and Basin Hopping Times. 507-514 - Satoru Shiono, Satoshi Yamada, Michio Nakashima, Kenji Matsumoto:
Information Theoretic Analysis of Connection Structure from Spike Trains. 515-522 - Holm Schwarze, John A. Hertz:
Statistical Mechanics of Learning in a Large Committee Machine. 523-530 - John W. Miller, Rodney M. Goodman:
Probability Estimator from a Database Using a Gibbs Energy Model. 531-538 - David H. Wolpert:
On the Use of Evidence in Neural Networks. 539-546
Part 6: Network Dynamics and Chaos
- Bernard Doyon, Bruno Cessac, Mathias Quoy, Manuel Samuelides:
Destabilization and Route to Chaos in Neural Networks with Random Connectivity. 549-555 - Ali A. Minai, William B. Levy:
Predicting Complex Behavior in Sparse Asymmetric Networks. 556-563 - Isaac Meilijson, Eytan Ruppin, Moshe Sipper:
Single-Iteration Threshold Hamming Networks. 564-571 - Isaac Meilijson, Eytan Ruppin:
History-Dependent Attractor Neural Networks. 572-579 - David DeMers, Garrison W. Cottrell:
Non-Linear Dimensionality Reduction. 580-587
Part 7: Theory and Analysis
- Mostefa Golea, Mario Marchand, Thomas R. Hancock:
On Learning µ-Perceptron Networks with Binary Weights. 591-598 - Yong Liu:
Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method. 599-606 - Noboru Murata, Shuji Yoshizawa, Shun-ichi Amari:
Learning Curves, Model Selection and Complexity of Neural Networks. 607-614 - Bhaskar DasGupta, Georg Schnitger:
The Power of Approximation: A Comparison of Activation Functions. 615-622 - Uwe Helmke, Robert C. Williamson:
Rational Parametrizations of Neural Networks. 623-630 - N. H. Wulff, John A. Hertz:
Learning Cellular Automaton Dynamics with Neural Networks. 631-638 - Adam Kowalczyk:
Some Estimates on the Number of Connections and Hidden Units for Feed-Forward Networks. 639-646
Part 8: Speech and Signal Processing
- Michael Cohen, Horacio Franco, Nelson Morgan, David E. Rumelhart, Victor Abrash:
Context-Dependent Multiple Distribution Phonetic Modeling with MLPs. 649-657 - Makoto Hirayama, Eric Vatikiotis-Bateson, Kiyoshi Honda, Yasuharu Koike, Mitsuo Kawato:
Physiologically Based Speech Synthesis. 658-665 - Weimin Liu, Andreas G. Andreou, Moise H. Goldstein Jr.:
Analog Cochlear Model for Multiresolution Speech Analysis. 666-673 - Wei-Tsih Lee, John C. Pearson:
A Hybrid Linear/Nonlinear Approach to Channel Equalization Problems. 674-681 - Yochai Konig, Nelson Morgan, Chuck Wooters, Victor Abrash, Michael Cohen, Horacio Franco:
Modeling Consistency in a Speaker Independent Continuous Speech Recognition System. 682-687 - José Carlos Príncipe, Abir Zahalka:
Transient Signal Detection with Neural Networks: The Search for the Desired Signal. 688-695 - Joe Tebelskis, Alex Waibel:
Performance Through Consistency: MS-TDNN's for Large Vocabulary Continuous Speech Recognition. 696-703 - George Zavaliagkos, Ying Zhao, Richard M. Schwartz, John Makhoul:
A Hybrid Neural Net System for State-of-the-Art Continuous Speech Recognition. 704-711 - Hermann Hild, Alex Waibel:
Connected Letter Recognition with a Multi-State Time Delay Neural Network. 712-719
Part 9: Applications
- Markus Schenkel, H. Weissman, Isabelle Guyon, C. Nohl, Donnie Henderson:
Recognition-Based Segmentation of On-Line Hand-Printed Words. 723-730 - Esther Levin, Roberto Pieraccini:
Planar Hidden Markov Modeling: From Speech to Optical Character Recognition. 731-738 - Jen-Lun Yuan, Terrence Fine:
Forecasting Demand for Electric Power. 739-746 - Pierre Baldi, Yves Chauvin, Tim Hunkapiller, Marcella A. McClure:
Hidden Markov Models in Molecular Biology: New Algorithms and Applications. 747-754 - Charles R. Rosenberg, Jacob Erel, Henri Atlan:
A Neural Network that Learns to Interpret Myocardial Planar Thallium Scintigrams. 755-762
Part 10: Implementations
- Janeen Anderson, John C. Platt, David Blair Kirk:
An Analog VLSI Chip for Radial Basis Functions. 765-772 - Stephen Churcher, Donald J. Baxter, Alister Hamilton, Alan F. Murray, H. Martin Reekie:
Generic Analog Neural Computation - The Epsilon Chip. 773-780 - Rahul Sarpeshkar, Wyeth Bair, Christof Koch:
Visual Motion Computation in Analog VLSI Using Pulses. 781-788 - David B. Kirk, Douglas Kerns, Kurt W. Fleischer, Alan H. Barr:
Analog VLSI Implementation of Gradient Descent. 789-796 - Alexander Linden, Thomas Sudbrak, Christoph Tietz, F. Weber:
An Object-Oriented Framework for the Simulation of Neural Networks. 797-804 - Eros Pasero, Riccardo Zecchina:
Attractor Neural Networks with Local Inhibition: From Statistical Physics to a Digital Programmable Integrated Circuit. 805-812 - Sylvie Renaud-Le Masson, Gwendal Le Masson, Eve Marder, L. F. Abbott:
Hybrid Circuits of Interacting Computer Model and Biological Neurons. 813-819 - John Lazzaro, John Wawrzynek, Misha Mahowald, Massimo Sivilotti, Dave Gillespie:
Silicon Auditory Processors as Computer Peripherals. 820-827 - Christof Koch, Bimal Mathur, Shih-Chii Liu, John G. Harris, Jin Luo, Massimo Sivilotti:
Object-Based Analog VLSI Vision Circuits. 828-835 - Joshua Alspector, Ronny Meir, Ben P. Yuhas, Anthony Jayakumar, D. Lippe:
A Parallel Gradient Descent Method for Learning in Analog VLSI Neural Networks. 836-844
Part 11: Cognitive Science
- Paul Smolensky:
Harmonic Grammars for Formal Languages. 847-854 - Dedre Gentner, Arthur B. Markman:
Analogy - Watershed or Waterloo? Structural Alignment and the Development of Connectionist Models of Analogy. 855-862 - Michael Mozer, Sreerupa Das:
A Connectionist Symbol Manipulator that Discovers the Structure of Context-Free Languages. 863-870 - Volker Tresp, Jürgen Hollatz, Subutai Ahmad:
Network Structuring and Training Using Rule-Based Knowledge. 871-878 - Daphne Bavelier, Michael I. Jordan:
A Dynamical Model of Priming and Repetition Blindness. 879-886 - Geoffrey G. Towell, Richard Lehrer:
A Knowledge-Based Model of Geometry Learning. 887-894 - Hinrich Schütze:
Word Space. 895-902 - Rainer Goebel:
Perceiving Complex Visual Scenes: An Oscillator Neural Network Model that Integrates Selective Attention, Perceptual Organization, and Invariant Recognition. 903-910
Part 12: Computational and Theoretical Neurobiology
- Kenji Doya, Mary E. T. Boyle, Allen I. Selverston:
Mapping Between Neural and Physical Activities of the Lobster Gastric Mill. 913-920 - Mark E. Nelson:
A Neural Model of Descending Gain Control in the Electrosensory System. 921-928 - Neil Burgess, John O'Keefe, Michael Recce:
Using Hippocampal 'Place Cells' for Navigation, Exploiting Phase Coding. 929-936 - Mark A. Gluck, Catherine Myers:
Adaptive Stimulus Representations: A Computational Theory of Hippocampal-Region Functions. 937-944 - Itay Gat, Naftali Tishby:
Statistical Modeling of Cell Assemblies Activities in Associative Cortex of Behaving Monkeys. 945-952 - Ralph Linsker:
Deriving Receptive Fields Using an Optimal Encoding Criterion. 953-960 - Olivier J. M. D. Coenen, Terrence J. Sejnowski, Stephen G. Lisberger:
Biologically Plausible Local Learning Rules for the Adaptation of the Vestibulo-Ocular Reflex. 961-968 - P. Read Montague, Peter Dayan, Steven J. Nowlan, Terrence J. Sejnowski:
Using Aperiodic Reinforcement for Directed Self-Organization During Development. 969-976 - Klaus Pawelzik, Hans-Ulrich Bauer, Josef Deppisch, Theo Geisel:
How Oscillatory Neuronal Responses Reflect Bistability and Switching of the Hidden Assembly Dynamics. 977-984 - Geoffrey J. Goodhill:
Topography and Ocular Dominance with Positive Correlations. 985-992 - Frank Moss, André Longtin:
Statistical and Dynamical Interpretation of ISIH Data from Periodically Stimulated Sensory Neurons. 993-1000 - John G. Milton, Po Hsiang Chu, Jack D. Cowan:
Spiral Waves in Integrate-and-Fire Neural Networks. 1001-1006 - Lance Craig Walton, David L. Bisset:
Parametrizing Feature Sensitive Cell Formation in Linsker Networks in the Auditory System. 1007-1013 - Lina Massone:
A Recurrent Neural Network for Generation of Ocular Saccades. 1014-1021 - Christiane Linster, David Marsan, Claudine Masson, Michel Kerszberg, Gérard Dreyfus, Léon Personnaz:
A Formal Model of the Insect Olfactory Macroglomerulus: Simulations and Analytic Results. 1022-1029 - William E. Skaggs, Bruce L. McNaughton, Katalin M. Gothard:
An Information-Theoretic Approach to Deciphering the Hippocampal Code. 1030-1037