New Bulgarian University | Center for Cognitive Science | Preparatory Program | Course Description
COG402 Connectionist Modeling
[PDP] Rumelhart, McClelland & the PDP Research Group (1986),
Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1 & 2, MIT Press.
[PDP3] McClelland & Rumelhart (1988),
Explorations in Parallel Distributed Processing, MIT Press.
[TNC] Hertz, Krogh & Palmer (1991),
Introduction to the Theory of Neural Computation, (Lecture Notes of the Santa Fe Institute), Addison-Wesley.
[NC1] Anderson & Rosenfeld (1988),
Neurocomputing: Foundations of Research, MIT Press.
[NC2] Anderson, Pellionisz & Rosenfeld (1990),
Neurocomputing 2: Directions for Research, MIT Press.
[CM] Waltz & Feldman (1988),
Connectionist Models and Their Implications: Readings from Cognitive Science, Ablex.
[NNAI] Zeidenberg (1990),
Neural Network Models in Artificial Intelligence, Ellis Horwood.
[NN] Mueller & Reinhardt (1990),
Neural Networks: An Introduction, Springer.
[NCM] Levine, D. (1991),
Introduction to Neural and Cognitive Modeling, Lawrence Erlbaum, Hillsdale, NJ.
[CP] Quinlan, P. (1991),
Connectionism and Psychology: A Psychological Perspective on New Connectionist Research, Harvester, New York.
[ANN] Patterson, D. (1996),
Artificial Neural Networks, Prentice Hall.
Introduction
Topic 1: Biological basis of neural networks (neural networks: biological and
artificial). The connectionist approach to AI and Cognitive Science. Historical remarks.
Seminar: Demonstration of a tape recording of an NN reading aloud (NETtalk) and composing
music (Mozer). Discussion.
Required readings:
Topic 2: General Connectionist Architecture. Basic concepts. Connectionist Model
usage (relaxation): heteroassociator (pattern associator) and autoassociator (associative
memory). Learning paradigms: supervised, unsupervised, reinforcement learning.
Seminar: Discussing different particular architectures.
Required readings:
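The basic concepts of Topic 2 come down to one computation repeated everywhere: a unit forms a weighted net input and passes it through an activation function. A minimal sketch (the weights and inputs are arbitrary illustrative numbers):

```python
import math

# The generic connectionist unit: net input = weighted sum + bias,
# activation = f(net), here the logistic function.
def unit(inputs, weights, bias, f=lambda n: 1 / (1 + math.exp(-n))):
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return f(net)

# net = 0.8*1.0 - 0.4*0.5 + 0.1 = 0.7; activation = logistic(0.7)
out = unit([1.0, 0.5], [0.8, -0.4], 0.1)
print(round(out, 3))  # 0.668
```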
Relaxation Search
Topic 3: Associative Memory: Interactive Activation and Competition Model.
Properties: content addressability, graceful degradation, default values, spontaneous
generalization.
Lab: Experiments with the computer simulation: Sharks and Jets.
Required reading:
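The IAC units explored in the Jets-and-Sharks lab follow a simple activation equation; a minimal sketch of one unit's update, with the customary parameter values (max = 1, min = -0.2, rest = -0.1, decay = 0.1) chosen here only for illustration:

```python
# Interactive Activation and Competition: one unit's activation update.
MAX, MIN, REST, DECAY = 1.0, -0.2, -0.1, 0.1

def iac_step(a, net):
    """One Euler step of the IAC activation equation."""
    if net > 0:
        da = (MAX - a) * net - DECAY * (a - REST)
    else:
        da = (a - MIN) * net - DECAY * (a - REST)
    return a + da

# Constant excitatory input drives the unit up from rest until the
# drive toward MAX balances the decay back toward REST.
a = REST
for _ in range(50):
    a = iac_step(a, net=0.2)
print(round(a, 3))  # settles at the equilibrium 0.19/0.3 ~ 0.633
```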
Topic 4: Associative Memory: Hopfield Networks. Potential Functions. The constraint
satisfaction problem: Global extremum. Physics Analogy. Boltzmann Machines.
Lab: Experiments with the computer simulation: Necker cube.
Required readings:
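The potential-function view of Topic 4 can be made concrete in a few lines: a Hebbian weight matrix stores a pattern, and asynchronous updates can only lower the energy E = -1/2 * sum_ij w_ij s_i s_j. A toy sketch, not the Necker-cube lab itself:

```python
import random

# Hopfield net storing one bipolar pattern via the Hebb rule.
pattern = [1, -1, 1, -1, 1]
n = len(pattern)
w = [[0 if i == j else pattern[i] * pattern[j] for j in range(n)]
     for i in range(n)]

def energy(s):
    return -0.5 * sum(w[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def recall(s, steps=100):
    # Asynchronous updates: each one is guaranteed not to raise the energy.
    random.seed(0)
    for _ in range(steps):
        i = random.randrange(n)
        net = sum(w[i][j] * s[j] for j in range(n))
        s[i] = 1 if net >= 0 else -1
    return s

noisy = [-1, -1, 1, -1, 1]                   # pattern with one flipped bit
print(energy(noisy), energy(recall(noisy)))  # energy drops; pattern restored
```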
Representation
Topic 5: Local vs. distributed representation. Schema representation. The Schema
Model. Harmony Theory.
Lab: Experiments with the computer simulation: Room schema.
Required readings:
Topic 6: Complex structure representation. Relation representation. Coarse
coding. Tensor Product Formalism.
Seminar: Building examples of representations of complex structures.
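One way to sketch the tensor-product idea for the seminar: bind a role vector to a filler vector by their outer product, superimpose the bindings, and unbind by contracting with the role vector. The vectors below are made-up toys; unbinding is exact here only because the role vectors are orthonormal.

```python
# Tensor-product binding: role (x) filler, unbinding by the role vector.
def outer(u, v):
    return [[ui * vj for vj in v] for ui in u]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def unbind(T, role):
    # Contract the tensor with a role vector to recover its filler.
    return [sum(role[i] * T[i][j] for i in range(len(role)))
            for j in range(len(T[0]))]

agent, patient = [1, 0], [0, 1]     # orthonormal role vectors
john, mary = [1, 0, 1], [0, 1, 1]   # filler vectors
T = add(outer(agent, john), outer(patient, mary))   # "John chases Mary"
print(unbind(T, agent), unbind(T, patient))  # [1, 0, 1] [0, 1, 1]
```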
Learning
Topic 7: Learning in Single-layered Networks. Hebbian rule. Associative
memories. Least Mean Squares Learning Rule. Perceptron Learning Rule.
Lab: Experiments with the computer simulation: Pattern association.
Required readings:
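Of the three rules in Topic 7, the perceptron rule is the easiest to run end to end; a toy sketch on logical AND (the learning rate and sweep count are arbitrary illustrative choices, not the Pattern-association lab):

```python
# Perceptron learning rule on a linearly separable problem (logical AND):
# w <- w + lr * (target - output) * x, and likewise for the bias.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                  # a few sweeps suffice here
    for x, t in data:
        err = t - predict(x)
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```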
Topic 8: Supervised Learning in Multi-layered Networks. Limitations of Perceptrons
and LMS. Back-propagation learning rule.
Lab: Experiments with the computer simulation: XOR problem.
Required readings:
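The XOR lab's learning rule can be sketched in plain Python. To keep the sketch deterministic, the initial weights are hand-picked near a known XOR solution rather than drawn at random as they would be in practice:

```python
import math

# A 2-2-1 sigmoid network trained by back-propagation on XOR.
def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 1, 1, 0]
W1 = [[2.0, 2.0], [2.0, 2.0]]   # hidden weights (2 units x 2 inputs)
b1 = [-1.0, -3.0]               # roughly an OR unit and an AND unit
W2 = [2.0, -2.0]                # output combines them: OR and not AND
b2 = -1.0
lr = 0.5

for _ in range(10000):
    for x, t in zip(X, T):
        h = [sig(W1[j][0]*x[0] + W1[j][1]*x[1] + b1[j]) for j in range(2)]
        o = sig(W2[0]*h[0] + W2[1]*h[1] + b2)
        d_o = (t - o) * o * (1 - o)                        # output delta
        d_h = [d_o * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):                                 # weight updates
            W2[j] += lr * d_o * h[j]
            W1[j][0] += lr * d_h[j] * x[0]
            W1[j][1] += lr * d_h[j] * x[1]
            b1[j] += lr * d_h[j]
        b2 += lr * d_o

preds = []
for x in X:
    h = [sig(W1[j][0]*x[0] + W1[j][1]*x[1] + b1[j]) for j in range(2)]
    preds.append(round(sig(W2[0]*h[0] + W2[1]*h[1] + b2)))
print(preds)  # [0, 1, 1, 0]
```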
Topic 9: Unsupervised learning: Competitive Learning. Kohonen topographic maps.
Lab: Experiments with the computer simulation: Clustering the Jets and Sharks.
Required readings:
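Winner-take-all competitive learning, the core of Topic 9, fits in a few lines: the nearest weight vector moves toward each input until the units stand for clusters. The two-dimensional data below are made up:

```python
# Simple competitive (winner-take-all) learning on two toy clusters.
data = [(0.9, 0.1), (1.0, 0.0), (0.8, 0.2),   # cluster A
        (0.1, 0.9), (0.0, 1.0), (0.2, 0.8)]   # cluster B
w = [[0.5, 0.4], [0.4, 0.5]]                  # two competing units
lr = 0.3

def winner(x):
    # Smallest squared Euclidean distance wins the competition.
    d = [sum((wi - xi) ** 2 for wi, xi in zip(u, x)) for u in w]
    return d.index(min(d))

for _ in range(30):
    for x in data:
        k = winner(x)                          # only the winner learns:
        w[k] = [wi + lr * (xi - wi) for wi, xi in zip(w[k], x)]

print([winner(x) for x in data])  # [0, 0, 0, 1, 1, 1]
```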
Topic 10: Unsupervised learning: Brain-State-in-a-Box Model. Hebbian learning.
Adaptive Resonance Theory.
Lab: Experiments with the computer simulation: BSB model.
Required readings:
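The BSB dynamics of Topic 10 can be previewed in miniature: the state vector evolves by a <- clip(a + Wa) and saturates at a corner of the box. The Hebbian weight scale (0.2) and the starting state are arbitrary choices:

```python
# Brain-State-in-a-Box: positive feedback drives the state to a corner
# of the box [-1, 1]^n, here the corner storing the trained pattern.
pattern = [1, -1, 1]
W = [[0.2 * pattern[i] * pattern[j] for j in range(3)] for i in range(3)]

def clip(v):
    return max(-1.0, min(1.0, v))

a = [0.4, -0.2, 0.1]            # weak, noisy version of the pattern
for _ in range(20):
    net = [sum(W[i][j] * a[j] for j in range(3)) for i in range(3)]
    a = [clip(a[i] + net[i]) for i in range(3)]
print(a)  # [1.0, -1.0, 1.0]
```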
Topic 11: Reinforcement Learning: The credit assignment problem. Adaptive search
elements. Associative reward-penalty algorithm.
Seminar: Examples and discussion on learning methods.
Required reading:
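A sketch in the spirit of the associative reward-penalty algorithm: a stochastic unit sees only a scalar reward, strengthening the action it took when rewarded and the opposite action when penalized. The task, learning rate, and penalty factor lam are all made up for illustration:

```python
import math, random

# A stochastic unit: fires with probability p = logistic(w.x + b).
random.seed(1)
tasks = [((1, 0), 1), ((0, 1), 0)]   # input pattern -> rewarded action
w, b, lr, lam = [0.0, 0.0], 0.0, 0.5, 0.1

def prob(x):
    return 1 / (1 + math.exp(-(w[0]*x[0] + w[1]*x[1] + b)))

for _ in range(2000):
    x, target = random.choice(tasks)
    p = prob(x)
    y = 1 if random.random() < p else 0
    r = 1 if y == target else 0      # reward is all the unit ever sees
    # Reward: move p toward the action taken; penalty: toward the other.
    delta = lr * (r * (y - p) + lam * (1 - r) * ((1 - y) - p))
    w = [wi + delta * xi for wi, xi in zip(w, x)]
    b += delta

print(round(prob((1, 0)), 2), round(prob((0, 1)), 2))
```

After training, the unit fires with high probability for the first pattern and low probability for the second, even though it was never told which action was correct, only whether it was rewarded.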
Cognitive Models
Topic 12: Perception Models.
Lab: Experiments with the computer simulation: Interactive Activation Model of
Reading. Exploring the role of context.
Required readings:
Topic 13: Natural Language Understanding. Learning the past tenses of English
verbs. Parsing and role assignment.
Seminar: Discussing the adequacy of the models.
Required reading:
Topic 14: Memory and Reasoning models.
Lab: Experiments with the computer simulation: DMA model. Discussing the adequacy
of the models.
Required reading:
Topic 15: Pros and cons of neural networks. Comparing connectionist and symbolic
approaches. The symbol vs. subsymbol debate. Historical lessons.
Seminar: Final discussion about the advantages and disadvantages of the
connectionist and symbolic approaches and their relations.
Grading procedure:
25% written paper