4.6: SubTopics and Explorations
SubTopics
Here are all the sub-topics within the Learning chapter, collected in one place for easy browsing. Whether any given sub-topic is optional or required reading for a course depends on the instructor's assignments:
- Detailed Biology of Learning -- more in-depth treatment of postsynaptic signaling cascades that mediate LTP and LTD, described in context of the Urakubo, Honda, Froemke, & Kuroda (2008) model of synaptic plasticity.
- Hebbian Learning -- extensive treatment of computational properties of Hebbian learning -- starts with a simple manual simulation of Hebbian learning showing exactly how and why it captures patterns of co-occurrence.
- STDP -- further details on spike-timing-dependent plasticity.
- Backpropagation -- history and mathematical derivation of error-driven learning functions -- strongly recommended to obtain greater insight into the computational nature of error-driven learning (starts with some important conceptual points before getting into the math).
- Oscillating Learning Function -- the Norman et al. learning rule, based on rapidly changing synaptic plasticity combined with oscillations in inhibitory strength -- produces an interesting hybrid of error-driven and self-organizing learning, and is mathematically equivalent to the CAL learning function.
- Implementational Details -- miscellaneous implementational details, such as how time-averaged activations are computed.
- Leabra Details -- describes how the current version of Leabra, based on XCAL, differs from the original version in the CECN textbook (O'Reilly & Munakata, 2000).
- Full set of Leabra equations on emergent page: grey.colorado.edu/emergent/index.php/Leabra#Leabra_Algorithm_Equations
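To give a concrete anchor for the Hebbian Learning sub-topic above, here is a minimal sketch of a plain Hebbian weight update. This is illustrative only: the chapter's actual learning rule is the BCM-like XCAL function, and the function name, learning rate, and input patterns here are invented for the example.

```python
def hebb_update(w, x, y, lr=0.1):
    """Plain Hebbian rule: dw_i = lr * x_i * y.
    Weights grow only where input and output are co-active."""
    return [wi + lr * xi * y for wi, xi in zip(w, x)]

# Three inputs; the receiving unit fires (y = 1) whenever
# inputs 0 and 1 co-occur, while input 2 stays silent.
w = [0.0, 0.0, 0.0]
for _ in range(5):
    w = hebb_update(w, x=[1.0, 1.0, 0.0], y=1.0)

print(w)  # weights on the co-active inputs strengthen; the silent input stays 0
```

Note how the weights on the co-occurring inputs grow without bound under this simple rule -- that runaway growth is exactly why the full treatment introduces BCM-style floating thresholds.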
Explorations
Here are all the explorations covered in the main portion of the Learning chapter:
- Self Organizing (self_org.proj) -- self-organizing learning using the BCM-like dynamics of XCAL (Questions 4.1-4.2).
- Pattern Associator (pat_assoc.proj) -- basic two-layer network learning simple input/output mapping tasks with Hebbian and error-driven mechanisms (Questions 4.3-4.6).
- Error Driven Hidden (err_driven_hidden.proj) -- full error-driven learning with a hidden layer, which can solve any input/output mapping (Question 4.7).
- Family Trees (family_trees.proj) -- Learning in a deep (multi-hidden-layer) network, reshaping internal representations to encode relational similarity (Questions 4.8-4.9).
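As a rough illustration of the error-driven learning that the pattern-associator exploration contrasts with Hebbian learning, here is a sketch of the classic delta rule for a single linear output unit. This is a stand-in chosen for simplicity -- the simulations themselves use the XCAL rule within Leabra -- and the learning rate, patterns, and epoch count are invented for the example.

```python
def delta_update(w, x, t, lr=0.2):
    """Delta rule: dw_i = lr * (t - y) * x_i, where y is the unit's
    linear output. Weights change only to the extent the output is wrong."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    err = t - y
    return [wi + lr * err * xi for wi, xi in zip(w, x)], err

# A tiny input/output mapping of the kind pat_assoc.proj trains on (made up here).
patterns = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0), ([1.0, 1.0], 1.0)]
w = [0.0, 0.0]
for epoch in range(50):
    total_err = 0.0
    for x, t in patterns:
        w, err = delta_update(w, x, t)
        total_err += abs(err)

print(round(total_err, 4))  # summed absolute error shrinks toward zero
```

Unlike the Hebbian rule, the weight change here is driven by the difference between target and output, so learning stops once the mapping is correct -- the key property the explorations probe.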