November 24, 2020

Efficiency of local learning rules in threshold-linear associative networks

We show that associative networks of threshold-linear units endowed with Hebbian learning can operate closer to the Gardner optimal storage capacity than their binary counterparts, and can even surpass this bound. This is largely achieved through a sparsification of the retrieved patterns, which we analyze for theoretical and empirical distributions of activity. Since reaching the optimal capacity via non-local learning rules like back-propagation requires slow and neurally implausible training procedures, our results indicate that one-shot, self-organized Hebbian learning can be just as efficient.
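As a rough illustration of the setting described above (not the authors' exact model: the network size, sparsity, threshold, and the covariance form of the Hebbian rule below are illustrative assumptions), one can store sparse binary patterns with a one-shot Hebbian rule and cue a network of threshold-linear units with one of them:

```python
import numpy as np

def hebbian_retrieval(n=300, p=5, a=0.2, theta=0.3, seed=0):
    """Sketch of one-shot Hebbian storage in a threshold-linear network.

    Stores p sparse binary patterns (activity level a) with a covariance
    Hebbian rule, cues the network with pattern 0, and returns (i) the
    Pearson correlation between the resulting input field and the cued
    pattern and (ii) the fraction of units active after one
    threshold-linear update. Parameter values are illustrative, not taken
    from the paper.
    """
    rng = np.random.default_rng(seed)
    # p binary patterns, each unit active with probability a (sparse coding)
    xi = (rng.random((p, n)) < a).astype(float)
    # One-shot covariance (Hebbian) rule: J_ij ~ sum_mu (xi_i - a)(xi_j - a)
    J = (xi - a).T @ (xi - a) / (n * a * (1 - a))
    np.fill_diagonal(J, 0.0)          # no self-connections
    h = J @ xi[0]                     # input field when cued with pattern 0
    x = np.maximum(h - theta, 0.0)    # threshold-linear (ReLU) activation
    corr = np.corrcoef(h, xi[0])[0, 1]
    frac_active = np.mean(x > 0)
    return corr, frac_active
```

At low storage load the signal term of the field dominates the crosstalk, so the field correlates strongly with the cued pattern, and the positive threshold keeps the retrieved activity sparse, which is the regime the abstract's sparsification argument concerns.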

 bioRxiv Subject Collection: Neuroscience

