We introduce a biologically inspired, multi-layer neural architecture built from Rectified Spectral Units (ReSUs). Each ReSU projects a recent window of its input history onto a canonical direction learned by canonical correlation analysis (CCA) of previously observed past-future input pairs, and then half-wave rectifies either the positive or the negative component. Because synaptic weights are obtained via past-future CCA on the presynaptic activity, ReSU networks offer a potentially local, self-supervised algorithm for the progressive construction of increasingly complex features. To assess both computational power and biological fidelity, we trained a two-layer ReSU network in a self-supervised regime on translating natural scenes. First-layer units, each driven by a single pixel, developed temporal filters matching those of \textit{Drosophila} post-photoreceptor neurons (L1/L2 and L3), including their empirically measured adaptation to the signal-to-noise ratio. Second-layer units, pooling spatially over the first layer, became direction-selective, reminiscent of T4 motion-detecting cells, with learned synaptic weights approximating known patterns in the \textit{Drosophila} connectome. These results demonstrate that ReSU networks may provide (i) a principled framework for modeling sensory circuits and (ii) a backpropagation-free, self-supervised paradigm for constructing deep artificial neural networks.
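To make the two ingredients named above concrete, here is a minimal NumPy sketch: learning a unit's weight vector by CCA between past and future windows of a scalar input stream, and the rectified projection that defines a ReSU. The function names (past_future_cca, resu), the shared past/future window length, and the ridge regularization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def _inv_sqrt(C):
    """Symmetric inverse square root of a positive-definite matrix."""
    vals, vecs = np.linalg.eigh(C)
    return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

def past_future_cca(x, window, reg=1e-6):
    """Top canonical direction for the past, from CCA between past and
    future windows of a scalar signal x (illustrative windowing)."""
    T = len(x)
    idx = np.arange(window - 1, T - window)
    past = np.stack([x[t - window + 1 : t + 1] for t in idx])
    future = np.stack([x[t + 1 : t + window + 1] for t in idx])
    past = past - past.mean(axis=0)
    future = future - future.mean(axis=0)
    n = len(idx)
    Cpp = past.T @ past / n + reg * np.eye(window)
    Cff = future.T @ future / n + reg * np.eye(window)
    Cpf = past.T @ future / n
    # Whitened cross-covariance; its top left singular vector is the
    # leading canonical direction in the whitened past space.
    K = _inv_sqrt(Cpp) @ Cpf @ _inv_sqrt(Cff)
    U, _, _ = np.linalg.svd(K)
    return _inv_sqrt(Cpp) @ U[:, 0]  # map back to raw past coordinates

def resu(history, w, sign=+1.0):
    """Rectified Spectral Unit: project the recent input window onto the
    canonical direction w, then keep the positive (or negative) part."""
    return max(sign * float(w @ history), 0.0)

# Toy usage: smoothed noise stands in for one pixel's intensity over time.
rng = np.random.default_rng(0)
x = np.convolve(rng.standard_normal(5000), np.ones(10) / 10, mode="same")
w = past_future_cca(x, window=20)
y_on = resu(x[-20:], w, sign=+1.0)   # positively rectified ReSU response
y_off = resu(x[-20:], w, sign=-1.0)  # negatively rectified ReSU response
print(y_on, y_off)
```

Because the weights depend only on the statistics of the presynaptic activity, each layer can be trained this way in sequence, which is what makes the construction local and backpropagation-free.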