Authors: Yijing Watkins, Edward Kim, Andrew Sornborger, Garrett T. Kenyon

Description: Sparse coding algorithms have been used to model the acquisition of V1 simple cell receptive fields as well as to accomplish the unsupervised acquisition of features for a variety of machine learning applications. The Locally Competitive Algorithm (LCA) provides a biologically plausible implementation of sparse coding based on lateral inhibition. LCA can be reformulated to support dictionary learning via an online local Hebbian rule that reduces predictive coding error. Although originally formulated in terms of leaky integrator rate-coded neurons, LCA based on lateral inhibition between leaky integrate-and-fire (LIF) neurons has been implemented on spiking neuromorphic processors, but such implementations preclude local online learning. We previously reported that spiking LCA can be expressed in terms of predictive coding error in a manner that allows for unsupervised dictionary learning via a local Hebbian rule, but the issue of stability has not previously been addressed. Here, we use the Nengo simulator to show that unsupervised dictionary learning in a spiking LCA model can be made stable by incorporating epochs of sinusoidally-modulated noise that we hypothesize are analogous to slow-wave sleep. In the absence of slow-wave sleep epochs, the $L_2$ norm of individual features tends to increase over time during unsupervised dictionary learning until the corresponding neurons can be activated by random Gaussian noise. By inserting epochs of sinusoidally-modulated Gaussian noise, however, the $L_2$ norms of any activated neurons are down-regulated such that individual neurons are no longer activated by noise. Our results suggest that slow-wave sleep may act, in part, to ensure that cortical neurons do not 'hallucinate' their target features in pure noise, thus helping to maintain dynamical stability.
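The mechanism described above can be sketched in a minimal rate-coded (non-spiking) form. The sketch below is an illustration only, not the authors' Nengo implementation: it runs LCA inference via lateral inhibition ($\Phi^T\Phi - I$), applies the local Hebbian rule driven by the predictive coding error $x - \Phi a$, and then presents a "sleep" epoch of sinusoidally-modulated Gaussian noise, during which the same rule tends to down-regulate the norms of any features activated by noise. All parameter values (dimensions, threshold, learning rate, noise scale) are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 64, 32

# Random unit-norm initial dictionary (columns are features)
Phi = rng.normal(size=(n_inputs, n_neurons))
Phi /= np.linalg.norm(Phi, axis=0)

tau, lam, eta = 0.1, 0.2, 0.01  # membrane time constant, threshold, learning rate

def lca_encode(x, Phi, steps=100, dt=0.01):
    """Rate-coded LCA: leaky integrators with lateral inhibition Phi^T Phi - I."""
    G = Phi.T @ Phi - np.eye(Phi.shape[1])   # lateral-inhibition weights
    b = Phi.T @ x                            # feed-forward drive
    u = np.zeros(Phi.shape[1])
    for _ in range(steps):
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold
        u += (dt / tau) * (b - u - G @ a)
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def hebbian_update(x, a, Phi):
    """Local Hebbian rule driven by the predictive coding error x - Phi a."""
    err = x - Phi @ a
    Phi += eta * np.outer(err, a)
    return Phi

# "Wake" epochs: learn from sparse mixtures of a hypothetical ground-truth dictionary
Phi_true = rng.normal(size=(n_inputs, n_neurons))
Phi_true /= np.linalg.norm(Phi_true, axis=0)
for _ in range(200):
    coeffs = rng.normal(size=n_neurons) * (rng.random(n_neurons) < 0.1)
    x = Phi_true @ coeffs
    a = lca_encode(x, Phi)
    Phi = hebbian_update(x, a, Phi)

# "Sleep" epoch: sinusoidally-modulated Gaussian noise. A neuron that fires on
# noise sees its feature shrunk, since the error term is uncorrelated with it.
for t in range(200):
    envelope = np.sin(2.0 * np.pi * t / 50.0)
    x = envelope * rng.normal(scale=0.5, size=n_inputs)
    a = lca_encode(x, Phi)
    Phi = hebbian_update(x, a, Phi)
```

In this toy setting the noise epoch plays the role the abstract ascribes to slow-wave sleep: features whose norms have grown large enough to be activated by pure noise are pushed back down by the same local learning rule, rather than by any separate normalization step.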