Demanding Too Much Purity

I don’t know a thing about neuromorphic processors or the problems encountered in trying to do computer vision with them, yet this caught my eye:

[Image: The Intel Loihi neuromorphic processor.]

[Yijing Watkins of Los Alamos National Laboratory] and her colleagues experimented with programming neuromorphic processors to learn to reconstruct images and video based on sparse data, a bit like how the human brain learns from its environment during childhood development. “However, all of our attempts to learn eventually became unstable,” said study senior author Garrett Kenyon, also a computer scientist at Los Alamos.

The scientists ran computer simulations of a spiking neural network to find out what happened. They found that although it could learn to identify the data it was trained to look for, when such training went uninterrupted long enough, its neurons began to continuously fire no matter what signals they received.

Watkins recalled that “almost in desperation,” they tried having the simulation essentially undergo deep sleep. They exposed it to cycles of oscillating noise, roughly corresponding to the slow brain waves seen in deep sleep, which restored the simulation to stability. The researchers suggest this simulation of slow-wave sleep may help “prevent neurons from hallucinating the features they’re looking for in random noise,” Watkins said. [Inside Science]

Introducing noise, it seems to me – conceptually – reduces the purity requirements of the processors. That is, it permits a certain amount of fuzziness, or abstraction, in identifying some object as a member of this or that category.

The interesting part of the article is the notion that any sentient creature will need something roughly equivalent to deep sleep in order to maintain a usable cognitive apparatus.
