DrXaos t1_ix66waj wrote
Reply to comment by [deleted] in Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation by chromoscience
The introduction of the paper is explanatory and not particularly technical.
Conventional artificial neural networks learn well only when old and new data are shuffled together during training. People don't learn that way: they can concentrate on acquiring new skills without forgetting old ones, but standard neural network algorithms fail at this (so-called catastrophic forgetting). This paper presents a model of sleep in a biologically inspired spiking neural network, in which a sleep-phase algorithm overcomes the problem.
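The failure mode is easy to reproduce even in a toy setting. A hypothetical minimal sketch (plain SGD on a single linear weight, not the paper's spiking model): train on task A, then train on conflicting task B with no task A data mixed in, and task A performance collapses.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, xs, ys, lr=0.1, epochs=200):
    # Plain per-sample SGD on squared error for the model y ≈ w * x.
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w -= lr * (w * x - y) * x
    return w

def mse(w, xs, ys):
    return float(np.mean((w * xs - ys) ** 2))

xs = rng.uniform(-1, 1, 50)
ys_a = 2.0 * xs    # task A: y = 2x
ys_b = -2.0 * xs   # task B: y = -2x, conflicts with task A

w = train(0.0, xs, ys_a)          # learn task A
err_a_before = mse(w, xs, ys_a)   # near zero: task A is learned
w = train(w, xs, ys_b)            # learn task B with no task A rehearsal
err_a_after = mse(w, xs, ys_a)    # large: task A has been overwritten

print(err_a_before, err_a_after)
```

Shuffling both tasks' data into one training set avoids this, which is the conventional fix the paper's sleep phase is meant to replace.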
[deleted] t1_ix6v9dc wrote
We also have problems with overwriting memories when sleep doesn't intervene. Breaks can help to an extent as well: when you space out, your brain compresses ideas into abstractions much as it does during sleep, but only in short, less effective bursts.
This is why they sought to replicate that mechanism in silico.