Memory is one of the biggest challenges in deep neural networks (DNNs) today. Researchers are struggling with the limited memory bandwidth of the DRAM devices that today's systems must use to store the huge numbers of weights and activations in DNNs. DRAM capacity appears to be a limitation too. But these challenges are not quite as they seem.

Why Do Deep Neural Networks Need So Much Memory?

Memory in neural networks is required to store input data, weight parameters and activations as an input propagates through the network. In training, activations from a forward pass must be retained until they can be used to calculate the error gradients in the backward pass. As an example, the 50-layer ResNet network has ~26 million weight parameters and computes ~16 million activations in the forward pass.

Why do we need such large attached memory storage with CPU- and GPU-powered deep learning systems when our brains appear to work well without it?

Computer architectures have developed with processor chips specialised for serial processing and DRAMs optimised for high-density memory. The interface between these two devices is a major bottleneck that introduces latency and bandwidth limitations and adds a considerable overhead in power consumption.

Although we do not yet have a complete understanding of human brains and how they work, it is generally understood that there is no large, separate memory store. The long- and short-term memory function in human brains is thought to be embedded in the neuron/synapse structure. Even a simple organism such as the C. elegans worm, with a neural structure of just over 300 neurons, has some basic memory functions of this sort.

Building memory into conventional processors is one way of getting around the memory bottleneck problem by opening up huge memory bandwidth at much lower power consumption. However, on-chip memory is expensive in silicon area, and it would not be possible to add the large amounts of memory currently attached to the CPU and GPU processors used to train and deploy DNNs.
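The scale of the problem is easy to see with a back-of-envelope calculation using the approximate ResNet-50 figures quoted above. This sketch assumes 32-bit floating-point storage and a hypothetical batch size of 32; real training runs also hold gradients and optimizer state, so actual footprints are larger.

```python
# Rough memory estimate for ResNet-50 training (assumption: fp32 storage).
BYTES_PER_FLOAT32 = 4

weights = 26_000_000                 # ~26M weight parameters (from the text)
activations_per_sample = 16_000_000  # ~16M activations per forward pass
batch_size = 32                      # assumed batch size for illustration

weight_mem = weights * BYTES_PER_FLOAT32

# Activations must be retained for the backward pass, one copy per sample,
# so activation memory scales with batch size while weight memory does not.
activation_mem = activations_per_sample * BYTES_PER_FLOAT32 * batch_size

print(f"weights:     {weight_mem / 1e6:.0f} MB")   # ~104 MB
print(f"activations: {activation_mem / 1e9:.1f} GB")  # ~2.0 GB at batch 32
```

Even at this modest batch size, retained activations dominate the weights by an order of magnitude, which is why activation storage, not just the parameters, drives DRAM demand during training.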