A Brain-Inspired Chip Can Run AI With Far Less Energy

Artificial intelligence algorithms cannot keep growing at their current pace. Algorithms like deep neural networks, which are loosely inspired by the brain, with multiple layers of artificial neurons linked to each other via numerical values called weights, get bigger every year. But these days, hardware improvements are no longer keeping pace with the enormous amount of memory and processing capacity required to run these massive algorithms. Soon, the size of AI algorithms may hit a wall.
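For readers who want a concrete picture of the structure described above, here is a minimal sketch (not from the article) of a deep neural network: stacked layers of artificial neurons, with each layer connected to the next by a matrix of numerical weights. The layer sizes and random values are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
layer_sizes = [4, 8, 8, 2]

# One weight matrix per pair of adjacent layers; these numbers are what
# training adjusts, and what the hardware must store and multiply.
weights = [rng.normal(size=(n_in, n_out))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Pass an input vector through every layer in turn."""
    activation = x
    for w in weights:
        # Each neuron sums its weighted inputs, then applies a nonlinearity.
        activation = np.maximum(0.0, activation @ w)  # ReLU
    return activation

print(forward(rng.normal(size=4)))
```

As networks grow, the number and size of these weight matrices balloon, which is exactly the memory and processing burden the article points to.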
And even if we could keep scaling up hardware to meet the demands of AI, there's another problem: running these algorithms on traditional computers wastes an enormous amount of energy. The carbon emissions generated by running large AI algorithms are already harmful to the environment, and the problem will only get worse as the algorithms grow ever more gigantic.
One solution, called neuromorphic computing, takes inspiration from biological brains to create energy-efficient designs. Unfortunately, while these chips can outpace digital computers in conserving energy, they've lacked the computational power needed to run a sizable deep neural network. That's made them easy for AI researchers to overlook.