Tech Translated: Neuromorphic computing

What is neuromorphic computing? Neuromorphic computing systems aim to mimic brain functions, with the ultimate goal of matching—or even surpassing—the capabilities of the human mind. This ranges from using software to model and process information the way living organisms do, to trying to match the brain’s (as yet) unbeaten combination of low power and high performance through radical new hardware architectures, including novel components such as memristors (circuit elements whose resistance depends on the history of current that has flowed through them, letting them behave much like synapses).
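
To make that memory effect concrete, the toy Python sketch below shows how a memristor-like element’s resistance depends on the history of the current passed through it; the numbers and the simple linear update rule are illustrative assumptions, not a device-accurate model from any manufacturer.

```python
# A toy illustration (illustrative numbers, not a device-accurate model) of the
# "memory" in a memristor: its resistance depends on the history of current that
# has flowed through it, which is what lets it store a synapse-like weight.

def memristor_resistance(current_pulses, r_on=100.0, r_off=16000.0, mobility=0.01):
    """Return the resistance (ohms) after a sequence of current pulses."""
    state = 0.0  # fraction of the device in its low-resistance phase, clamped to 0..1
    for current in current_pulses:
        state = min(1.0, max(0.0, state + mobility * current))  # history shifts the state
    return r_on * state + r_off * (1.0 - state)  # blend of the two phases

print(memristor_resistance([1.0] * 10))   # repeated positive pulses lower the resistance
print(memristor_resistance([1.0] * 50))   # more history, lower still
print(memristor_resistance([-1.0] * 10))  # reverse pulses push it back toward r_off
```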

What business problems can it address?

“With the enormous potential of AI becoming more obvious by the day, one of the major concerns for the field is that flexible intelligence as we understand it maps onto binary computing very poorly, and is inefficient as a result,” explains Dina Brozzetti, a managing director in PwC US’s products and technology practice. “If AI could truly learn and evolve its understanding of the world without prior programming, just like us, but at the same low energy cost—a human brain uses only about as much power as a 20-watt light bulb to do calculations a supercomputer would struggle to perform—then it could be the most transformative watershed in computing since the switch from vacuum tubes to transistors.”

Beyond immediate applications for AI and machine learning specifically, neuromorphic computing has implications for all human–machine intellectual collaboration—particularly data analysis and research and development—as well as for the internet of things, smart cities, autonomous vehicles, human augmentation, sensory processing, industrial management and more. This wide-ranging potential is why, in 2023, PwC US listed neuromorphic computing as one of the Essential Eight emerging technologies.

How does it create value?

Today’s computers are limited by their internal configuration: a transistor connects only to its immediate neighbours, through the terminals on either side of its gate, whereas a neuron connects to thousands of other neurons simultaneously. And rather than the binary on–off switching of today’s digital systems, neurons respond to both the quantity and the timing of incoming signals, giving them vastly more capacity to process information despite far lower energy needs. “This will both massively reduce IT overhead and give us an important tool for addressing climate change through reduction of energy use,” says Scott Likens, Global AI and Innovation Technology Leader, PwC United States.
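
To make the contrast with a logic gate concrete, here is a minimal Python sketch of a leaky integrate-and-fire neuron, the simplest spiking-neuron model; the threshold, leak rate and inputs are illustrative assumptions rather than parameters from any particular neuromorphic chip.

```python
# A minimal leaky integrate-and-fire neuron: it accumulates incoming signals over
# time and fires only when the amount and timing of input push its potential past
# a threshold -- unlike a logic gate, which simply switches on or off instantly.

def leaky_integrate_and_fire(inputs, threshold=1.0, leak=0.5):
    """Return the time steps at which the neuron fires for a stream of inputs."""
    potential = 0.0
    spike_times = []
    for t, current in enumerate(inputs):
        potential = leak * potential + current  # integrate the input, with decay
        if potential >= threshold:              # enough input, close enough together?
            spike_times.append(t)
            potential = 0.0                     # reset after firing
    return spike_times

# Two strong, well-timed inputs fire the neuron; a weak steady trickle never does.
print(leaky_integrate_and_fire([0.0, 0.7, 0.7, 0.0, 0.0, 0.0]))  # [2]
print(leaky_integrate_and_fire([0.2, 0.2, 0.2, 0.2, 0.2, 0.2]))  # []
```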

There are hurdles: neuroscientists are still struggling to understand and model even simple animal brains, and practical memristors remain more laboratory demonstration than commercial reality. Even so, applying neuromorphic principles in both software and hardware has already produced impressive advances, such as UC Santa Cruz’s SpikeGPT, a spiking neural network language model that uses roughly 22 times less energy than comparable systems. Neuromorphic systems can also improve themselves through evolution-like processes, much as living creatures do, with the potential for dramatic positive feedback loops in many fields of R&D.

“If a scalable breakthrough can be made, neuromorphic processors will lead to a dizzying number of new applications that we can only begin to speculate about, and at the same time truly integrate AI into our lives,” says Likens. “Something the size of your phone will be able to run tasks that currently require a supercomputer, and enable new forms of AI that could make today’s generative AI models seem positively slow and limited in comparison.”

Who should be paying attention?

Anyone whose work involves information processing in which neurons still beat silicon could ultimately be impacted if neuromorphic computing takes off. Innovation-focused teams—in technology in general and medical technology in particular—as well as those in transportation and logistics, engineering and construction, automotive, chemicals, and pharmaceuticals and life sciences should all keep on top of developments in case of a breakthrough. The ubiquity of digital technology means leaders such as CTOs, CISOs, and COOs across industries should be aware of the potential of neuromorphic computing to radically alter computing’s capabilities.

How can businesses prepare?

“The field is still largely confined to academic and corporate R&D labs,” says Brozzetti, “so first, task a team with upskilling their knowledge, investigating possible applications in your industry, and deciding whether it makes sense to invest in internal research or to engage external partners. Connect with researchers to help strengthen these efforts and find ways to collaborate to realize the potential benefits early.” From there, it may be possible to identify potential use cases—for example, integrating neuromorphic components into an existing embedded IT product—and add them to your long-term product development road map.

Contact us

Tom Archer

Global Technology Leader, Global Transformation Co-Leader, PwC US
