
Can the Human Brain Outsmart AI?

A study by Prof. Ido Kanter, of the BIU Department of Physics and Gonda Multidisciplinary Brain Research Center, found that the human brain performs complex tasks as effectively as AI


A study by Prof. Ido Kanter, of the Bar-Ilan Department of Physics and Gonda Multidisciplinary Brain Research Center, found that the human brain's wide, shallow architecture can perform tasks as effectively as the deep learning architectures on which AI is based.

Neural network learning techniques stem from the dynamics of the brain. However, brain learning and AI-based deep learning are intrinsically different. One of the most prominent differences is the number of layers each of them utilizes.


Recent research has contrasted the learning mechanisms of the human brain with those of deep learning in AI. The human brain's shallow architecture, consisting of only a few layers, can efficiently perform complex tasks that in deep learning AI architectures require a multitude of layers, sometimes extending to hundreds. The study suggests that, despite its fewer layers and slower, noisier dynamics, the brain owes its efficiency at complex classification tasks to this wide, shallow architecture.
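The "wide and shallow" versus "deep and narrow" contrast can be made concrete by comparing layer counts at a similar parameter budget. The following is a purely illustrative sketch (the widths, depths, and fully connected layer shapes are assumptions for illustration, not taken from the study):

```python
# Illustrative sketch (not from the study): a deep, narrow fully connected
# architecture versus a shallow, wide one with a comparable weight budget.
# A fully connected layer mapping m inputs to n outputs has m * n weights.

def param_count(widths):
    """Total weight count of a fully connected net given its layer widths."""
    return sum(a * b for a, b in zip(widths, widths[1:]))

# "Skyscraper": many hidden layers, each narrow.
deep = [100] + [64] * 20 + [10]      # 20 hidden layers of width 64

# "Wide building": one very wide hidden layer.
shallow = [100, 800, 10]             # 1 hidden layer of width 800

print(len(deep) - 2, param_count(deep))        # hidden depth, weight count
print(len(shallow) - 2, param_count(shallow))  # far shallower, similar budget
```

Both toy networks hold roughly 85,000 to 90,000 weights, yet one is twenty layers deep and the other only one layer deep, which is the structural trade-off the quote below describes.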

The findings challenge current AI models and suggest that advanced GPU technology, currently designed around deep learning architectures, would need to shift in order to better mimic the brain's structure and learning methods.

“Instead of a deep architecture, like a skyscraper, the brain consists of a wide shallow architecture, more like a very wide building with only a few floors,” said Prof. Kanter, who led the research.

The research explores the viability of shallow learning as an alternative to deep learning in machine learning architectures. The study demonstrates that increasing the number of filters per layer in a generalized shallow architecture results in a power-law decay of error rates toward zero.

The observed phenomenon suggests that in the shallow architecture, the average noise per filter decreases as the number of filters per layer increases. These findings indicate potential benefits and efficiency in shallow learning compared to deeper architectures.
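A power-law decay of this kind, error(N) = c · N^(−ρ) for N filters per layer, appears as a straight line on log-log axes, so the exponent can be read off as a slope. The constants `c = 0.5` and `rho = 0.7` below are arbitrary placeholders, not values reported by the study:

```python
import math

# Illustrative sketch with an assumed power-law form (not the study's data):
# error(N) = c * N**(-rho), where N is the number of filters per layer.

def error_rate(n_filters, c=0.5, rho=0.7):
    """Hypothetical error rate as a function of filters per layer."""
    return c * n_filters ** (-rho)

# On log-log axes the decay is linear, so the slope between any two
# filter counts recovers the exponent rho.
n1, n2 = 16, 256
slope = (math.log(error_rate(n2)) - math.log(error_rate(n1))) / (
    math.log(n2) - math.log(n1)
)
print(round(-slope, 3))  # recovers the assumed exponent rho = 0.7
```

This is also how such an exponent would typically be estimated in practice: train at several filter counts, plot error against N on log-log axes, and fit the slope.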