Learning on tree architectures shown to outperform a convolutional feedforward network

Scheme of a simple neural network based on a dendritic tree (left) vs. a complex artificial intelligence deep learning architecture (right). Credit: Prof. Ido Kanter, Bar-Ilan University

Traditionally, artificial intelligence stems from human brain dynamics. However, brain learning is restricted in several significant ways compared with deep learning (DL). First, efficient DL wiring structures (architectures) consist of many tens of feedforward (consecutive) layers, whereas brain dynamics consist of only a few feedforward layers. Second, DL architectures typically include many consecutive filter layers, which are essential for identifying one of the input classes.

If the input is a car, for example, the first filter identifies wheels, the second identifies doors, the third identifies lights, and after many additional filters it becomes clear that the input object is, indeed, a car. Conversely, brain dynamics contain just a single filter, located close to the retina. The last necessary component is the mathematically complex DL training procedure, which is evidently far beyond biological realization.
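To make that contrast concrete, below is a minimal, hypothetical PyTorch sketch of a conventional feedforward CNN in which many consecutive convolutional filter layers are stacked before the classifier. The depth, channel widths and input size are illustrative assumptions on our part, not figures taken from the paper.

```python
import torch
import torch.nn as nn

class DeepConvNet(nn.Module):
    """Illustrative deep feedforward CNN: many consecutive filter (conv) layers.

    Layer counts and widths are arbitrary choices for illustration only.
    """
    def __init__(self, num_classes: int = 10, depth: int = 8):
        super().__init__()
        layers, in_ch = [], 3
        for i in range(depth):               # many consecutive filter layers
            out_ch = 32 * (1 + i // 2)
            layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                       nn.ReLU()]
            if i % 2 == 1:
                layers.append(nn.MaxPool2d(2))  # shrink spatial resolution
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                        nn.Flatten(),
                                        nn.Linear(in_ch, num_classes))

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: a small batch of 32x32 RGB images (CIFAR-10-sized input)
x = torch.randn(4, 3, 32, 32)
print(DeepConvNet()(x).shape)  # torch.Size([4, 10])
```

Each convolutional layer in the stack is one "filter" stage of the kind described above; the brain, by contrast, is thought to apply such filtering only once, close to the retina.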

Can the brain, with its limited realization of precise mathematical operations, compete with advanced artificial intelligence systems implemented on fast and parallel computers? From our daily experience, we know that for many tasks the answer is yes. Why is this and, given this affirmative answer, can one build a new type of efficient artificial intelligence inspired by the brain? In an article published today in Scientific Reports, researchers from Bar-Ilan University in Israel solve this puzzle.

“We’ve shown that efficient learning on an artificial tree architecture, where each weight has a single route to an output unit, can achieve better classification success rates than previously achieved by DL architectures consisting of more layers and filters. This finding paves the way for efficient, biologically inspired new AI hardware and algorithms,” said Prof. Ido Kanter, of Bar-Ilan’s Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center, who led the research.
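As a rough illustration of that idea, the following is a minimal, hypothetical PyTorch sketch of a tree-structured classifier: the input is partitioned into disjoint slices, each slice feeds its own small branch, and each output unit sums only its own group of branches, so every weight lies on exactly one route to a single output unit. The sizes, the branch grouping, and the summation readout are our own illustrative assumptions, not the specific architecture trained in the Scientific Reports paper.

```python
import torch
import torch.nn as nn

class Branch(nn.Module):
    """One dendrite-like branch: a small sub-network that sees only its own
    disjoint slice of the input and emits a single scalar."""
    def __init__(self, in_features: int, hidden: int = 8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_features, hidden),
                                 nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x)

class TreeNet(nn.Module):
    """Illustrative tree architecture: the input is split into disjoint slices,
    each slice feeds one branch, and each output unit sums only its own group
    of branches, so every weight lies on exactly one route to a single output
    unit. All sizes here are arbitrary illustrative choices."""
    def __init__(self, in_features: int = 1000, num_classes: int = 10,
                 branches_per_class: int = 5):
        super().__init__()
        num_branches = num_classes * branches_per_class
        assert in_features % num_branches == 0
        self.slice = in_features // num_branches
        self.num_classes = num_classes
        self.bpc = branches_per_class
        self.branches = nn.ModuleList(
            Branch(self.slice) for _ in range(num_branches))

    def forward(self, x):
        x = x.flatten(1)
        chunks = x.split(self.slice, dim=1)          # disjoint input slices
        outs = torch.cat([b(c) for b, c in zip(self.branches, chunks)],
                         dim=1)                      # (batch, num_branches)
        # Each class score is the sum of its own branches only.
        return outs.view(-1, self.num_classes, self.bpc).sum(dim=2)

# Example forward pass on a batch of 4 random 1000-dimensional inputs
x = torch.randn(4, 1000)
print(TreeNet()(x).shape)  # torch.Size([4, 10])
```

Because no weight is shared between routes in this sketch, a gradient arriving at a given output unit only touches the branches assigned to that unit, which is the structural property the quote highlights.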

“Highly pruned tree architectures represent a step toward a plausible biological realization of efficient dendritic tree learning by a single neuron or several neurons, with reduced complexity and energy consumption, and a biological realization of the backpropagation mechanism, which is currently the central technique in AI,” added Yuval Meir, a Ph.D. student and contributor to this work.

Credit: Bar-Ilan University

Efficient dendritic tree learning is based on previous research by Kanter and his experimental research team, conducted by Dr. Roni Vardi, indicating evidence of sub-dendritic adaptation in neuronal cultures, together with other anisotropic properties of neurons, such as different spike waveforms, refractory periods and maximal transmission rates.

Efficient implementation of highly pruned tree training requires a new type of hardware, different from emerging GPUs, which are better suited to the current DL strategy. New hardware will be required to efficiently imitate brain dynamics.

More information:
Yuval Meir et al, Learning on tree architectures outperforms a convolutional feedforward network, Scientific Reports (2023). DOI: 10.1038/s41598-023-27986-6

Provided by
Bar-Ilan University


Citation:
Learning on tree architectures shown to outperform a convolutional feedforward network (2023, January 30)
retrieved 30 January 2023
from https://techxplore.com/news/2023-01-tree-architectures-shown-outperform-convolutional.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
