
Artificial Neuronal Network Learns to Use Human Language

By Science 2.0

November 16, 2015



Researchers from the University of Sassari and the University of Plymouth have developed a cognitive model called the Artificial Neural Network with Adaptive Behavior Exploited for Language Learning (ANNABELL). The model, which comprises 2 million interconnected artificial neurons, is able to learn to communicate using human language.

The model can learn human language starting from a blank state, solely through communication with a human interlocutor.

ANNABELL is a cognitive architecture consisting entirely of interconnected artificial neurons, with no pre-coded language knowledge. Instead, it relies on two mechanisms: synaptic plasticity and neural gating. Synaptic plasticity is the ability of the connection between two neurons to increase in efficiency when the two neurons are often active simultaneously, or nearly simultaneously. Neural gating mechanisms exploit the ability of bistable neurons to behave as switches that can be turned "on" or "off" by a control signal coming from other neurons.
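To make these two mechanisms concrete, the following is a minimal Python sketch of a Hebbian-style plasticity rule and a bistable gate; the function names, learning rate, and toy values are illustrative assumptions, not ANNABELL's actual implementation.

```python
# Illustrative sketch of the two mechanisms described above; all names
# and values are assumptions, not ANNABELL's actual code.

def hebbian_update(w, pre, post, lr=0.01):
    """Synaptic plasticity: strengthen the connection w when the
    pre- and post-synaptic neurons are active at (nearly) the same time."""
    return w + lr * pre * post

def gated_signal(signal, gate_state):
    """Neural gating: a bistable neuron acts as a switch, passing the
    signal when 'on' (state 1) and blocking it when 'off' (state 0)."""
    return signal * gate_state

# Repeated co-activation increases synaptic efficiency...
w = 0.1
for _ in range(10):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))  # 0.2: the connection has strengthened

# ...while a control signal from another neuron toggles the gate.
print(gated_signal(0.8, gate_state=1))  # 0.8 passes through
print(gated_signal(0.8, gate_state=0))  # 0.0 blocked
```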

The cognitive model has been validated using a database of about 1,500 input sentences, based on literature on early language development, and has responded by generating a total of about 500 output sentences containing nouns, verbs, adjectives, pronouns, and other word classes.

The research demonstrates that the model can perform a wide range of human-language processing tasks.

From Science 2.0

 

Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA
