noun. a linked network of input and output nodes that serves as a basic model of an associative neural network. In its simplest form a perceptron consists of input nodes connected directly to a single output node, while multilayer perceptrons add hidden layers between input and output. The connections between nodes carry weights, and adjusting those weights shapes the favored output. The aim is to develop a theoretical understanding of how neural connections process signals and form associations. Back-propagation is the most common algorithm by which the weights between input and output are adjusted.
PERCEPTRON: "The functions of perceptrons will be on the test."
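The weighted-links idea in the definition can be sketched in a few lines of code. This is a minimal illustration, not a definitive implementation: it assumes a single output node with a step activation and uses the classic perceptron learning rule (simpler than full back-propagation) to shift the weights toward the favored output. All names here (`predict`, `train`, the AND dataset) are chosen for this sketch.

```python
def predict(weights, bias, inputs):
    # Weighted sum of the inputs plus a bias, thresholded to 0 or 1.
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > 0 else 0

def train(samples, epochs=20, lr=0.1):
    # samples: list of (inputs, target) pairs; weights start at zero.
    weights = [0.0] * len(samples[0][0])
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - predict(weights, bias, inputs)
            # Shift each weight in the direction that reduces the error.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Learn logical AND, a linearly separable function a simple perceptron can represent.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x) for x, _ in data])  # matches the targets [0, 0, 0, 1]
```

A single-layer perceptron like this can only learn linearly separable functions; that limitation is what motivates the hidden layers and back-propagation mentioned in the definition.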