Complex-Valued Neural Networks with Multi-Valued Neurons by Igor Aizenberg (auth.)

Complex-valued neural networks achieve higher performance, learn faster, and generalize better than their real-valued counterparts.

This book is devoted to the Multi-Valued Neuron (MVN) and MVN-based neural networks. It contains a comprehensive presentation of MVN theory, its learning, and its applications. The MVN is a complex-valued neuron whose inputs and output are located on the unit circle. Its activation function is a function only of the argument (phase) of the weighted sum. MVN learning is derivative-free and based on the error-correction rule. A single MVN can learn input/output mappings that are non-linearly separable in the real domain; such classical non-linearly separable problems as XOR and Parity n are the simplest that can be learned by a single MVN. Another important advantage of the MVN is its proper treatment of phase information.
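
To make the activation concrete, here is a minimal sketch of the discrete k-valued MVN activation just described: the output depends only on which of k equal sectors of the unit circle the argument (phase) of the weighted sum falls into. This is an illustration, not code from the book; the function name and the choice k = 4 are assumptions.

import numpy as np

def mvn_activation(z: complex, k: int = 4) -> complex:
    """Discrete MVN activation: maps the weighted sum z to one of the k
    k-th roots of unity, using only the argument (phase) of z."""
    phase = np.angle(z) % (2 * np.pi)                # arg(z) in [0, 2*pi)
    sector = int(np.floor(k * phase / (2 * np.pi)))  # sector index 0..k-1
    return np.exp(2j * np.pi * sector / k)           # output lies on the unit circle

# The output depends only on the phase of z, not on its magnitude:
print(mvn_activation(1 + 1j))    # phase pi/4   -> sector 0 -> (1+0j)
print(mvn_activation(-5 + 5j))   # phase 3*pi/4 -> sector 1 -> (0+1j)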

These properties of the MVN become even more remarkable when this neuron is used as a basic building block in neural networks. The Multilayer Neural Network based on Multi-Valued Neurons (MLMVN) is an MVN-based feedforward neural network. Its backpropagation learning algorithm is derivative-free and based on the error-correction rule, and it does not suffer from the local minima phenomenon. MLMVN outperforms many other machine learning techniques in terms of learning speed, network complexity, and generalization capability when solving both benchmark and real-world classification and prediction problems. Another interesting application of the MVN is its use as a basic neuron in multi-state associative memories.
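
As a sketch of the error-correction learning mentioned above (assuming the update rule commonly reported for the MVN, W ← W + α/(n+1) · (D − Y) · X̄, with X̄ the complex-conjugated input vector; the helper name and default learning rate are illustrative):

import numpy as np

def mvn_error_correction_step(w, x, desired, actual, alpha=1.0):
    """One derivative-free error-correction update for an MVN.

    w       : complex weight vector, w[0] being the bias weight
    x       : complex inputs on the unit circle (without the bias input)
    desired : desired output D on the unit circle
    actual  : actual output Y on the unit circle
    """
    n = len(x)
    delta = desired - actual                              # error D - Y
    x_full = np.concatenate(([1.0 + 0j], np.asarray(x)))  # constant bias input
    return w + (alpha / (n + 1)) * delta * np.conj(x_full)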

The book is addressed to readers who develop the theoretical fundamentals of neural networks and to those who apply neural networks to various real-world problems. It should also be very suitable for Ph.D. and graduate students pursuing degrees in computational intelligence.

Best AI & machine learning books

Towards Sustainable and Scalable Educational Innovations Informed by the Learning Sciences: Sharing Good Practices of Research, Experimentation and Innovation

Learning sciences researchers prefer to study learning in authentic contexts. They collect both qualitative and quantitative data from multiple perspectives and follow developmental, micro-genetic, or historical approaches to data observation. Learning sciences researchers conduct research with the aim of deriving design principles through which change and innovation can be enacted.

How Did We Find Out About the Origin of Life?

Describes scientists' attempts to determine how life began, covering such topics as spontaneous generation and evolution.

Practical Speech User Interface Design

Although speech is the most natural form of communication between humans, most people find using speech to communicate with machines anything but natural. Drawing from psychology, human-computer interaction, linguistics, and communication theory, Practical Speech User Interface Design provides a comprehensive yet concise survey of practical speech user interface (SUI) design.

Neural Network Design

This book, by the authors of the Neural Network Toolbox for MATLAB, provides clear and detailed coverage of fundamental neural network architectures and learning rules. In it, the authors emphasize a coherent presentation of the principal neural networks, methods for training them, and their applications to practical problems.

Extra info for Complex-Valued Neural Networks with Multi-Valued Neurons

Sample text

Furthermore, let $z_{kj}$, $y_{kj}$, and $Y_{kj} = y_{kj}(z_{kj})$ represent the weighted sum (of the input signals), the activation function value, and the output value of the $k$th neuron at the $j$th layer, respectively. Then

$$\frac{\partial E(W)}{\partial w_i^{km}} = \frac{\partial E(W)}{\partial y_{km}} \cdot \frac{\partial y_{km}}{\partial z_{km}} \cdot \frac{\partial z_{km}}{\partial w_i^{km}}, \qquad i = 0, 1, \ldots, N_{m-1},$$

where

$$\frac{\partial E(W)}{\partial y_{km}} = \frac{\partial}{\partial y_{km}} \left( \frac{1}{2} \sum_k (\delta_{km}^*)^2 \right) = \frac{1}{2} \sum_k \frac{\partial}{\partial y_{km}} (\delta_{km}^*)^2 = \frac{1}{2} \frac{\partial}{\partial y_{km}} (\delta_{km}^*)^2 = \delta_{km}^* \frac{\partial \delta_{km}^*}{\partial y_{km}} = \delta_{km}^* \frac{\partial}{\partial y_{km}} \left( \frac{1}{N} \sum_{s=1}^{N} (D_{kms} - Y_{kms}) \right) = -\delta_{km}^*;$$

$$\frac{\partial y_{km}}{\partial z_{km}} = y_{km}'(z_{km}),$$

and

$$\frac{\partial z_{km}}{\partial w_i^{km}} = \frac{\partial}{\partial w_i^{km}} ( w_0^{km} + w_1^{km} Y_{1,m-1} + \ldots$$
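
To make the reconstructed chain rule above concrete, the following sketch (not from the book; all names are illustrative) checks the analytic gradient ∂E/∂w_i = −δ* · y′(z) · x_i for a single real-valued sigmoid neuron against a finite-difference estimate:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single neuron: z = w . x, y = sigmoid(z), E = 0.5 * (D - y)^2
x = np.array([1.0, -1.0, 0.5])   # x[0] = 1 plays the role of the bias input
w = np.array([0.2, -0.3, 0.7])
D = 1.0                          # desired output

z = w @ x
y = sigmoid(z)
delta = D - y                    # the error delta* of the derivation

# Chain rule: dE/dw_i = (dE/dy)(dy/dz)(dz/dw_i) = -delta * y'(z) * x_i
grad_analytic = -delta * y * (1.0 - y) * x

# Central finite differences of E(w) as an independent check
eps = 1e-6
grad_numeric = np.zeros_like(w)
for i in range(len(w)):
    wp = w.copy(); wp[i] += eps
    wm = w.copy(); wm[i] -= eps
    E_plus = 0.5 * (D - sigmoid(wp @ x)) ** 2
    E_minus = 0.5 * (D - sigmoid(wm @ x)) ** 2
    grad_numeric[i] = (E_plus - E_minus) / (2 * eps)

print(np.allclose(grad_analytic, grad_numeric, atol=1e-8))  # True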

3) Inputs (−1, 1). The weighted sum after the correction is equal to z = −3 + 1 ⋅ (−1) + 1 ⋅ 1 = −3; φ(z) = sgn(z) = sgn(−3) = −1. Since f(−1, 1) = −1, no further correction of the weights is needed. 4) Inputs (−1, −1). The weighted sum is equal to z = −3 + 1 ⋅ (−1) + 1 ⋅ (−1) = −5; φ(z) = sgn(z) = sgn(−5) = −1. Since f(−1, −1) = −1, no further correction of the weights is needed. Iteration 2. 1) Inputs (1, 1). The weighted sum is equal to z = −3 + 1 ⋅ 1 + 1 ⋅ 1 = −1; φ(z) = sgn(z) = sgn(−1) = −1.
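
The following sketch replays this kind of iteration for a threshold neuron with the weights (−3, 1, 1) from the excerpt: it computes the weighted sum for each input pair, applies φ(z) = sgn(z), and corrects the weights only on a mismatch. The target function f is only partly visible in the excerpt, so an AND-like truth table in the {−1, 1} alphabet is assumed for illustration, as is the step size of the correction.

import numpy as np

w = np.array([-3.0, 1.0, 1.0])   # (w0, w1, w2); w0 is the bias weight

# Assumed target: f(1, 1) = 1, otherwise -1
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

for iteration in range(1, 5):
    errors = 0
    for (x1, x2), target in samples:
        z = w[0] + w[1] * x1 + w[2] * x2    # weighted sum
        out = 1 if z >= 0 else -1           # phi(z) = sgn(z)
        print(f"iteration {iteration}, inputs ({x1}, {x2}): z = {z:+g}, sgn(z) = {out}")
        if out != target:                   # error-correction rule
            w += (target - out) / 2.0 * np.array([1, x1, x2])
            errors += 1
    if errors == 0:                         # stop once every sample is correct
        break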

The more hidden neurons there are in the network, the higher the level of heuristics in the backpropagation algorithm. Indeed, the desired outputs of the hidden neurons, and therefore their exact errors, are never known. The hidden-layer errors can be calculated only on the basis of the backpropagation learning algorithm, which rests on the heuristic assumption that the error of each neuron depends on the errors of those neurons to which this neuron is connected. Increasing the total number of weights in the network also complicates the optimization problem of minimizing the error functional.
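
A minimal sketch of the heuristic this paragraph describes, assuming classical real-valued backpropagation with sigmoid neurons (not the MLMVN rule discussed elsewhere in the book; all names are illustrative): each hidden neuron's error is assembled from the errors of the neurons it feeds, weighted by the connecting weights.

import numpy as np

def hidden_layer_errors(W_next, delta_next, y_hidden):
    """Heuristic hidden-layer errors in classical backpropagation.

    W_next     : (n_next, n_hidden) weights from the hidden layer to the next
    delta_next : (n_next,) errors of the next layer's neurons
    y_hidden   : (n_hidden,) hidden outputs (sigmoid derivative y(1-y) assumed)
    """
    return (W_next.T @ delta_next) * y_hidden * (1.0 - y_hidden)

# Tiny example: two hidden neurons feeding one output neuron
W_next = np.array([[0.5, -0.8]])
delta_next = np.array([0.1])
y_hidden = np.array([0.6, 0.3])
print(hidden_layer_errors(W_next, delta_next, y_hidden))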
