A) feedforward or feedback B) feedforward and feedback C) feedforward manner D) feedback manner
A) second layer B) hidden layer C) input layer D) output layer
A) receives inputs from all others B) gives output to all others C) may receive input from or give output to others
A) Supervised B) Supervised and Unsupervised C) Unsupervised
A) Adaptive Resonance Theory B) Artificial Resonance Theory C) Automatic Resonance Theory
A) Bipolar B) Binary C) Binary and Bipolar
A) Large Cluster B) No change C) Small cluster
A) two feedforward networks with hidden layers B) feedforward network with hidden layer C) feedforward network only
A) its ability to learn inverse mapping functions B) its ability to learn forward and inverse mapping functions C) its ability to learn forward mapping functions
A) all are connected one-to-one B) some are connected C) each input unit is connected to each output unit
A) Supervised B) Learning with critic C) Unsupervised
A) TRUE B) FALSE
A) excitatory input B) inhibitory input
A) deterministically B) stochastically C) both deterministically & stochastically
A) the greater the degradation, the higher the activation value of winning units B) the greater the degradation, the lower the activation value of winning units C) the greater the degradation, the lower the activation value of other units
A) No B) depends on type of clustering C) Yes
A) learning laws which modulate the difference between synaptic weight & activation value B) learning laws which modulate the difference between actual output & desired output C) learning laws which modulate the difference between synaptic weight & output signal
A) the overall characteristics of the mapping problem B) the number of inputs C) the number of outputs
A) the number of inputs it can deliver B) the number of patterns that can be stored C) the number of inputs it can take
A) can be slow or fast in general B) Slow process C) Fast process