A) feedback manner B) feedforward and feedback C) feedforward or feedback D) feedforward manner
A) hidden layer B) output layer C) input layer D) second layer
A) may receive input from or give output to others B) gives output to all others C) receives inputs from all others
A) Supervised B) Unsupervised C) Supervised and Unsupervised
A) Automatic Resonance Theory B) Adaptive Resonance Theory C) Artificial Resonance Theory
A) Binary and Bipolar B) Bipolar C) Binary
A) Large cluster B) Small cluster C) No change
A) feedforward network only B) two feedforward networks with hidden layers C) feedforward network with hidden layer
A) its ability to learn forward mapping functions B) its ability to learn inverse mapping functions C) its ability to learn forward and inverse mapping functions
A) all are connected one-to-one B) each input unit is connected to each output unit C) some are connected
A) Unsupervised B) Learning with critic C) Supervised
A) FALSE B) TRUE
A) inhibitory input B) excitatory input
A) both deterministically & stochastically B) deterministically C) stochastically
A) the greater the degradation, the lower the activation value of other units B) the greater the degradation, the higher the activation value of winning units C) the greater the degradation, the lower the activation value of winning units
A) Yes B) No C) depends on type of clustering
A) learning laws which modulate difference between synaptic weight & output signal B) learning laws which modulate difference between actual output & desired output C) learning laws which modulate difference between synaptic weight & activation value
A) the overall characteristics of the mapping problem B) the number of inputs C) the number of outputs
A) the number of inputs it can take B) the number of inputs it can deliver C) the number of patterns that can be stored
A) Fast process B) can be slow or fast in general C) Slow process