CS 81 Section 2

Week 1 Summary

1/19/12: Thomas Shultz, “A neural network primer”; Computational Developmental Psychology, Chapters 1 and 2 (1995)

Neural Networks:

  • Neural networks are function-approximating models consisting of many highly connected simple units, each loosely modeling a single neuron.
  • Each unit has:
    • Inputs
    • A weight on each input
    • An activation function F
    • An output: F(∑ input_i × weight_i)
  • Networks can learn non-linear functions if the activation function is non-linear
  • Networks are trained with back-propagation (see the sketch after this list)
  • Networks can have any topology; time-sensitive variants exist
  • It is possible to overtrain on the inputs; networks will learn the easiest-to-classify features first
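
A minimal NumPy sketch of these ideas (not from the reading; the architecture, learning rate, and epoch count are illustrative choices): a one-hidden-layer sigmoid network trained by back-propagation on XOR, a function no single linear unit can compute.

    import numpy as np

    def sigmoid(x):
        # Non-linear activation: with a linear F, the whole network
        # could only compute linear functions of its inputs.
        return 1.0 / (1.0 + np.exp(-x))

    # XOR is not linearly separable, so a hidden layer is required.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(1)
    W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)   # input -> hidden
    W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)   # hidden -> output
    lr = 1.0

    for _ in range(10000):
        # Forward pass: each unit computes F(sum_i input_i * weight_i).
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: propagate the error gradient through the layers.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

    print(out.round(2))   # should approach [[0], [1], [1], [0]]

If the sigmoid were replaced by a linear activation, the same training loop could never solve XOR, which is the point of the non-linearity bullet above.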

Psychological Development:

  • Can/do neural networks mirror the stages of development that humans go through?
    • Language acquisition stages
      • Children go through three very predictable stages in learning past-tense verbs. At first, they quickly pick up the correct past tenses of common irregular verbs (e.g. “went”). Then they learn the rule of adding “-ed” to form the past tense of a regular verb, and incorrectly overgeneralize it to irregular verbs as well (e.g. “goed” instead of “went”). Eventually they learn that irregular verbs are exceptions to the rule and produce all forms correctly. Networks trained on verbs presented at the frequencies with which they appear in English exhibit the same behavior, suggesting that these stages result from the statistics of the input, and that networks can accurately model parts of human development.
    • Development over lifetime
      • Cascade-correlation: a constructive neural-net set-up in which new hidden units are added one at a time until the problem can be solved (see the sketch at the end of this section). This models young children’s plasticity in neural connections, which ends at puberty.
        • Alternate set-ups allow for deletion of nodes, but not much work has been done on neural pruning, despite its being a vital stage in children’s development.
      • Some regions of the brain (or of a neural net) might need plasticity more than others: new memories must keep forming, while other concepts stay relatively fixed.
      • Babies use a dramatic amount of energy for brain function, but this drops steeply after a certain (young) age.
        • Why don’t we keep doing this? Probably because, evolutionarily, our energy needs to be diverted to more vital functions. (sufficiency; cf. Phil’s wish for more coordination)
        • The learning curve for most skills levels out, making the last few percentage points of ability far more costly to gain than the earlier ones, so gaining them is probably not worth it in general.
        • Similarly, neural nets tend to slow their learning as they approach high accuracy. This is largely because little accuracy remains to be gained without overfitting, but also partly because many connection weights get stuck in local minima of the error surface (cf. simulated annealing).
    • Nature vs. nurture:
      • Many neural nets are highly sensitive to their starting conditions. J. Kolen found that the set of starting conditions from which the solution will be found is fractal.
      • Evolution effectively sets the starting conditions for humans. We, too, are sensitive to small changes, in the form of genetic mutations. (“Maybe some of us were just born in the dark part of the fractal.” – Temple)
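
A highly simplified sketch of the cascade-correlation idea referenced above (the algorithm is due to Fahlman and Lebiere; this is not Shultz’s implementation, and the training rules, learning rates, and stopping threshold are illustrative). It grows a network on XOR by retraining a single output unit and, while the error stays high, recruiting one new hidden unit trained so that its activation correlates with the remaining error:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0], dtype=float)
    rng = np.random.default_rng(0)

    def train_output(F, y, epochs=3000, lr=0.5):
        # Retrain a single sigmoid output unit on the current features.
        w = rng.normal(scale=0.1, size=F.shape[1])
        for _ in range(epochs):
            out = sigmoid(F @ w)
            w -= lr * F.T @ ((out - y) * out * (1 - out))
        return sigmoid(F @ w)

    def train_candidate(F, resid, epochs=3000, lr=0.5):
        # Train a candidate hidden unit whose activation correlates
        # (positively or negatively) with the network's residual error.
        v = rng.normal(scale=0.1, size=F.shape[1])
        for _ in range(epochs):
            h = sigmoid(F @ v)
            c = np.sum((h - h.mean()) * (resid - resid.mean()))
            sign = 1.0 if c >= 0 else -1.0
            # Gradient ascent on the magnitude of that covariance.
            v += lr * F.T @ (sign * (resid - resid.mean()) * h * (1 - h))
        return sigmoid(F @ v)

    F = np.hstack([X, np.ones((4, 1))])      # inputs plus a bias column
    for n_hidden in range(5):                # grow until the error is small
        out = train_output(F, y)
        mse = np.mean((out - y) ** 2)
        print(f"hidden units: {n_hidden}, MSE: {mse:.3f}")
        if mse < 0.01:
            break
        # Recruit one hidden unit. It sees the inputs AND all earlier
        # hidden units (the "cascade"); its weights are then frozen.
        h = train_candidate(F, out - y)
        F = np.hstack([F, h[:, None]])

In the full algorithm a pool of candidate units is trained in parallel and the best one is installed; this sketch trains one candidate at a time.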