Auto-associative and heteroassociative memory

Introduction: Associative Memory

  • Pattern association maps an input pattern to a stored pattern. It is a “simplified” model of human memory.
  •  Types of associative memory:
  1. Heteroassociative memory
  2. Autoassociative memory
  3. Hopfield Net
  4. Bidirectional Associative Memory (BAM)
  • These are usually single-layer networks.
  • The neural network is first trained to store a set of patterns of the form s : t, where s is the input vector and t the corresponding output vector.
  • The network’s “memory” is then tested by presenting patterns containing incorrect or missing information and checking whether the stored patterns are recalled (see the sketch after this list).
  • Associative memory can be feedforward or recurrent.
  • Autoassociative memory cannot hold an unlimited number of patterns.
  • Its capacity depends on the complexity of each pattern and the similarity between the input patterns.
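As a minimal illustration of this store-and-test protocol, the sketch below (Python with NumPy, assuming bipolar ±1 patterns; the variable names are illustrative) stores one pattern with the Hebb rule and then recalls it from a probe with missing components:

```python
import numpy as np

# Store one bipolar pattern (here s = t, the autoassociative case).
s = np.array([1, 1, -1, -1])
W = np.outer(s, s)              # Hebbian storage: W = s s^T

# Test with "missing information": two components zeroed out.
probe = np.array([1, 0, -1, 0])
recalled = np.sign(probe @ W)   # threshold the net input
print(recalled)                 # [ 1  1 -1 -1] -- the stored pattern
```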

Autoassociative Memory

[Figure: autoassociative memory architecture (single-layer network)]
  • The input and output vectors s and t are the same.
  • The Hebb rule is used as the learning algorithm; equivalently, the weight matrix can be calculated directly by summing the outer products of the input-output pairs.
  • The autoassociative application algorithm is used to test the trained network (see the sketch below).
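A minimal sketch of an autoassociative Hebbian memory, again assuming bipolar patterns and NumPy; the function names `train` and `recall` are illustrative, not a standard API:

```python
import numpy as np

def train(patterns):
    """Weight matrix as the sum of the outer products s s^T (Hebb rule)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for s in patterns:
        W += np.outer(s, s)
    return W

def recall(W, x):
    """One pass through the net: threshold the net input x W."""
    return np.where(x @ W >= 0, 1, -1)

# Two orthogonal bipolar patterns.
patterns = np.array([[1, 1, -1, -1, 1, 1],
                     [1, -1, 1, -1, 1, -1]])
W = train(patterns)

noisy = np.array([-1, 1, -1, -1, 1, 1])  # first pattern, one bit flipped
print(recall(W, noisy))                  # [ 1  1 -1 -1  1  1]
```

Because the stored patterns here are orthogonal, their cross-talk terms cancel in the net input, which is why the flipped bit is corrected.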



Heteroassociative Memory

[Figure: heteroassociative memory architecture (single-layer network)]
  • The input and output vectors s and t are different.
  • The Hebb rule is used as the learning algorithm; equivalently, the weight matrix can be calculated directly by summing the outer products of the input-output pairs.
  • The heteroassociative application algorithm is used to test the trained network (see the sketch below).
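A minimal sketch of the heteroassociative case, under the same assumptions (bipolar patterns, NumPy); here the weight matrix is n × m because s and t have different lengths:

```python
import numpy as np

# Input-output pairs s : t with different lengths (n = 4, m = 2).
pairs = [(np.array([1, -1, 1, -1]), np.array([1, 1])),
         (np.array([1, 1, -1, -1]), np.array([-1, 1]))]

W = np.zeros((4, 2))
for s, t in pairs:
    W += np.outer(s, t)             # Hebb rule: w_ij += s_i * t_j

x = np.array([1, -1, 1, -1])        # present a stored input
print(np.where(x @ W >= 0, 1, -1))  # [1 1] -- the associated output
```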
The Hebb Algorithm
  • Initialize the weights to zero: wij = 0, where i = 1, …, n and j = 1, …, m.
  • For each training pair s : t, repeat:
    • Set the input activations: xi = si, where i = 1, …, n.
    • Set the output activations: yj = tj, where j = 1, …, m.
    • Adjust the weights: wij(new) = wij(old) + xi yj, where i = 1, …, n and j = 1, …, m.
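A sketch of the algorithm written out element by element, mirroring the pseudocode above; NumPy would vectorize these loops, but the explicit form keeps the update wij(new) = wij(old) + xi yj visible:

```python
def hebb_train(training_pairs, n, m):
    # Initialize weights to zero: w_ij = 0.
    w = [[0] * m for _ in range(n)]
    for s, t in training_pairs:
        x = list(s)                        # x_i = s_i, i = 1..n
        y = list(t)                        # y_j = t_j, j = 1..m
        for i in range(n):
            for j in range(m):
                w[i][j] += x[i] * y[j]     # w_ij(new) = w_ij(old) + x_i y_j
    return w

pairs = [([1, -1, 1], [1, -1]),
         ([-1, 1, -1], [-1, 1])]
print(hebb_train(pairs, n=3, m=2))         # [[2, -2], [-2, 2], [2, -2]]
```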
