A few days ago I ran into Carin Meier's post titled Vector Symbolic Architectures in Clojure. Besides this point, which I can really relate to,
[…] I don’t have a lot of free time. When I do get a few precious hours to do some coding just for me, I want it to be small enough for me to fire up and play with it in a REPL on my local laptop and get a result back in under two minutes. […]
Carin points to Pentti Kanerva’s Hyperdimensional Computing introductory paper as a good place to start.
The 1990s saw the emergence of cognitive models that depend on very high dimensionality and randomness. They include Holographic Reduced Representations, Spatter Code, Semantic Vectors, Latent Semantic Analysis, Context-Dependent Thinning, and Vector Symbolic Architecture. They represent things in high dimensional vectors that are manipulated by operations that produce new high-dimensional vectors in the style of traditional computing, in what is called here hyperdimensional computing on account of the very high dimensionality. The paper presents the main ideas behind these models, written as a tutorial essay in hopes of making the ideas accessible and even provocative. A sketch of how we have arrived at these models, with references and pointers to further reading, is given at the end. The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.
After reading a few more papers on the topic, I find this approach pretty intriguing and worth some thought experiments.
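To make the abstract above a little more concrete, here is a minimal sketch (in Python/NumPy rather than Clojure) of the basic operations Kanerva describes for binary hypervectors: random high-dimensional vectors as symbols, XOR for binding, elementwise majority for bundling, and Hamming distance for similarity. The variable names and the toy record are of course invented for illustration, not taken from Carin's code or the paper.

```python
import numpy as np

DIM = 10_000  # hypervectors are typically on the order of 10,000 bits
rng = np.random.default_rng(42)

def random_hv():
    """Random dense binary hypervector; any two such vectors are nearly orthogonal."""
    return rng.integers(0, 2, size=DIM, dtype=np.uint8)

def bind(a, b):
    """Binding via elementwise XOR: dissimilar to both inputs, and its own inverse."""
    return np.bitwise_xor(a, b)

def bundle(*vs):
    """Bundling via elementwise majority vote (odd count): similar to every input."""
    return (np.sum(np.stack(vs), axis=0) > len(vs) / 2).astype(np.uint8)

def similarity(a, b):
    """1 - normalized Hamming distance; ~0.5 means unrelated, ~1.0 means near-identical."""
    return 1.0 - np.count_nonzero(a != b) / DIM

# Encode a toy record {name: alice, age: thirty, city: paris} as one hypervector.
name, alice = random_hv(), random_hv()
age, thirty = random_hv(), random_hv()
city, paris = random_hv(), random_hv()
record = bundle(bind(name, alice), bind(age, thirty), bind(city, paris))

# Unbinding with the 'name' key recovers something recognizably close to 'alice'.
probe = bind(record, name)
print(similarity(probe, alice))  # noticeably above 0.5 (~0.75)
print(similarity(probe, paris))  # around 0.5, i.e. noise
```

The point of the sketch is just how little machinery is needed: with enough dimensions, randomly drawn vectors are almost surely nearly orthogonal, so the bound-and-bundled record can be queried by unbinding and comparing against a small codebook of known vectors.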
Carin’s video explains some of the work she introduced in the blog post above.