A Walsh Hadamard Derived Linear Vector Symbolic Architecture
About
Vector Symbolic Architectures (VSAs) are one approach to developing neuro-symbolic AI, in which two vectors in $\mathbb{R}^d$ are "bound" together to produce a new vector in the same space. The binding operation is commutative and associative and admits an inverse (unbinding) operation, allowing one to construct symbolic-style manipulations over real-valued vectors. Most VSAs were developed before deep learning and automatic differentiation became popular, and instead focused on efficacy in hand-designed systems. In this work, we introduce the Hadamard-derived Linear Binding (HLB), which is designed to be computationally efficient, effective in classic VSA tasks, and performant in differentiable systems. Code is available at https://github.com/FutureComputing4AI/Hadamard-derived-Linear-Binding
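The binding properties described above can be illustrated with a minimal sketch. Note this uses a plain element-wise (Hadamard) product as the binding operator for illustration only; it is not the paper's HLB operator, whose exact definition is given in the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512

# Two random real-valued symbol vectors in R^d.
a = rng.standard_normal(d)
b = rng.standard_normal(d)

# Element-wise (Hadamard) product as a simple binding operator:
# it maps two vectors in R^d to a new vector in the same space.
bound = a * b

# Binding is commutative and associative.
c = rng.standard_normal(d)
assert np.allclose(a * b, b * a)
assert np.allclose((a * b) * c, a * (b * c))

# Unbinding: element-wise division exactly inverts this binding
# (the entries of a Gaussian vector are nonzero almost surely).
recovered = bound / a
assert np.allclose(recovered, b)
```

Practical VSA operators add further structure on top of this basic pattern, e.g. numerical stability of the inverse and well-behaved gradients, which is the design space HLB targets.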
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | MNIST | -- | -- | 395 |
| Image Classification | SVHN | Accuracy | 94.53 | 359 |
| Image Clustering | CIFAR-10 | -- | -- | 243 |
| Clustering | MNIST | ARI | 18.44 | 37 |
| Clustering | CIFAR-100 | ARI | 61 | 19 |
| Clustering | SVHN | ARI | 1.37 | 8 |
| Image Classification | CIFAR-10 (CR10) | Top-1 Acc | 83.81 | 5 |
| Image Classification | CIFAR-100 (CR100) | Top-1 Acc | 58.82 | 5 |
| Image Classification | Mini-ImageNet (MIN) | Top-1 Acc | 59.48 | 5 |
| XML Classification | BIBTEX | nDCG | 61.741 | 5 |