Hyperdimensional computing: a framework for stochastic computation and symbolic AI

Bibliographic Details
Main Authors: Mike Heddes, Igor Nunes, Tony Givargis, Alexandru Nicolau, Alex Veidenbaum
Format: Article
Language: English
Published: SpringerOpen 2024-10-01
Series: Journal of Big Data
Subjects:
Online Access: https://doi.org/10.1186/s40537-024-01010-8
author Mike Heddes
Igor Nunes
Tony Givargis
Alexandru Nicolau
Alex Veidenbaum
collection DOAJ
description Abstract Hyperdimensional Computing (HDC), also known as Vector Symbolic Architectures (VSA), is a neuro-inspired computing framework that exploits high-dimensional random vector spaces. HDC uses highly parallelizable arithmetic to provide computational solutions that balance accuracy, efficiency, and robustness. The majority of current HDC research focuses on the learning capabilities of these high-dimensional spaces. However, a tangential research direction investigates the properties of these high-dimensional spaces more generally as a probabilistic model for computation. In this manuscript, we provide an approachable yet thorough survey of the components of HDC. To highlight the dual use of HDC, we provide an in-depth analysis of two vastly different applications. The first uses HDC in a learning setting to classify graphs. Graphs are among the most important forms of information representation, and graph learning in IoT and sensor networks introduces challenges because of their limited compute capabilities. Compared to state-of-the-art Graph Neural Networks, our proposed method achieves comparable accuracy, while training and inference times are on average 14.6× and 2.0× faster, respectively. Secondly, we analyse a dynamic hash table that uses a novel hypervector type called circular hypervectors to map requests to a dynamic set of resources. The proposed hyperdimensional hashing method is efficient enough to be deployed in large systems. Moreover, our approach remains unaffected by a realistic level of memory errors, which causes significant mismatches for existing methods.
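The HDC arithmetic the abstract refers to can be illustrated with a minimal sketch. The dimensionality, bipolar encoding, and similarity thresholds below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality; random vectors are near-orthogonal at this scale

def random_hv():
    # random bipolar hypervector with entries in {-1, +1}
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    # binding: elementwise multiply; the result is dissimilar to both inputs
    return a * b

def bundle(*hvs):
    # bundling: elementwise majority vote; the result stays similar to each input
    return np.sign(np.sum(hvs, axis=0))

def sim(a, b):
    # normalized dot product (cosine similarity for bipolar vectors)
    return float(a @ b) / D

x, y, z = random_hv(), random_hv(), random_hv()
s = bundle(x, y, z)

assert sim(s, x) > 0.3        # a bundle is recognizably similar to its members
assert abs(sim(x, y)) < 0.1   # unrelated hypervectors are nearly orthogonal
assert sim(bind(x, y), x) < 0.1  # binding hides its operands
```

Binding and bundling are the core operations the survey builds on: binding associates items (e.g. a key with a value), while bundling superimposes a set into a single vector that can still be queried by similarity.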
format Article
id doaj-art-2b8a150e59d340cb9b3269564d1f8ab9
institution OA Journals
issn 2196-1115
language English
publishDate 2024-10-01
publisher SpringerOpen
record_format Article
series Journal of Big Data
affiliation Department of Computer Science, University of California, Irvine (all authors)
title Hyperdimensional computing: a framework for stochastic computation and symbolic AI
topic Hyperdimensional computing
Vector symbolic architectures
Basis hypervectors
Graph classification
Dynamic hash table
url https://doi.org/10.1186/s40537-024-01010-8