Lookup NU author(s): Dr Shengyu Duan, Professor Rishad Shafik, Professor Alex Yakovlev
Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
© 2025 IEEE. The Tsetlin Machine (TM) is a novel alternative to deep neural networks (DNNs). Unlike DNNs, which rely on multi-path arithmetic operations, a TM learns propositional logic patterns from data literals using Tsetlin automata. This fundamental shift from arithmetic to logic makes TMs ideal for low-cost applications. In a TM, literals are often included by both positive and negative clauses within the same class, canceling out their impact on individual class definitions. We exploit this property to develop compressed TM models, enabling energy-efficient and high-throughput inference for machine learning (ML) applications. We introduce a training approach that incorporates excluded automata states to sparsify TM logic patterns in both positive and negative clauses. This exclusion is iterative, ensuring that highly class-correlated (and therefore significant) literals are retained in the compressed inference model, ETHEREAL, to maintain strong classification accuracy. Compared to standard TMs, ETHEREAL TM models reduce model size by up to 87.54%, with only a minor accuracy compromise. We validate the impact of this compression on eight real-world tiny machine learning (TinyML) datasets against a standard TM, an equivalent Random Forest (RF), and a Binarized Neural Network (BNN) on the STM32F746G-DISCO platform. Our results show that ETHEREAL TM models achieve over an order-of-magnitude reduction in inference time (resulting in higher throughput) and energy consumption compared to BNNs, while maintaining a significantly smaller memory footprint than RFs.
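The cancellation property the abstract exploits can be illustrated with a minimal sketch of TM clause voting. This is an illustrative toy, not the authors' ETHEREAL implementation: the clause encoding, the helper names, and the example clauses are all assumptions made for clarity.

```python
# Toy sketch of Tsetlin Machine inference: a clause is a conjunction of
# included literals, and a class sum is positive-clause votes minus
# negative-clause votes. A literal included by both a positive and a
# negative clause of the same class tends to cancel out, making it a
# candidate for exclusion when compressing the model.

def clause_output(clause, literals):
    """A clause fires (1) only if every included literal matches.
    `clause` maps a feature index to its polarity: True for x_k,
    False for the negated literal NOT x_k."""
    return all(bool(literals[k]) == polarity for k, polarity in clause.items())

def class_sum(positive, negative, literals):
    """Votes from positive clauses minus votes from negative clauses;
    the predicted class is the one with the highest sum."""
    return (sum(clause_output(c, literals) for c in positive)
            - sum(clause_output(c, literals) for c in negative))

def shared_literals(positive, negative):
    """(index, polarity) pairs included by both a positive and a
    negative clause of the same class -- their votes cancel."""
    pos = {(k, p) for c in positive for k, p in c.items()}
    neg = {(k, p) for c in negative for k, p in c.items()}
    return pos & neg

# Hypothetical example: 3 input bits, two positive and two negative clauses.
positive = [{0: True, 1: True}, {2: False}]
negative = [{0: True, 2: True}, {1: False}]
x = [1, 1, 0]  # input literals

print(class_sum(positive, negative, x))     # net vote tally for this class
print(shared_literals(positive, negative))  # literal (0, True) appears on both sides
```

In this toy, the literal `x_0` is included by clauses of both polarities, so excluding it shrinks the model while leaving the class definitions largely intact, which is the intuition behind the sparsification the paper describes.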
Author(s): Duan S, Shafik R, Yakovlev A
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: 10th International Workshop on Advances in Sensors and Interfaces (IWASI 2025)
Year of Conference: 2025
Online publication date: 19/08/2025
Acceptance date: 02/04/2018
ISSN: 2836-7936
Publisher: IEEE
URL: https://doi.org/10.1109/IWASI66786.2025.11121985
DOI: 10.1109/IWASI66786.2025.11121985
ISBN: 9798331565787