Dynamic Complexity Tuning for Hardware-Aware Probabilistic Circuits

Book Chapter (Conference Contribution)

Probabilistic inference is a well-suited approach to address the challenges of resource-constrained embedded application scenarios. In particular, probabilistic models learned generatively are robust to missing data and are capable of encoding domain knowledge seamlessly. These traits have been leveraged to propose hardware-aware probabilistic learning and inference strategies that induce Pareto-optimal accuracy versus resource-consumption trade-offs. This paper proposes a model-complexity tuning strategy that relies on ensembles of probabilistic classifiers to identify the difficulty of the classification task on a given instance. It then dynamically switches to a higher- or lower-complexity setting accordingly. The strategy is evaluated on an embedded human activity recognition scenario and demonstrates superior performance when compared to the Pareto-optimal trade-off obtained when the ensembles are deployed statically, especially in low-cost regions of the trade-off space. This makes the strategy amenable to embedded computing scenarios, where one of the main obstacles to always-on functionality is the device’s strict resource constraints.
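The core idea of the abstract can be illustrated with a minimal sketch. Note that this is not the paper's implementation: the `low_model`/`high_model` functions, the confidence threshold, and the use of the maximum class posterior as a difficulty proxy are all assumptions made here purely for illustration.

```python
import numpy as np

def predict_dynamic(x, low_model, high_model, confidence_threshold=0.9):
    # Hypothetical sketch of dynamic complexity tuning: evaluate the
    # cheap classifier first, and escalate to the expensive setting only
    # when the cheap prediction looks uncertain (a "hard" instance).
    probs = low_model(x)                       # class posterior, low-cost model
    if np.max(probs) >= confidence_threshold:  # confident -> easy instance
        return int(np.argmax(probs)), "low"    # stop early, save resources
    probs = high_model(x)                      # uncertain -> pay for accuracy
    return int(np.argmax(probs)), "high"

# Toy stand-ins for the two complexity settings (illustrative only):
# the low-cost model is confident for negative inputs, unsure otherwise.
low = lambda x: np.array([0.97, 0.03]) if x < 0 else np.array([0.55, 0.45])
high = lambda x: np.array([0.20, 0.80])

print(predict_dynamic(-1.0, low, high))  # easy instance, low-cost path
print(predict_dynamic(1.0, low, high))   # hard instance, escalated path
```

Deployed statically, every instance would pay the cost of one fixed setting; the dynamic policy instead spends the high-complexity budget only on the instances flagged as difficult, which is why the gains concentrate in the low-cost region of the trade-off space.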
Book: IoT Streams for Data-Driven Predictive Maintenance and IoT, Edge, and Mobile for Embedded Machine Learning
Pages: 283 - 295
Publication year: 2021