
Research Article
Scalable Approximate Computing Techniques for Latency and Bandwidth Constrained IoT Edge
@INPROCEEDINGS{10.1007/978-3-030-76063-2_20,
  author={Anjus George and Arun Ravindran},
  title={Scalable Approximate Computing Techniques for Latency and Bandwidth Constrained IoT Edge},
  booktitle={Science and Technologies for Smart Cities. 6th EAI International Conference, SmartCity360°, Virtual Event, December 2-4, 2020, Proceedings},
  series={SMARTCITY},
  publisher={Springer},
  year={2021},
  month={5},
  keywords={Edge computing; IoT; Approximate computing; Machine learning; Machine vision},
  doi={10.1007/978-3-030-76063-2_20}
}
- Anjus George
- Arun Ravindran
Year: 2021
Scalable Approximate Computing Techniques for Latency and Bandwidth Constrained IoT Edge
SMARTCITY
Springer
DOI: 10.1007/978-3-030-76063-2_20
Abstract
Machine vision applications at the IoT Edge face bandwidth and latency constraints due to the large size of video data. In this paper we propose approximate computing, which trades off inference accuracy against video frame size, as a potential solution. We present a number of low compute overhead video frame modifications that reduce video frame size while achieving acceptable levels of inference accuracy. We present a heuristic-based design space pruning algorithm and a Categorical Boosting (CatBoost) based machine learning model as two approaches to achieve scalable performance in determining the appropriate video frame modifications that satisfy design constraints. Experimental results on an object detection application on the Microsoft COCO 2017 data set indicate that the proposed methods were able to reduce video frame size by up to 71.3% while achieving an inference accuracy of 80.9% of that of the unmodified video frames. The machine learning model has a high training cost but a lower inference time, and is more scalable and flexible than the heuristic design space pruning algorithm.