Authors: Soroush Shahi, Mahdi Pedram, Glenn Fernandes, Nabil Alshurafa
DOI:
Keywords:
Abstract: Researchers have been leveraging wearable cameras to both visually confirm and automatically detect individuals' eating habits. However, energy-intensive tasks such as continuously collecting and storing RGB images in memory, or running algorithms in real time to automate the detection of eating, greatly impact battery life. Since eating moments are spread sparsely throughout the day, battery drain can be mitigated by recording and processing data only when there is a high likelihood of eating. We present a framework comprising a golf-ball-sized wearable device with a low-power thermal sensor array and a real-time activation algorithm that activates high-energy tasks when a hand-to-mouth gesture is confirmed by the thermal sensor array. The high-energy tasks tested are turning on the RGB camera (Trigger RGB mode) and running inference on an on-device machine learning model (Trigger ML mode). Our …
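The abstract describes a triggered-sensing pattern: a low-power thermal sensor array runs continuously, and the two high-energy tasks (Trigger RGB and Trigger ML modes) are activated only when a hand-to-mouth gesture is detected. The sketch below illustrates that gating idea only; all function names, thresholds, and the toy gesture check are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of gating high-energy tasks on a low-power thermal trigger.
# Everything here (frame size, 30 C warm-pixel cutoff, 0.8 coverage threshold)
# is a hypothetical placeholder, not taken from the paper.
from enum import Enum


class Mode(Enum):
    TRIGGER_RGB = 1  # on trigger, turn on the RGB camera
    TRIGGER_ML = 2   # on trigger, run on-device ML inference


def hand_to_mouth_detected(thermal_frame, threshold=0.8):
    """Toy gesture check: fraction of 'warm' pixels in a flattened thermal frame."""
    warm = sum(1 for px in thermal_frame if px > 30.0)  # assumed degrees C
    return warm / len(thermal_frame) >= threshold


def sensing_loop(thermal_frames, mode, rgb_capture, ml_infer):
    """Run the cheap trigger on every frame; invoke the costly task only on hits."""
    events = []
    for frame in thermal_frames:
        if hand_to_mouth_detected(frame):
            if mode is Mode.TRIGGER_RGB:
                events.append(rgb_capture())
            else:
                events.append(ml_infer(frame))
    return events


# Usage with stubbed-out high-energy tasks: only the two "warm" frames trigger.
frames = [[35.0] * 64, [20.0] * 64, [35.0] * 64]
captured = sensing_loop(frames, Mode.TRIGGER_RGB, lambda: "photo", lambda f: "pred")
```

The design point is that the per-frame work is a cheap thresholding pass, so the expensive camera or model wake-up happens only for the sparse eating-related moments the abstract mentions.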