Complexity bounded classification of fish-eye distorted objects with micro-controllers / Pau, Danilo Pietro; Carra, Alessandro; Garzola, Marco; Falaschetti, Laura; Turchetti, Claudio. - ELECTRONIC. - (2022), pp. 746-751. (Paper presented at the 2022 IEEE 21st Mediterranean Electrotechnical Conference (MELECON), held in Palermo, Italy, 14-16 June 2022) [10.1109/MELECON53508.2022.9842897].

Complexity bounded classification of fish-eye distorted objects with micro-controllers

Falaschetti, Laura; Turchetti, Claudio
2022-01-01

Abstract

Since 2019, tiny machine learning has established itself as an innovative technology trend deployed at the edge and has become pervasive in many IoT applications. One interesting application, addressed by this work, relates to the welfare of laboratory animals, which can be preserved by acquiring and classifying image data to monitor the presence of objects in their cages. For example, mouse and rat activities such as drinking and eating can be indicators of their state of health. In this context, μBottleNet has been developed to classify the presence or absence of the water bottle, and μFoodNet to classify the level of food in the feeder. These neural networks (NNs) achieved 99.8% accuracy. To the best of our knowledge, no datasets were available to support this study, and therefore we built three image datasets to train the neural networks on both the Bottle and the Food objects. From image capture to inference execution, the tasks have been carried out by STM32L4 (an ultra-low-power device consuming 120 μA/MHz) and STM32H7 microcontroller units (MCUs). Both NNs have been designed to fit into constrained MCU resources. Special attention has been paid to on-chip memory occupation, achieving a RAM footprint of 39.44 KBytes. To evaluate and field-test the performance of these models (and to compare them against MobileNetV2), a graphical user interface (GUI) has been developed, capable of rendering the validation and test results of inference runs on the MCUs. K-fold cross-validation has been run, and the performance of the two NNs was compared to MobileNetV2, confirming that μBottleNet and μFoodNet reached and exceeded the accuracy achieved by a more complex NN architecture.
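The abstract describes the workflow only at a high level. As a rough, hypothetical sketch of what it outlines (a compact CNN kept small enough for tens of KBytes of MCU RAM, scored with K-fold cross-validation against a baseline), the Python snippet below may be useful. The layer sizes, the 48x48 grayscale input, and K = 5 are illustrative assumptions, not the published μBottleNet/μFoodNet architectures.

import numpy as np
import tensorflow as tf
from sklearn.model_selection import StratifiedKFold

def build_tiny_classifier(input_shape=(48, 48, 1), num_classes=2):
    """Small CNN whose strided convolutions keep peak activation
    memory low, so the converted model can fit in tens of KBytes
    of MCU RAM. Layer widths here are assumptions, not the paper's."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(8, 3, strides=2, activation="relu",
                               input_shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

def kfold_accuracy(images, labels, k=5, epochs=20):
    """Mean K-fold validation accuracy, mirroring the abstract's
    K-fold comparison of the small NNs against MobileNetV2."""
    scores = []
    folds = StratifiedKFold(n_splits=k, shuffle=True, random_state=0)
    for train_idx, val_idx in folds.split(images, labels):
        model = build_tiny_classifier(images.shape[1:],
                                      len(np.unique(labels)))
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(images[train_idx], labels[train_idx],
                  epochs=epochs, verbose=0)
        _, acc = model.evaluate(images[val_idx], labels[val_idx],
                                verbose=0)
        scores.append(acc)
    return float(np.mean(scores))

In practice, a trained Keras model of this kind would then be converted to C for the STM32L4/STM32H7 targets with a tool such as STM32Cube.AI, which reports the flash and RAM footprint of the generated network so it can be checked against the MCU's constraints.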
Year: 2022
ISBN: 978-1-6654-4280-0
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11566/305003
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 1
  • Web of Science (ISI): 0