Hi everyone,
I have been using the AMB82-mini for about a week now, and I love its capabilities.
My question: I found that there is no official TFLite Micro support for the AmebaPro2 architecture, but I have been using the TFLite Micro Arduino library built for the AmebaD architecture, and it works fine. I am wondering whether there is any performance compromise in using an unsupported library this way.
Are there any other ways to run custom neural network inference on the AMB82 besides TFLite Micro?
One more question: I tried to find details on the onboard NPU of the AMB82-mini but couldn't find much information. Does the NPU support floating-point operations (FP32/FP16), or does it only handle quantized (INT8/INT16) models? And how can I make sure the board actually uses the NPU for inference, so that models run faster?
Thanks,
Ashish Bangwal