Hi, I’m using the AMB82-mini for a binary classification problem. I have models quantized with TensorFlow. I had been using a Raspberry Pi, but I'm now interested in migrating to a powerful MCU like this one.
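For context, this is roughly how I'm producing the quantized models on my side (a minimal sketch; the model file name, input shape, and calibration data are placeholders, not my actual setup):

```python
import numpy as np
import tensorflow as tf

# Placeholder: load the trained binary classifier (name/architecture are illustrative)
model = tf.keras.models.load_model("binary_classifier.h5")

def representative_dataset():
    # Yield a handful of calibration samples shaped like the model input
    # (96x96x3 here is just an example input size)
    for _ in range(100):
        yield [np.random.rand(1, 96, 96, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization so all ops run as int8
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("binary_classifier_int8.tflite", "wb") as f:
    f.write(tflite_model)
```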
I see that the compatible model format is .nb, which I'm not familiar with. The system expects me to upload my model to the AI Conversion Platform (Amebapro2 AI Convert Model – Realtek IoT/Wi-Fi MCU Solutions (amebaiot.com)) and wait for an email. Is there a more transparent way to get this done? I'd be happy to do the conversion myself programmatically.
Yeah, the conversion works fine for me if the model is a custom CNN or something similarly light. If I use a larger model (typically a transfer-learning model), it takes forever and sometimes fails outright.
PS: I have since given up on the AMB82 and moved to other options that allow more control, like the ESP32 CAM, Portenta, and Nicla Vision.