Loading models onto AMB82

Hi, I’m using the AMB82-mini for a binary classification problem. I have models quantized using TF. I had been using a Raspberry Pi, but I’m now interested in migrating to a powerful MCU like this one.
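For reference, my quantization step is the standard TFLite post-training full-integer flow, roughly like this (a minimal sketch with a toy Keras model and a random representative dataset standing in for my real model and calibration data):

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for the real binary classifier (8 input features -> sigmoid).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Representative dataset for calibration; in practice this yields
# real input samples, here it's just random data of the right shape.
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full int8 quantization of ops, inputs, and outputs.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is what I upload to the conversion platform.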

I see that the compatible format for models is .nb, a format I’m not familiar with. The system expects me to upload my model on the AI Conversion Platform (Amebapro2 AI Convert Model – Realtek IoT/Wi-Fi MCU Solutions (amebaiot.com)) and wait for an email. Is there a more transparent way to get this done? I’d be happy to do this myself programmatically.

Or is there a way to use TFLite on AMB82-mini?

Regards,
pchat

Hi @pchatty,

Currently, this is the only method for obtaining the .nb format. However, self-conversion may be implemented in the future.

Thank you


Have you tried the conversion? Mine takes so long.

Yeah, the conversion works fine for me if the model is a custom CNN or something lightweight. With a larger model (typically a transfer-learning model), it takes forever and sometimes ends up failing.

PS I have given up on AMB82 and moved to other options that allow more control like ESP32 CAM, Portenta and Nicla Vision

PC

About how long does it take to arrive in your email?

Hello

If the server is working well, typically it will reach your email within minutes.

Thank you