Loading models onto AMB82

Hi, I’m using the AMB82-mini for a binary classification problem. I have models quantized with TensorFlow. I had been using a Raspberry Pi, but I’m now interested in migrating to a powerful MCU like this one.
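For context, my quantization step looks roughly like this (a minimal sketch with a stand-in model and random calibration data, not my actual network):

```python
import numpy as np
import tensorflow as tf

# Hypothetical tiny binary classifier standing in for my real model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

def representative_dataset():
    # Placeholder calibration data; in practice I feed real samples here.
    for _ in range(10):
        yield [np.random.rand(1, 96, 96, 1).astype(np.float32)]

# Full-integer post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
print(len(tflite_model) > 0)
```

So what I have in hand is an int8 .tflite file, and my question is how to get from that to .nb.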

I see that the compatible model format is .nb, which I’m not familiar with. The system expects me to upload my model to the AI Conversion Platform (Amebapro2 AI Convert Model – Realtek IoT/Wi-Fi MCU Solutions, amebaiot.com) and wait for an email. Is there a more transparent way to get this done? I’d be happy to do it myself programmatically.

Or is there a way to use TFLite directly on the AMB82-mini?

Regards,
pchat

Hi @pchatty,

Currently this is the only method for obtaining the .nb format. However, self-conversion may be implemented in the future.

Thank you