Inquiry regarding MobileNetV2 deployment workflow

Hi Team,

I am currently working on deploying a MobileNetV2 model to the Pegasus NPU platform. I have prepared the .pt weight file and a conversion script. I would like to verify whether my workflow is correct, especially compared with the standard procedure used for YOLOv7-tiny.

My current workflow:

  1. PyTorch to ONNX: Use the convert2onnx.py script (see below) to export mobilenetv2.onnx.

    from mobilefacenet import MobileFaceNet
    import torch
    import time
    
    if __name__ == '__main__':
        filename = 'weights/mobilefacenet.pt'
        print('loading {}...'.format(filename))
        start = time.time()
        model = MobileFaceNet()
        model.load_state_dict(torch.load(filename, map_location=torch.device('cpu')))
        model.eval()  # switch to inference mode so BatchNorm/Dropout layers export correctly
        print('elapsed {} sec'.format(time.time() - start))
        print(model)
    
        output_onnx = 'weights/MobileFaceNet.onnx'
        print("==> Exporting model to ONNX format at '{}'".format(output_onnx))
        input_names = ["input0"]
        output_names = ["output0"]
        inputs = torch.randn(1, 3, 112, 112)
    
        # torch.onnx._export is a private API; use the public torch.onnx.export instead
        torch.onnx.export(model, inputs, output_onnx, export_params=True, verbose=False,
                          input_names=input_names, output_names=output_names, opset_version=10)
    
  2. Pegasus Import: Feed the generated ONNX file into the pegasus_import.sh script.

  3. Post-processing: Proceed with quantization and model compilation.
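For clarity, here is how I currently picture steps 2 and 3 as a command sequence. The script names follow the Pegasus/Acuity demo layout shipped with my SDK, and the arguments (network name, quantization type) are my assumptions, not confirmed values — please correct me if your toolchain expects something different.

```shell
# Assumed Pegasus-style flow for steps 2-3; script names and arguments
# follow the SDK demo layout and may differ between releases.

NETWORK=mobilenetv2

# Step 2: import the ONNX model into the tool's intermediate representation
./pegasus_import.sh ${NETWORK}

# Step 3a: quantize (data type is an assumption; adjust to the target, e.g. uint8)
./pegasus_quantize.sh ${NETWORK} uint8

# Step 3b: compile/export the deployable model for the NPU
./pegasus_export_ovx.sh ${NETWORK}
```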

Questions:

  1. Does MobileFaceNet require a re-parameterization (reparam) step before conversion, as YOLOv7-tiny does? I would like to confirm this with your team.

  2. After generating the ONNX file for MobileNetV2 via convert2onnx.py, is the subsequent execution flow (importing to the conversion tools) identical to the workflow used for YOLOv7-tiny?

Hi @t114c75035,

May I know if you are using MobileNetV2 or MobileFaceNet for your deployment?

Thank you.

Hi,

For our current deployment, we are using MobileNetV2.

Thank you.