
Why is mobile output so different from Python? #26

Open
thelou1s opened this issue Feb 27, 2024 · 3 comments

Comments


thelou1s commented Feb 27, 2024

Hi,
I converted dymn10_as to PyTorch Mobile, but the mobile output is very different from the Python output.
I checked that both the torch version and the model file are the same on both sides. What could be the problem? Thanks :)

Expected behavior:
Same or similar outputs

Actual behavior:
Very different outputs

Steps to reproduce:
1. Convert dymn10_as to PyTorch Mobile (Android).
2. Compare the Python and mobile (Android) outputs.

import torch
# get_dymn, get_mn, and NAME_TO_WIDTH are helpers from the EfficientAT repo
# from torch.utils import mobile_optimizer
# from torch.utils.mobile_optimizer import MobileOptimizerType

if __name__ == '__main__':
    model_name = 'dymn10_as'
    model_input = torch.rand(1, 1, 128, 210)
    ptmobile_name = 'eat_' + model_name + '_ptmobile.ptl'

    if model_name.startswith('dymn'):
        model = get_dymn(width_mult=NAME_TO_WIDTH(model_name), pretrained_name=model_name, strides=[2, 2, 2, 2])
    else:
        model = get_mn(width_mult=NAME_TO_WIDTH(model_name), pretrained_name=model_name, strides=[2, 2, 2, 2])
    model.to(torch.device('cpu'))
    model.eval()
    model = torch.jit.trace(model, model_input)
    print(model.code)

    # https://github.com/pytorch/pytorch/issues/96639
    # model = mobile_optimizer.optimize_for_mobile(model,
    #                                                  {
    #                                                      MobileOptimizerType.CONV_BN_FUSION,
    #                                                      # I'm only disabling CONV_BN_FUSION
    #                                                      # MobileOptimizerType.FUSE_ADD_RELU,
    #                                                      # MobileOptimizerType.HOIST_CONV_PACKED_PARAMS,
    #                                                      # MobileOptimizerType.INSERT_FOLD_PREPACK_OPS,
    #                                                      # MobileOptimizerType.REMOVE_DROPOUT
    #                                                  })
    model._save_for_lite_interpreter(ptmobile_name)
    print('android model save success!')
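One sanity check worth running before moving to Android is to reload the exported `.ptl` in Python with PyTorch's lite-interpreter loader and compare it against the eager model. Below is a minimal sketch of that check using a small stand-in network rather than the real dymn10_as (so it is self-contained); in practice you would trace dymn10_as and reload its actual `.ptl` file:

```python
import torch
import torch.nn as nn
from torch.jit.mobile import _load_for_lite_interpreter

# small stand-in network; in practice this would be dymn10_as
model = nn.Sequential(
    nn.Conv2d(1, 4, kernel_size=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(4, 10),
)
model.eval()

x = torch.rand(1, 1, 128, 210)
traced = torch.jit.trace(model, x)
traced._save_for_lite_interpreter('check.ptl')

# reload the lite-interpreter model and compare against the eager model
lite = _load_for_lite_interpreter('check.ptl')
with torch.no_grad():
    y_eager = model(x)
    y_lite = lite(x)

# if the outputs already diverge here, the export itself is broken;
# if they match, the divergence is introduced on the Android side
print('outputs match:', torch.allclose(y_eager, y_lite, atol=1e-5))
```

If the outputs match in Python but not on the device, the model file is fine and the problem lies in preprocessing or the mobile runtime.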

[screenshot: side-by-side mel values and model outputs on PC vs. Android]


fschmid56 commented Feb 27, 2024

Hi,
is the problem specific to 'dymn10_as', or does it also hold for 'mn10_as'?

Do I understand correctly from the screenshot above that the values computed by the mel frontend are slightly different (but almost the same), while the model outputs are very different between your PC and the Android device?

The model outputs on Android are very different, but not completely random, correct?
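Since the mel values are only slightly off, one way to tell whether that small input difference could explain the large output difference is to perturb the mel input on the Python side and measure how much the output moves. A rough sketch with a hypothetical stand-in model (not the real dymn10_as):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# hypothetical stand-in classifier over a (1, 1, 128, 210) mel input,
# only to illustrate the sensitivity check
model = nn.Sequential(nn.Flatten(), nn.Linear(128 * 210, 10))
model.eval()

mel = torch.rand(1, 1, 128, 210)
# emulate the "slightly changed" mel values seen on the device
noisy_mel = mel + 1e-3 * torch.randn_like(mel)

with torch.no_grad():
    out_ref = model(mel)
    out_noisy = model(noisy_mel)

diff = (out_ref - out_noisy).abs().max().item()
same_top1 = out_ref.argmax().item() == out_noisy.argmax().item()
# if a tiny mel perturbation already flips the prediction, the mel mismatch
# matters; otherwise the divergence is inside the exported model
print(f'max logit change: {diff:.6f}, top-1 unchanged: {same_top1}')
```

Running the same perturbation experiment with the real traced model would show whether the mel discrepancy alone can account for the wrong predictions.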


thelou1s commented Feb 28, 2024

Hi,
Thank you for the quick reply and your great work :)

"is the problem specific to 'dymn10_as', or does it also hold for 'mn10_as'?"
Both dymn10_as and mn10_as.

"very different on your PC and an Android device"
Yes, very different.
You can see that Python correctly infers 'Snoring' on 'dd_01_16khz.wav', but Android infers 'Music'.
I also tested 33 other snoring wav files: Python classifies about half of them correctly, but Android gets none of them right.

[screenshot: Python vs. Android classification results on dd_01_16khz.wav]

@thelou1s thelou1s mentioned this issue Mar 4, 2024
fschmid56 commented

I see. I haven't used the models on mobile phones before, so I will not be of much help here. Maybe I'll find the time in the near future to look into it.

However, I suggest closely examining the data type used on PC and Android. Maybe there is a mismatch in terms of precision?

Have you also checked whether the model weights stay the same?
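The weight check can be done with a cheap, order-independent fingerprint computed on both sides, e.g. the sum of absolute parameter values. A sketch (`weight_fingerprint` is a hypothetical helper; on Android, computing the equivalent sum over the loaded module's tensors may require extra plumbing, since the mobile runtime does not expose parameters as conveniently):

```python
import torch
import torch.nn as nn

def weight_fingerprint(model: nn.Module) -> float:
    # sum of |w| over all parameters: cheap and order-independent, so the
    # same value should come out on PC and on the device if weights match
    with torch.no_grad():
        return sum(p.abs().sum().item() for p in model.parameters())

# demo with a stand-in model; in practice, fingerprint dymn10_as before
# export and compare against the value obtained on the device
torch.manual_seed(0)
m = nn.Linear(4, 2)
print(f'fingerprint: {weight_fingerprint(m):.6f}')
```

A mismatch in the fingerprint would point to a corrupted or altered model file; a match would shift suspicion to the input pipeline or operator implementations on Android.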
