Works incorrectly on Samsung A21s #25

Open
emreakcan opened this issue Dec 2, 2021 · 9 comments

Comments

@emreakcan

The app works correctly on all devices I have, but somehow on the Samsung A21s it produces incorrect results.
It gives nearly the same output for all faces, with probabilities between 0.2 and 0.3; the same happens for both the L2 and cosine metrics.

I thought the camera might be the problem, so I embedded the pictures as drawable resources and matched against them, but it still produces nearly the same results for everything.

I noticed that the FloatArray returned from getCroppedFaceEmbedding differs between phones.

My Xiaomi Mi 9 works fine.

Do you have any ideas?

@emreakcan
Author

If I run the L2 norm on the FloatArrays that I get from the Xiaomi, the Samsung does the calculation correctly.
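
For illustration, a minimal sketch (not the project's code; l2Distance is a hypothetical helper) of the check described above, computing the L2 distance between two embedding FloatArrays:

import kotlin.math.sqrt

// Hypothetical helper: L2 (Euclidean) distance between two embeddings.
fun l2Distance( a: FloatArray , b: FloatArray ): Float {
    require( a.size == b.size ) { "Embeddings must have the same length" }
    var sum = 0f
    for ( i in a.indices ) {
        val d = a[ i ] - b[ i ]
        sum += d * d
    }
    return sqrt( sum )
}

If the same pair of FloatArrays gives the same distance on both phones, the divergence is in the interpreter's output embeddings rather than in the distance calculation.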

@shubham0204
Owner

This is because of different device configurations. Switching off the GpuDelegate and XNNPACK will probably help.
In the FaceNetModel.kt class, you'll see these lines:

init {
    // Initialize TFLiteInterpreter
    val interpreterOptions = Interpreter.Options().apply {
        // Add the GPU Delegate if supported.
        // See -> https://www.tensorflow.org/lite/performance/gpu#android
        if ( CompatibilityList().isDelegateSupportedOnThisDevice ) {
            addDelegate( GpuDelegate( CompatibilityList().bestOptionsForThisDevice ))
        }
        else {
            // Number of threads for computation
            setNumThreads( 4 )
        }
        setUseXNNPACK( true )
    }
    interpreter = Interpreter(FileUtil.loadMappedFile(context, model.assetsFilename ) , interpreterOptions )
    Logger.log("Using ${model.name} model.")
}

Replace these lines with,

init {
    // Initialize TFLiteInterpreter
    val interpreterOptions = Interpreter.Options().apply {
        setNumThreads( 4 )
    }
    interpreter = Interpreter(FileUtil.loadMappedFile(context, model.assetsFilename ) , interpreterOptions )
    Logger.log("Using ${model.name} model.")
}
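
If the TensorFlow Lite version in use enables XNNPACK by default, it may be worth switching it off explicitly rather than only dropping the setUseXNNPACK( true ) call. A hedged variant of the same init block, assuming Interpreter.Options exposes setUseXNNPACK (it does in recent TFLite releases):

init {
    // Initialize the TFLite interpreter on CPU only.
    val interpreterOptions = Interpreter.Options().apply {
        // Number of threads for computation
        setNumThreads( 4 )
        // Explicitly disable XNNPACK in case it is enabled by default
        // in the TFLite version being used.
        setUseXNNPACK( false )
    }
    interpreter = Interpreter( FileUtil.loadMappedFile( context , model.assetsFilename ) , interpreterOptions )
    Logger.log( "Using ${model.name} model." )
}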

@shubham0204
Owner

@emreakcan Were you able to resolve the error by removing the GpuDelegate?

@DineshIT
Contributor

Hi @shubham0204,

I'm also facing the same issue with face recognition. Initially it generated multiple results for the same face, but now, after removing the GpuDelegate and XNNPACK, it gives the same result, unknown, for all faces.

@shubham0204
Owner

@DineshIT Can you send me some more details about the Samsung A21s device on which you're testing the app? I specifically need:

  1. Android OS Version
  2. GPU Renderer
  3. Supported ABIs
  4. CPU architecture

You can get these details by installing the Device Info app on the device.
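
Alternatively, most of these fields can be read in code with the standard android.os.Build APIs; a minimal sketch follows (the GPU renderer is the exception, since it requires querying GLES20.glGetString(GLES20.GL_RENDERER) from a thread with an active OpenGL context):

import android.os.Build
import android.util.Log

// Log basic device details using the standard Build APIs.
fun logDeviceDetails() {
    Log.d( "DeviceInfo" , "Android OS version: ${Build.VERSION.RELEASE} (SDK ${Build.VERSION.SDK_INT})" )
    Log.d( "DeviceInfo" , "Supported ABIs: ${Build.SUPPORTED_ABIS.joinToString()}" )
    Log.d( "DeviceInfo" , "Device: ${Build.MANUFACTURER} ${Build.MODEL} (${Build.HARDWARE})" )
}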

@DineshIT
Contributor

DineshIT commented Apr 25, 2022 via email

@DineshIT
Contributor

DineshIT commented May 6, 2022

Hi @shubham0204

I have worked on this issue and fixed it by setting additional properties on the Interpreter options object.

Please check my commit in the repo and merge it to handle this issue.

Hi @emreakcan

You can make the mentioned changes in your FaceNetModel class and verify whether the issue is fixed at your end. If it works, please share the results with us.

@shubham0204
Owner

@DineshIT Can you open a PR in this repo, so that I can review the changes?

@DineshIT
Contributor

DineshIT commented May 9, 2022

The PR has been created.
