
ML model doesn't load on background thread #611

Open
bhavin250495 opened this issue Jan 31, 2020 · 6 comments
Assignees
Labels
bug Unexpected behaviour that should be corrected (type)

Comments

@bhavin250495

🐞Describe the bug

  • Inference runs fine while the app is in the foreground, but when the app goes to the background and tries to run inference, it either crashes or takes a very long time in MLModel.init()

Trace

2020-01-31 12:17:10.135653+0530 ********** [3505:875445] [coreml] Error in adding network -1.
2020-01-31 12:17:10.139437+0530 ********** [3505:875445] [coreml] MLModelAsset: load failed with error Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}
2020-01-31 12:17:10.140146+0530 ********** [3505:875445] [coreml] MLModelAsset: modelWithError: load failed with error Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}
Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.CoreML Code=0 "Error in declaring network." UserInfo={NSLocalizedDescription=Error in declaring network.}: file ML.swift, line 105

To Reproduce

  • Just load the model while the app is in the background

Default generated file used to run the ML model:

/// Construct a model that automatically loads the model from the app's bundle
convenience init() {
    try! self.init(contentsOf: model.urlOfModelInThisBundle)
}

demoModel.mlmodel.zip

System environment (please complete the following information):

  • coremltools version (e.g., 3.0b5):

  • OS: macOS

  • macOS version (if applicable): 14

  • Xcode version (if applicable): 11.3.1

  • How you install python (anaconda, virtualenv, system): system

  • python version (e.g. 3.7): 3.7

  • any other relevant information:

    • keras version: 2.3
@bhavin250495 bhavin250495 added the bug Unexpected behaviour that should be corrected (type) label Jan 31, 2020
@anilkatti

@bhavin250495 could you share the code you are using to run the prediction in the background? One thing to keep in mind: in background mode, Core ML can run only on the CPU. You can enforce this by setting MLModelConfiguration.computeUnits.
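A minimal sketch of that suggestion (the SmartML class name is taken from the reporter's later comment; everything else here is illustrative, not the reporter's actual code):

```swift
import CoreML

// Sketch: load the generated model class restricted to the CPU,
// which is the only compute unit available during background execution.
// `SmartML` is the reporter's generated model class; substitute your own.
func loadCPUOnlyModel() throws -> SmartML {
    let config = MLModelConfiguration()
    config.computeUnits = .cpuOnly   // skip GPU / Neural Engine
    return try SmartML(configuration: config)
}
```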

@bhavin250495
Author

bhavin250495 commented Feb 5, 2020

@anilkatti Thanks for replying. I tried adding MLPredictionOptions, but I don't have any problem
with prediction; the issue is with loading the ML model from a local URL.

  1. I trigger a silent notification from the server to start inference in the app
  2. When the app receives the silent notification, it does some preprocessing and creates the input for the ML model
  3. When I try to run prediction while the app is in the background, it either crashes or takes forever to load the model
func predict(input: MLInput) throws -> Bool {
    print("Loading model")
    // Model doesn't load here when the app is in the background
    let model = SmartML()
    let options = MLPredictionOptions()
    options.usesCPUOnly = true
    let outputLabel = try model.prediction(input: input, options: options).classLabel
    return outputLabel == "yes"
}

@anilkatti

My suggestion was to use the SmartML(configuration:) initializer to load the model, but I did not realize you were running this on macOS. Based on the error, it seems the app is having trouble reading the model files from disk in background mode. Could you share a repro case so I can debug the root cause? The model itself looks good.

@bhavin250495
Author

bhavin250495 commented Feb 6, 2020

I am using this model on iOS, and if I share a repo to reproduce the error you will need to set up silent notifications. Could you run the inference from a local timer in the background to reproduce the error?
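A hypothetical repro along those lines, assuming a UIKit app (the delay value, the background task, and the helper name are illustrative assumptions, not the reporter's code):

```swift
import UIKit
import CoreML

// Hypothetical repro sketch: keep the process alive with a background
// task, then attempt the model load after a delay, simulating the
// silent-notification path without any push-notification setup.
func scheduleBackgroundLoadRepro() {
    var taskID: UIBackgroundTaskIdentifier = .invalid
    taskID = UIApplication.shared.beginBackgroundTask {
        UIApplication.shared.endBackgroundTask(taskID)
    }
    // Move the app to the background within 20 seconds of calling this.
    DispatchQueue.global(qos: .background).asyncAfter(deadline: .now() + 20) {
        defer { UIApplication.shared.endBackgroundTask(taskID) }
        do {
            let config = MLModelConfiguration()
            config.computeUnits = .cpuOnly
            _ = try SmartML(configuration: config)  // the load under test
            print("model loaded")
        } catch {
            print("load failed: \(error)")
        }
    }
}
```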

@hfnvbh

hfnvbh commented Sep 14, 2020

Hi all,
Are there any updates on this issue?
I hit the same problem: when the Core ML model is loaded in background mode after a silent notification in my iOS app, I get the same message, "Error in declaring network."

When I initialized the model in the foreground, or in BGProcessing mode, it worked fine.
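For comparison, loading from a BGProcessingTask (the mode reported above as working) might be sketched as follows; the task identifier is made up for illustration and would also have to be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist:

```swift
import BackgroundTasks
import CoreML

// Sketch: perform the model load inside a BGProcessingTask, the mode
// reported to work. "com.example.model-load" is a hypothetical identifier.
func registerAndScheduleModelLoad() {
    BGTaskScheduler.shared.register(forTaskWithIdentifier: "com.example.model-load",
                                    using: nil) { task in
        do {
            let config = MLModelConfiguration()
            config.computeUnits = .cpuOnly
            _ = try SmartML(configuration: config)  // load succeeds in this mode
            task.setTaskCompleted(success: true)
        } catch {
            task.setTaskCompleted(success: false)
        }
    }
    let request = BGProcessingTaskRequest(identifier: "com.example.model-load")
    request.requiresNetworkConnectivity = false
    try? BGTaskScheduler.shared.submit(request)
}
```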

@anilkatti

Silent notifications are not meant to keep your app awake in the background beyond quick refresh operations, nor are they meant for high-priority updates.

I am looking for clear guidance on this scenario. I will update this thread.

Birch-san pushed a commit to Birch-san/coremltools that referenced this issue Nov 27, 2022
Signed-off-by: Ryan Russell <[email protected]>
