This repository has been archived by the owner on Sep 25, 2023. It is now read-only.
Applying CoreML conversion on other style transfer Torch models #8
Comments
This might help: these are the model's layers, and the main problem is with the evaluate() method.
The model is loaded perfectly fine, but when I try to evaluate it, I get this error.
What is the main reason the evaluation raises this NoneType error? For reference, this is the model I am trying to evaluate.
To use this repo and models with custom layers from it, you need to implement the corresponding layers in PyTorch (as legacy.nn.Module subclasses) and replace them in the parsed model.
@opedge, do you have an example of that?
There is an example of implementing a custom InstanceNormalization layer using PyTorch.
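To illustrate the advice above, here is a minimal sketch of what such a replacement layer could look like. This is not the repository's actual implementation; the class name, parameter names, and the assumption of NCHW input are all illustrative. It normalizes each channel of each sample over its spatial dimensions, with learnable per-channel scale and shift:

```python
import torch
import torch.nn as nn


class InstanceNormalization(nn.Module):
    """Hypothetical sketch of an instance-normalization layer
    that could stand in for a custom Torch layer after parsing."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        # Learnable per-channel affine parameters
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def forward(self, x):
        # x has shape (N, C, H, W): normalize over the spatial dims
        mean = x.mean(dim=(2, 3), keepdim=True)
        var = x.var(dim=(2, 3), keepdim=True, unbiased=False)
        out = (x - mean) / torch.sqrt(var + self.eps)
        return out * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)
```

Once such a module exists, the idea is to locate the unsupported custom layer in the parsed model and substitute an instance of this class before conversion.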
I have tried the CoreML conversion on another repository.
Here is the link to the repository: link
They provide a pre-trained model: model.t7
Running perpare_model.lua on this model throws an error.
After adding some changes to make it load the model, I get this error:
Running convert-fast-neural-style.py, I get this error: