
End-to-end solutions #5

Open
marcorossi5 opened this issue Nov 18, 2020 · 0 comments

Comments

marcorossi5 commented Nov 18, 2020

Inference mode in LArSoft

Input

ROOT file from dunetpc, containing raw::RawDigit and sim::SimChannel products.

Output

NN output containing the roi and dn results, stored as LArSoft products inside ROOT files.

Interface

A C++ interface is required to export the PyTorch model into a format suitable for LArSoft, like this one: WaveformRoiFinder_module.cc

Possible solutions:

  • ONNX: save the model trained in PyTorch, then load it back in TensorFlow, since LArSoft already ships with TensorFlow (link here)
  • TMVA: export the model from PyTorch and load it via TMVA, which operates directly on the ROOT format. Not sure whether this conforms to LArSoft standards.
    Useful links: Class Reference, PyTorch tutorial