Partial Monte-Carlo dropout #252
Replies: 4 comments 10 replies
-
Hi @pieterblok, thank you so much for your comment, and yes, definitely go ahead and make a PR. If you are interested, you could then generate some metrics comparing the two approaches using BaaL, and we can post them in our tutorial. Otherwise I can do that part :)
-
That sounds great!
-
@parmidaatg @Dref360 @orangetoaster Perfect! I will do a deep dive into the BAAL source code to see how I can integrate such a method. If I get stuck or lost during the integration, I will ask on this Discussion page.
-
@pieterblok and I met on Friday to discuss this issue. We propose adding a method with the following signature:

```python
def montecarlo_forward(x: Tensor[BatchSize, ...], iterations: int) -> Tensor[BatchSize, ..., ITERATION]:
    pass
```

This would make the `replicate_in_memory=False` case much faster. Pieter reported a speedup of 20 to 40% on his models. We also discussed another way of doing it with an LRU cache. This PR is the first implementation of the proposal: #254 Let me know what you think!
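For readers unfamiliar with the `replicate_in_memory` distinction, here is a minimal sketch contrasting the two existing strategies the proposed signature would speed up. The model and shapes are illustrative assumptions, not the BAAL implementation:

```python
import torch
import torch.nn as nn

# Toy model with a dropout layer; purely illustrative, not BAAL code.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5), nn.Linear(4, 2))
model.train()  # keep dropout active for MC sampling

x = torch.randn(3, 4)
iterations = 5

with torch.no_grad():
    # replicate_in_memory=True style: tile the batch and run ONE forward pass.
    tiled = x.repeat(iterations, 1)                       # shape [15, 4]
    out_tiled = (model(tiled)
                 .reshape(iterations, 3, 2)
                 .permute(1, 2, 0))                       # [batch, classes, iterations]

    # replicate_in_memory=False style: one forward pass PER iteration.
    out_loop = torch.stack([model(x) for _ in range(iterations)], dim=-1)

print(out_tiled.shape, out_loop.shape)  # both torch.Size([3, 2, 5])
```

The proposed `montecarlo_forward` would let a model implement the looped variant internally, opening the door to skipping redundant computation in the dropout-free layers.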
-
Monte-Carlo dropout is a simple but time-consuming procedure for determining the uncertainty score of an image. To alleviate this problem, I have implemented the so-called Partial Monte-Carlo dropout (PMCD) procedure in Detectron2 Active Learning (see line 173 in https://github.com/pieterblok/maskal/blob/maskAL/active_learning/sampling/montecarlo_dropout.py).
With PMCD, an image is passed only once through the parts of the network that do not contain dropout. The resulting features are then copied to the parts of the network with dropout, where the repeated inference actually happens. PMCD is highly scalable and significantly faster than passing the same image multiple times through the entire network. I was wondering if I could implement PMCD together with one of your developers in BAAL. @Dref360 @parmidaatg
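The idea above can be sketched in a few lines of PyTorch. This is a hypothetical illustration, assuming a split into a dropout-free `backbone` and a `dropout_head`; the names and architecture are my own, not the maskAL or BAAL code:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Deterministic part of the network: no dropout layers, run only once.
backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU())

# Stochastic part: contains dropout, evaluated repeatedly.
dropout_head = nn.Sequential(nn.Dropout(p=0.5), nn.Linear(16, 4))

def pmcd_forward(x: torch.Tensor, iterations: int) -> torch.Tensor:
    """Partial MC dropout: one backbone pass, many dropout-head passes.

    Returns a tensor of shape [batch, ..., iterations].
    """
    backbone.eval()
    dropout_head.train()  # keep dropout active at inference time
    with torch.no_grad():
        features = backbone(x)               # computed a single time
        samples = [dropout_head(features)    # only this part is repeated
                   for _ in range(iterations)]
    return torch.stack(samples, dim=-1)

out = pmcd_forward(torch.randn(2, 8), iterations=10)
print(out.shape)  # torch.Size([2, 4, 10])
```

The speedup comes from amortizing the backbone cost: for T iterations, the expensive dropout-free layers run once instead of T times, while the dropout layers still produce T distinct stochastic samples.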