Worker exceeded 95% memory budget #476
We need to profile the memory usage. Let's check the delta first to see if that makes sense.
I initially tried memory_profiler and the %memit magic function. I am not sure whether it does what we want, or whether I used it incorrectly (I am investigating that), because of what it shows (notebook).
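For reference, a minimal sketch of the %memit workflow, assuming memory_profiler is installed in the notebook environment; the NumPy allocation is only an illustrative workload, not the notebook's actual code:

```python
# In a notebook cell: load the IPython extension that ships with
# memory_profiler, then measure the peak memory of one statement.
%load_ext memory_profiler

import numpy as np

# %memit reports peak memory and the increment over the current
# baseline; the 1e8-element array (~800 MB) is purely illustrative.
%memit np.ones(int(1e8))
```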
I would do all the memory profiling outside of the notebook for starters, as the notebook can confuse things. Also, start with only one process to get a good benchmark and make sure you understand the delta between each step in the code. Furthermore, breaking the code down into imperative steps might help.
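A sketch of that workflow, reusing the memory_try.py filename from the report below; the NumPy steps are illustrative stand-ins for the notebook's actual code:

```python
# memory_try.py -- line-by-line memory profiling outside the notebook.
# Run with:  python memory_try.py
# (or:  python -m memory_profiler memory_try.py)
import numpy as np
from memory_profiler import profile


@profile
def run():
    # Imperative steps, so the report shows a memory delta per line.
    a = np.ones((1000, 1000))   # ~8 MB allocation
    b = a * 2                   # second ~8 MB array
    c = a + b                   # third ~8 MB array
    del a, b                    # release the intermediates
    return c


if __name__ == "__main__":
    run()
```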
Thanks, Daniel. That is what I am trying to do right now. I will share the delta values of each process.
```
Filename: memory_try.py

Line #    Mem usage    Increment   Line Contents
================================================
```
@beyucel is this still an issue? Can this be closed? Please close if you think this isn't something we can act on.
I just wanted to discuss the memory usage issue with this notebook.
When the chunk size is above 25 (>250 MB), a single worker reaches 6.3 GB of memory usage and the kernel restarts. When the chunk size is 25 or below, there is no problem.
My question is: why do ~300 MB chunks cause such high memory usage?
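For context, "Worker exceeded 95% memory budget" appears to be the message dask.distributed's nanny logs before restarting a worker, which fits the kernel restarts described above. Below is a rough sketch of how chunk geometry drives per-chunk memory; the array shape and chunk sizes are assumptions for illustration, not values from the notebook:

```python
# Illustration only: the shape and chunks here are hypothetical.
import numpy as np
import dask.array as da

x = da.ones((50000, 50000), chunks=(25, 50000))
print(x.npartitions, np.prod(x.chunksize) * x.dtype.itemsize / 1e6, "MB/chunk")

# Larger chunks mean each task materializes a bigger block, and a worker
# usually holds several input blocks plus operation intermediates at
# once, so per-worker memory grows much faster than the raw chunk size.
y = x.rechunk((300, 50000))
print(y.npartitions, np.prod(y.chunksize) * y.dtype.itemsize / 1e6, "MB/chunk")
```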