Error while running ML algorithms: No module named numpy #112
Comments
Have you installed a Python dependency manager and installed numpy with it? Otherwise, I think I see the step you're missing...
Hi @shwetamittal019, are you running the example within the python-template, or directly on spark-shell in an interactive way? If via the python-template, you can add numpy as one of the dependencies in your docker-spark/template/python/Dockerfile (Lines 8 to 10 in bc3f212).

Feel free to comment more so that we can help. Or better, feel free to share your use-case so that we can also reproduce it. Best regards,
Hi @GezimSejdiu, I am also having trouble with this. I did add numpy to requirements.txt, yet while the numpy module is being installed during the container build, I'm getting this error:

```
Step 1/12 : FROM bde2020/spark-python-template:2.4.3-hadoop2.7
# Executing 3 build triggers
 ---> Running in fc96aff6d8d3
Command "/usr/bin/python3.7 /usr/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py prepare_metadata_for_build_wheel /tmp/tmp2ikrbto3" failed with error code 1 in /tmp/pip-install-n0yoj555/numpy
```

I guess the container is missing gcc (from what I have been able to find on Google) and thus it cannot build this module.
I am having the same issue as well. |
I was able to install numpy by adding this line to my Dockerfile.
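The exact line from this comment was lost in extraction. As a hedged sketch of what such a line could look like: the bde2020 Spark images are Alpine-based, so the pip build of numpy typically fails without a C toolchain, and a single `RUN` instruction that installs the build dependencies via `apk` before running pip would address the error above. Package names here are assumptions; adjust for your base image:

```dockerfile
# Hypothetical sketch: install the compiler toolchain pip needs to build
# numpy on an Alpine-based image, then install numpy itself.
# Package names assume Alpine's apk; Debian-based images would use apt-get.
RUN apk add --no-cache gcc g++ musl-dev python3-dev \
 && pip3 install --no-cache-dir numpy
```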
You could also extend the template image. Something like this:
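The snippet from this comment was also lost in extraction. A minimal sketch of extending the template image, assuming the image tag from the build log above and that your dependencies live in a `requirements.txt` next to the Dockerfile:

```dockerfile
# Hypothetical sketch: build on top of the template image and add numpy
# (either directly or via requirements.txt).
FROM bde2020/spark-python-template:2.4.3-hadoop2.7

COPY requirements.txt /app/requirements.txt
RUN pip3 install --no-cache-dir -r /app/requirements.txt numpy
```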
Run this in the CLI of each of the containers:
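The command itself was lost in extraction. A hedged guess at what was meant: installing numpy directly inside each running container with `docker exec`. The container names below are assumptions; substitute your own:

```shell
# Install numpy inside each running container.
# "spark-master" and "spark-worker-1" are hypothetical container names;
# list your actual containers with `docker ps`.
for c in spark-master spark-worker-1; do
  docker exec "$c" pip3 install numpy
done
```

Note this is a temporary fix: packages installed this way disappear when the container is recreated, so baking numpy into the image (as above) is the durable option.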
```
File "/usr/bin/spark-3.0.1-bin-hadoop3.2/python/lib/pyspark.zip/pyspark/ml/param/__init__.py", line 26, in <module>
    import numpy as np
ModuleNotFoundError: No module named 'numpy'
```
Please help