Basic environment for Data Workshop
Minimum effort & maximum impact :squirrel:
- Jupyter Notebook and Jupyter Lab
- Anaconda with Python 3.6.0
- Additional packages: seaborn, ggplot, hyperopt, hyperas, ml_metrics, xgboost

Only Docker is required (installation instructions for Mac and Windows)
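Once inside the environment you can sanity-check that the extra packages are importable. A minimal sketch (package names taken from the list above; import names are assumed to match package names):

```python
import importlib.util

# Packages listed above
packages = ["seaborn", "ggplot", "hyperopt", "hyperas", "ml_metrics", "xgboost"]

for name in packages:
    # find_spec returns None when the package is not installed
    status = "ok" if importlib.util.find_spec(name) else "MISSING"
    print(f"{name}: {status}")
```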
`docker run --net=host --dns 127.0.0.1 --dns 8.8.8.8 --dns 8.8.4.4 -dit -p 8888-8889:8888-8889 --name dataworkshop-environment dataworkshop/environment`
- Notebook - localhost:8888
- Lab - localhost:8889
Note: if you're a happy Docker Toolbox user, use `docker-machine ls` to find the IP address. The URL column of the `docker-machine ls` output contains `tcp://192.168.99.100:2376`, so copy `192.168.99.100` and add the notebook port (192.168.99.100**:8888**) or the lab port (192.168.99.100**:8889**).
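If you'd rather not copy the address by hand, the IP can be cut out of the URL string with plain shell parameter expansion. A sketch, using the example URL value from above:

```shell
# Example value from the URL column of docker-machine ls
url="tcp://192.168.99.100:2376"

# Strip the tcp:// prefix, then the :2376 port suffix
ip="${url#tcp://}"
ip="${ip%:*}"

echo "Notebook: http://$ip:8888"
echo "Lab:      http://$ip:8889"
```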
`docker start dataworkshop-environment`

Note: in Docker terminology, `run` means build (a new container), while `start` means start an already existing container.
To stop the container:
`docker stop dataworkshop-environment`
To get the latest changes from Docker Hub:
`docker pull dataworkshop/environment`
To remove the container:
`docker rm dataworkshop-environment`
or the image:
`docker rmi dataworkshop/environment`
To see resource usage (CPU, memory, I/O):
`docker stats dataworkshop-environment`
To list the processes running inside the container:
`docker top dataworkshop-environment`
Note: run it in the directory that contains the Dockerfile.
`docker build -t dataworkshop-environment .`
The Jupyter servers are started by running `start_script.sh` in the CMD section of the Dockerfile. However, you can easily override it by specifying a command at the end of `docker run ...` (in that case the servers won't start).
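For reference, the relevant part of such a Dockerfile might look like this. This is a sketch, not the actual dataworkshop/environment Dockerfile; the base image and paths are assumptions:

```dockerfile
FROM continuumio/anaconda3

# start_script.sh launches both Jupyter servers (Notebook on 8888, Lab on 8889)
COPY start_script.sh /start_script.sh
RUN chmod +x /start_script.sh

EXPOSE 8888 8889

# CMD is only a default: any command appended to `docker run ...`
# replaces it, so the Jupyter servers won't be started in that case.
CMD ["/start_script.sh"]
```

For example, appending `bash` to the `docker run` command above drops you into an interactive shell instead of starting the Jupyter servers.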