A repo for developing securities analysis libraries driven by different schools of financial and modelling thought, including technical analysis, fundamental analysis, quantitative analysis, and machine learning.
The library may eventually include a backtester, as well as the ability to connect to a trading API and deploy to the cloud. This effort is currently in the research phase, so the scope is not yet clearly defined.
Some basic understanding of investment theory is recommended, both to avoid unnecessary risks and to manage the risks you do take. If you have no financial background, please see the investment notes, which are largely based on Zvi Bodie's market-leading textbook, Investments.
To install the dependencies for these notebooks, first install Anaconda.
In addition, the optimizer used in the notebooks, PyPortfolioOpt, requires a C++ compiler. On Windows, install Visual Studio Build Tools and modify the installation to include the C++ build tools.
Once you have installed Anaconda, run:
conda env create -f trade_env.yml
to install all the dependencies into an isolated environment.
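For reference, an environment file of this kind has roughly the following shape. This is an illustrative sketch with assumed package pins, not the repo's actual trade_env.yml:

```yaml
# Illustrative sketch only; see trade_env.yml in the repo for the real spec.
name: trade_env
channels:
  - conda-forge
dependencies:
  - python=3.9          # assumed version
  - jupyterlab
  - pandas              # assumed dependency
  - pip
  - pip:
      - PyPortfolioOpt  # the optimizer mentioned above, installed via pip
```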
Then register the environment with Jupyter:
python -m ipykernel install --user --name=trade_env
Activate the environment by running:
conda activate trade_env
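With the environment active, you can sanity-check the install (including the C++-backed solver PyPortfolioOpt uses) with a minimal sketch like the one below. The random-walk prices and ticker names are stand-ins, not real data:

```python
import numpy as np
import pandas as pd
from pypfopt import EfficientFrontier, expected_returns, risk_models

# Stand-in price history: 3 tickers, ~2 years of synthetic daily prices.
rng = np.random.default_rng(0)
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.001, 0.01, (500, 3)), axis=0)),
    columns=["AAA", "BBB", "CCC"],
)

mu = expected_returns.mean_historical_return(prices)  # annualized mean returns
S = risk_models.sample_cov(prices)                    # annualized covariance

ef = EfficientFrontier(mu, S)
ef.max_sharpe()            # solve for the tangency portfolio
print(ef.clean_weights())  # rounded weight per ticker
```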
Install new packages directly into the environment using pip, then update the YAML with:
conda env export --name trade_env --file updated_trade_env.yml
Alternatively, you can update the environment with a new package by adding it to the YAML file; then, from the same directory as environment.yml, activate the environment and run:
conda env update -f environment.yml
This option is not recommended because it reinstalls everything.
To run algorithmic trading bots, we need infrastructure that is reliable and secure. Setting up your own physical server is not necessary, since we can easily rent cloud infrastructure at low cost. First, spin up a virtual machine at https://www.digitalocean.com/ (they call them droplets) with a minimum of 2 GB RAM. For more info on creating a production-ready server, see: https://docs.digitalocean.com/tutorials/recommended-droplet-setup/
To connect to the cloud instance via SSH (we can then easily use the SSH extension in VS Code to develop on the server), create an SSH key and add it to the cloud instance. Follow these instructions:
https://docs.digitalocean.com/products/droplets/how-to/add-ssh-keys/
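For example, to generate a key pair locally (the ed25519 key type is just a sensible default here):

ssh-keygen -t ed25519 -C "your_email@example.com"

Then add the contents of the public key file (~/.ssh/id_ed25519.pub) to the droplet, and test the connection with:

ssh root@[public ip address of cloud instance]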
To encrypt communication between the JupyterLab server and the web browser, we set up an SSL private key and self-signed certificate. From within the cloud/cloud_deploy/ folder, run the following from the terminal (Git Bash if on Windows):
openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout mykey.key -out mycert.pem
Follow the prompts and enter your location, organization, name, and email details.
For more info on SSL, see: https://www.cloudflare.com/learning/ssl/what-is-ssl/
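For reference, the key and certificate end up wired into the Jupyter config roughly as below. This is a sketch: the paths depend on where the setup script copies the files, and older Jupyter versions use the NotebookApp prefix instead of ServerApp:

```python
# Excerpt of a Jupyter config file (e.g. jupyter_notebook_config.py).
# The `c` object is predefined when Jupyter loads this file.
c.ServerApp.certfile = "/home/user/ssl/mycert.pem"  # self-signed certificate (illustrative path)
c.ServerApp.keyfile = "/home/user/ssl/mykey.key"    # private key (illustrative path)
c.ServerApp.ip = "0.0.0.0"                          # listen on all interfaces
c.ServerApp.port = 9000                             # matches the URL used below
```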
It is essential that passwords be stored hashed, in this case the password to the JupyterLab server. Hashing is a one-way function (it is practically impossible to invert) used for password validation. Still, you don't want even the hash exposed, because attackers can brute-force it and recover the password, which would give them root access to the cloud instance!
To generate an Argon2 hash of the JupyterLab password, run the following from within the cloud/cloud_deploy/ directory (make sure the virtual environment is activated):
python jupyter_hash_code.py [your password here]
The hash will later be copied to the cloud instance and set as the password check for the browser login.
Luckily, Argon2 hashing is quite secure; according to one source: "Trying to crack a volume encrypted with Argon2 created on a modern laptop would require up to 75,121 powerful machines running for ten years and cost over 4 billion dollars."
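The jupyter_hash_code.py script presumably wraps Jupyter's own password hasher; a minimal sketch of the idea follows, assuming jupyter_server and argon2-cffi are installed (the repo's actual script may differ):

```python
# Sketch only; the repo's jupyter_hash_code.py may differ.
import sys

from jupyter_server.auth import passwd  # Argon2 requires the argon2-cffi package

if __name__ == "__main__":
    # Print an Argon2 hash of the password given on the command line.
    print(passwd(sys.argv[1], algorithm="argon2"))
```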
In one command, we will copy the SSL key, certificate, and Jupyter config files to the cloud instance, run the dependency install script, and launch a JupyterLab server. Simply run (within the cloud/cloud_deploy/ directory):
bash cloud_setup.sh [public ip address of cloud instance]
Then access the server in a browser at https://[public ip address of cloud instance]:9000/lab. The password is the one you hashed in the step above.
To shut down the JupyterLab server from within JupyterLab, go to "File" and click "Shut Down" (otherwise the server will run indefinitely at that port).
If you want to start with a clean Ubuntu install on the cloud instance, you can do a rebuild (see below for SSH reconnection).
If you hit a "Permission denied (publickey)" error, see: https://dev.to/gamebusterz/digitalocean-permission-denied-publickey-168p
SSH connection issues after rebuild: https://www.digitalocean.com/community/questions/how-can-i-get-rid-of-warning-remote-host-identification-has-changed
Preferably, your API keys should be stored as environment variables. For the Alpaca API, we want to read the API keys from the Conda environment. To store your personal keys and secrets in the Conda environment, follow these instructions.
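For example, assuming the variable names that the alpaca-trade-api wrapper reads by default, run (with the environment active):

conda env config vars set APCA_API_KEY_ID=[your key id] APCA_API_SECRET_KEY=[your secret key]

then reactivate the environment (conda activate trade_env) so the variables take effect. Running conda env config vars list shows what is currently set.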
Python Reddit API Wrapper: https://praw.readthedocs.io/en/stable/index.html
Python Binance API Wrapper: https://python-binance.readthedocs.io/en/latest/index.html
Python Alpaca API Wrapper: https://pypi.org/project/alpaca-trade-api/
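As a minimal sketch of how the Alpaca wrapper picks up those environment variables (the paper-trading URL is shown; this assumes the keys were set as above):

```python
import alpaca_trade_api as tradeapi

# With no keys passed explicitly, the REST client falls back to the
# APCA_API_KEY_ID and APCA_API_SECRET_KEY environment variables.
api = tradeapi.REST(base_url="https://paper-api.alpaca.markets")
account = api.get_account()
print(account.status)  # e.g. "ACTIVE"
```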