Merge pull request #99 from Azulinho/next_release
Next release
Azulinho authored Jun 16, 2022
2 parents 629a0b8 + db4aecc commit 7f12e0e
Showing 8 changed files with 144 additions and 58 deletions.
37 changes: 37 additions & 0 deletions .github/workflows/next_release.yaml
@@ -0,0 +1,37 @@
name: Run build on next_release
on:
push:
branches:
- "next_release"

jobs:
push_to_registry:
name: Push Docker image to GitHub Container Registry
runs-on: ubuntu-latest
permissions:
packages: write
contents: write
steps:
- name: Check out the repo
uses: actions/checkout@v2
with:
fetch-depth: 0

- name: Login to GitHub Container Registry
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: Pull latest upstream base
run: |
docker pull bitnami/minideb:bullseye
docker pull ghcr.io/azulinho/cryptobot:latest
- name: Push to GitHub Packages
uses: docker/build-push-action@v2
with:
tags: ghcr.io/azulinho/cryptobot:next_release
push: true

2 changes: 1 addition & 1 deletion .python-version
@@ -1 +1 @@
-pyston-2.3.2
+pyston-2.3.4
41 changes: 29 additions & 12 deletions README.md
@@ -32,10 +32,11 @@ A python based trading bot for Binance, which relies heavily on backtesting.
* [PRICE_LOGS](#price_logs)
* [ENABLE_PUMP_AND_DUMP_CHECKS](#enable_pump_and_dump_checks)
* [ENABLE_NEW_LISTING_CHECKS](#enable_new_listing_checks)
* [ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS](#enable_new_listing_checks_age_in_days)
* [STOP_BOT_ON_LOSS](#stop_bot_on_loss)
6. [Bot command center](#bot-command-center)
7. [Automated Backtesting](#automated-backtesting)
-8. [Obtaining old price.log files](#obtaining-old-price.log-files)
+8. [Obtaining old price log files](#obtaining-old-price-log-files)
9. [Development/New features](#development/new-features)


@@ -46,21 +47,22 @@ now recovering from that downtrend. It relies on us specifying different
buy and sell points for each coin individually. For example, we can tell the
bot to buy BTCUSDT when the price drops by at least 6% and recovers by 1%. And
then set it to sell when the price increases by another 2%.
-While we may choose to do something different with another more volatile coin
+Or we may choose to trade differently with another, more volatile, coin
where we buy the coin when the price drops by 25%, wait for it to recover by 2%
and then sell it at 5% profit.
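The arithmetic behind these trigger points can be sketched in a few lines. This is only an illustration of the rule just described; the function name and shape are hypothetical, not the bot's actual code:

```python
# Illustrative sketch of a drop/recover/sell rule; not the bot's code.
def trigger_prices(reference: float, drop: float, recover: float, profit: float):
    """Return the dip, buy and sell price levels for a coin.

    reference: the price we measure the drop against
    drop/recover/profit: percentages, e.g. 6 means 6%
    """
    dip = reference * (1 - drop / 100)   # price must fall at least this low
    buy = dip * (1 + recover / 100)      # then recover to here before we buy
    sell = buy * (1 + profit / 100)      # target price to sell at
    return dip, buy, sell

# BTCUSDT example from the text: drop 6%, recover 1%, sell 2% higher
dip, buy, sell = trigger_prices(100.0, 6, 1, 2)  # dip=94.0, buy≈94.94, sell≈96.84
```

Note that the percentages compound: a 6% drop followed by a 1% recovery buys slightly below 95% of the reference price, not exactly at it.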

To work out the best buy and sell percentages for each of the coins available
on Binance, we run backtesting strategies against a number of recorded
price.log files.
These price.logs can be obtained while the bot is running in a special mode
-called 'logmode' where it records prices for all the available binance coins
+called *logmode* where it records prices for all the available Binance coins
every 1 second or other chosen interval. Or we can obtain 1min interval klines
-from binance using a tool available in this repository.
+from binance using a [tool available in this
+repository](#obtaining-old-price-log-files).

-Then we would run the bot in backtesting mode which would run our buy strategy
-against those price.log files and simulate what sort of returns we would get
-from a specify strategy and a time frame of the market.
+Then, using these price.log files, we would run the bot in *backtesting* mode,
+which would run our buy strategy against those price.log files and simulate
+what sort of returns we would get from a specific strategy over a given time frame of the market.
In order to help us identify the best buy/sell percentages for each coin, there
is a helper tool in this repo which runs a kind of
[automated-backtesting](#automated-backtesting) against
@@ -172,7 +174,7 @@ These logs can then be consumed in *backtesting* mode.

The bot doesn't retrieve historic klines from binance, which are limited to a
minimum of 1min granularity. If you want to pull historic klines from binance,
-you'll have to do it yourself and convert them to the format used by this bot.
+use the [tool available in this repo](#obtaining-old-price-log-files).

Just to get started, here is a
[logfile](https://www.dropbox.com/s/dqpma82vc4ug7l9/MYCOINS.log.gz?dl=0)
@@ -181,7 +183,7 @@ for testing containing a small set of coins
Don't bother decompressing these files, as the bot consumes them compressed
in the .gz format.
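Since the bot reads these logs compressed, you can inspect one without ever writing an uncompressed copy to disk, for example with Python's standard `gzip` module. This is a generic sketch, not a tool from this repository:

```python
import gzip

# Generic sketch: stream lines out of a compressed price.log without
# decompressing it on disk. Works on arbitrarily large files because
# gzip.open reads incrementally.
def read_lines(path):
    with gzip.open(path, "rt") as fh:
        for line in fh:
            yield line.rstrip("\n")
```

Because the file is streamed, memory use stays constant even for multi-gigabyte daily logs.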

-Processing each daily logfile takes around 30 seconds, so for a large number of
+Processing each daily logfile recorded at a 1-second interval takes around 30 seconds, so for a large number of
price log files this can take a long time to run backtesting simulations.
A workaround is to test out each coin individually by generating a price.log
file containing just the coins we care about.
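One way to build such a trimmed price.log is to filter the compressed log by ticker symbol. The helper below is hypothetical (not part of this repo) and assumes each relevant line contains the symbol verbatim; check your log format before relying on it:

```python
import gzip

# Hypothetical helper: copy only the lines mentioning the symbols we
# care about into a new compressed log, so backtesting runs faster.
# Assumes the ticker symbol appears verbatim on each relevant line.
def filter_price_log(src: str, dst: str, symbols: list) -> None:
    with gzip.open(src, "rt") as fin, gzip.open(dst, "wt") as fout:
        for line in fin:
            if any(symbol in line for symbol in symbols):
                fout.write(line)
```

Reading and writing through `gzip.open` keeps both files in the .gz format the bot expects.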
@@ -216,6 +218,7 @@ DO NOT USE github issues to ask for help. I have no time for you. You'll be told

Also: *NO TORIES, NO BREXITERS, NO WINDOWS USERS, NO TWATS*, this is not negotiable.


## Getting started

If you don't know Python you might be better using an
@@ -270,13 +273,16 @@ running. But not buy or sell anything.
make logmode CONFIG=config.yaml
```

You can also look into the [obtaining old price.log files
tool](#obtaining-old-price-log-files).

When there is enough data for backtesting in our price.log files, we can now
run a new instance of the bot in *backtesting* mode.

5. Compress all the logs, except for the current live logfile in *gz* format.

```
-ls *.log | xargs -i gzip {}
+ls *.log | xargs -i gzip -3 {}
```

6. Update the config.yaml file and include the list of logfiles we are using for
@@ -654,9 +660,20 @@ ENABLE_NEW_LISTING_CHECKS: True

defaults to True

-Checks that we have at least 30 days of price data on a coin, if we don't we
+Enable checks for new coin listings.

### ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS

```
ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS: 31
```

defaults to 31

Checks that we have at least 31 days of price data on a coin; if we don't, we
skip buying this coin.


### STOP_BOT_ON_LOSS

```
@@ -741,7 +758,7 @@ bot runs that don't contain any losses or stales, only wins.
make automated-backtesting LOGFILE=lastfewdays.USDT.log.gz CONFIG=automated-backtesting.yaml MIN=10 FILTER='' SORTBY='wins'
```

-## Obtaining old price.log files
+## Obtaining old price log files

In the utils/ directory there's a python script that pulls klines from binance
in the format used by this bot.
62 changes: 56 additions & 6 deletions app.py
@@ -3,6 +3,7 @@
import argparse
import importlib
import json
import logging
import math
import pickle
import sys
@@ -12,11 +13,12 @@
from functools import lru_cache
from hashlib import md5
from itertools import islice
from os import fsync
from os import fsync, getpid
from os.path import basename, exists
from time import sleep
from typing import Any, Dict, List, Tuple

import colorlog
import udatetime
import web_pdb
import yaml
@@ -31,7 +33,6 @@
c_date_from,
c_from_timestamp,
cached_binance_client,
logging,
mean,
percent,
requests_with_backoff,
@@ -379,7 +380,7 @@ def check_for_pump_and_dump(self):

return False

-def new_listing(self, mode):
+def new_listing(self, mode, days):
"""checks if coin is a new listing"""
# wait a few days before going to buy a new coin
# since we list what coins we buy in TICKERS the bot would never
@@ -389,7 +390,7 @@ def new_listing(self, mode):
# we want to avoid buy these new listings as they are very volatile
# and the bot won't have enough history to properly backtest a coin
# looking for a profit pattern to use.
-if mode == "backtesting" and len(self.averages["d"]) < 31:
+if mode == "backtesting" and len(self.averages["d"]) < days:
return True
return False

@@ -471,10 +472,14 @@ def __init__(self, conn, config_file, config) -> None:
self.enable_pump_and_dump_checks: bool = config.get(
"ENABLE_PUMP_AND_DUMP_CHECKS", True
)
-# disable buying a new coin if this coin is newer than 31 days
+# check if we are looking at a new coin
self.enable_new_listing_checks: bool = config.get(
"ENABLE_NEW_LISTING_CHECKS", True
)
# disable buying a coin newer than ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS (default: 31 days)
self.enable_new_listing_checks_age_in_days: int = config.get(
"ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS", 31
)
# stops the bot as soon we hit a STOP_LOSS. If we are still holding coins,
# those remain in our wallet. Typically used when MAX_COINS = 1
self.stop_bot_on_loss: bool = config.get("STOP_BOT_ON_LOSS", False)
@@ -510,7 +515,9 @@ def run_strategy(self, coin) -> None:

# is this a new coin?
if self.enable_new_listing_checks:
-if coin.new_listing(self.mode):
+if coin.new_listing(
+    self.mode, self.enable_new_listing_checks_age_in_days
+):
return

# has the current price been influenced by a pump and dump?
@@ -1627,10 +1634,14 @@ def load_klines_for_coin(self, coin) -> None:
# wrap results in a try call, in case our cached files are corrupt
# and attempt to pull the required fields from our data.
try:
logging.debug(f"trying to read klines from {f_path}")
with open(f_path, "r") as f:
results = json.load(f)
_, _, high, low, _, _, closetime, _, _, _, _, _ = results[0]
except Exception: # pylint: disable=broad-except
logging.debug(
f"calling binance after failed read from {f_path}"
)
results = requests_with_backoff(query).json()
# this can be fairly API intensive for a large number of tickers
# so we cache these calls on disk, each coin, period, start day
@@ -1761,6 +1772,45 @@ def print_final_balance_report(self):
secrets = yaml.safe_load(_f.read())
cfg["MODE"] = args.mode

PID = getpid()
c_handler = colorlog.StreamHandler(sys.stdout)
c_handler.setFormatter(
colorlog.ColoredFormatter(
"%(log_color)s[%(levelname)s] %(message)s",
log_colors={
"WARNING": "yellow",
"ERROR": "red",
"CRITICAL": "red,bg_white",
},
)
)
c_handler.setLevel(logging.INFO)

if cfg["DEBUG"]:
f_handler = logging.FileHandler("log/debug.log")
f_handler.setLevel(logging.DEBUG)

logging.basicConfig(
level=logging.DEBUG,
format=" ".join(
[
"(%(asctime)s)",
f"({PID})",
"(%(lineno)d)",
"(%(funcName)s)",
"[%(levelname)s]",
"%(message)s",
]
),
handlers=[f_handler, c_handler],
datefmt="%Y-%m-%d %H:%M:%S",
)
else:
logging.basicConfig(
level=logging.INFO,
handlers=[c_handler],
)

if args.mode == "backtesting":
client = cached_binance_client(
secrets["ACCESS_KEY"], secrets["SECRET_KEY"]
28 changes: 2 additions & 26 deletions lib/helpers.py
@@ -1,42 +1,16 @@
""" helpers module """
import logging
import pickle
import sys
from datetime import datetime
from functools import lru_cache
from os import getpid
from os.path import exists, getctime
from time import sleep

import colorlog
import requests
import udatetime
from binance.client import Client
from tenacity import retry, wait_exponential

PID = getpid()
c_handler = colorlog.StreamHandler(sys.stdout)
c_handler.setFormatter(
colorlog.ColoredFormatter(
"%(log_color)s[%(levelname)s] %(message)s",
log_colors={
"WARNING": "yellow",
"ERROR": "red",
"CRITICAL": "red,bg_white",
},
)
)
c_handler.setLevel(logging.INFO)

f_handler = logging.FileHandler("log/debug.log")
f_handler.setLevel(logging.DEBUG)

logging.basicConfig(
level=logging.DEBUG,
format=f"[%(levelname)s] {PID} %(lineno)d %(funcName)s %(message)s",
handlers=[f_handler, c_handler],
)


def mean(values: list) -> float:
"""returns the mean value of an array of integers"""
@@ -100,10 +74,12 @@ def cached_binance_client(access_key: str, secret_key: str) -> Client:
if exists(cachefile) and (
udatetime.now().timestamp() - getctime(cachefile) < (30 * 60)
):
logging.debug("re-using local cached binance.client file")
with open(cachefile, "rb") as f:
_client = pickle.load(f)
else:
try:
logging.debug("refreshing cached binance.client")
_client = Client(access_key, secret_key)
except Exception as err:
logging.warning(f"API client exception: {err}")
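The caching pattern in `cached_binance_client` above (reuse a pickled object while it is less than 30 minutes old, rebuild it otherwise) can be sketched generically. `cached` here is a hypothetical helper, not part of lib/helpers.py:

```python
import pickle
import time
from os.path import exists, getctime

# Generic sketch of the pickle-on-disk cache used above: return the
# cached object while the cache file is younger than ttl seconds,
# otherwise rebuild it with build() and refresh the cache file.
def cached(path, build, ttl=30 * 60):
    if exists(path) and time.time() - getctime(path) < ttl:
        with open(path, "rb") as fh:
            return pickle.load(fh)
    obj = build()
    with open(path, "wb") as fh:
        pickle.dump(obj, fh)
    return obj
```

This mirrors the `exists`/`getctime` freshness check visible in the diff above; it only works for objects that pickle cleanly.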
1 change: 1 addition & 0 deletions tests/automated-backtesting.yaml
@@ -16,6 +16,7 @@ DEFAULTS: &DEFAULTS
HARD_LIMIT_HOLDING_TIME: 99999
STOP_LOSS_AT_PERCENTAGE: -25
STOP_BOT_AT_LOSS: False
ENABLE_NEW_LISTING_CHECKS_AGE_IN_DAYS: 31

STRATEGIES:
BuyDropSellRecoveryStrategy:
