This repository has been archived by the owner on Jun 22, 2024. It is now read-only.

Commit

[FEATURE][FIX] Replaced crawler for better results and fix upcoming matches error (#65)

* [FEATURE][FIX] Replaced crawler for better results and fix upcoming matches error

* Switched the crawler to [espncricinfo](https://www.espncricinfo.com/)

* Fixed pylint errors in app

* Dockerfiles updated

* [TEST][CI] Added tests for crawler and switched CI to CircleCI

* [TEST][CI] Switched CI to CircleCI

* [TEST] Fixed pytest
roysti10 committed Feb 8, 2021
1 parent 81c2b6f commit 001642f
Showing 48 changed files with 877 additions and 1,389 deletions.
43 changes: 43 additions & 0 deletions .circleci/config.yml
@@ -0,0 +1,43 @@
version: 2.1

orbs:
  python: circleci/[email protected]
  docker: circleci/[email protected]

jobs:
  build-and-test:
    executor: python/default
    environment:
      TEST: True
    steps:
      - checkout
      - setup_remote_docker
      - docker/install-docker-compose
      - run:
          name: Install dependencies
          command: |
            python -m pip install --upgrade pip
            pip install pylint
            pip install -r requirements.txt
            pip install scrapy-autounit
      - run:
          name: Crawler Test with autounit
          command: |
            scrapy crawl espn-live
            scrapy crawl espn-players
            python3 -m unittest discover autounit/tests/
      - run:
          name: Lint with pylint
          command: |
            pylint app -E
            pylint app --exit-zero
      - run:
          name: Test with pytest
          command: |
            docker -v
            docker-compose -v
            docker-compose -f docker/docker-compose-test.yaml up --build --exit-code-from test

workflows:
  main:
    jobs:
      - build-and-test
38 changes: 0 additions & 38 deletions .github/workflows/python-app.yml

This file was deleted.

1 change: 1 addition & 0 deletions .gitignore
@@ -67,6 +67,7 @@ instance/

# Scrapy stuff:
.scrapy
autounit

# Sphinx documentation
docs/_build/
16 changes: 0 additions & 16 deletions Dockerfile.development

This file was deleted.

55 changes: 44 additions & 11 deletions README.md
100644 → 100755
@@ -8,15 +8,15 @@ In the past year or so fantasy cricket has been getting a lot of traction and wi

1. [FastAPI](https://fastapi.tiangolo.com/)
2. [sklearn](https://scikit-learn.org/stable/)
3. [pycricbuzz](https://github.com/codophobia/pycricbuzz)
3. [scrapyrt](https://scrapyrt.readthedocs.io/en/stable/)
4. [scrapy](https://docs.scrapy.org/en/latest/)

Install using:
```bash
pip3 install -r requirements.txt
```

## I want to run your project
## Local Development

To run our project, follow these steps:

@@ -27,24 +27,57 @@ To run our project follow these steps
cd Best11-Fantasycricket
```

3. Run the model :
3. Point the app at your local crawler, either via a hosts entry or by editing the client URL:
**Linux and MACOS**

1. Run `nano /etc/hosts` in your terminal, or open `/etc/hosts` in your preferred editor

**Windows**
1. Open `C:\windows\system32\drivers\etc\hosts` in your preferred editor


2. Add the line below to the file and save:

`127.0.0.1 espncricinfo`

**OR**

1. Open `app/fantasy_cricket/scrapyrt_client.py` in your preferred editor

2. Change line `16` to

```python
self.url = "http://localhost:9080/crawl.json"
```

4. Open a tab on your terminal and run

`uvicorn app.main:app`

5. `Open http://localhost:8000/` and voila!!
5. Open another tab on your terminal and run

`scrapyrt`


6. Open `http://localhost:8000/` and voila!!

**Note:**
Visit `http://localhost:9080/crawl.json` with the correct query parameters to see the crawler API
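As a sketch of calling that endpoint from Python (assuming scrapyrt is running on port 9080 and using the spider names from the CI config, `espn-live` and `espn-players`; `start_requests=true` asks scrapyrt to use the spider's own start URLs):

```python
from urllib.parse import urlencode

# scrapyrt exposes crawls over HTTP; spider_name is a required parameter.
BASE = "http://localhost:9080/crawl.json"

def crawl_url(spider_name: str, start_requests: bool = True) -> str:
    """Build the scrapyrt request URL for a given spider."""
    params = {"spider_name": spider_name}
    if start_requests:
        params["start_requests"] = "true"
    return BASE + "?" + urlencode(params)

print(crawl_url("espn-live"))
# To actually run the crawl (the scrapyrt service must be up):
#   import json, urllib.request
#   items = json.load(urllib.request.urlopen(crawl_url("espn-live")))["items"]
```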

### Docker

1. Follow the steps:
```bash
docker build -t best11fantasycricket:latest "."
```

```bash
docker-compose up
docker build -t espncricinfo:latest "." -f docker/espncricinfo/Dockerfile
docker build -t best11:latest "." -f docker/11tastic/Dockerfile
docker-compose -f docker/docker-compose.yaml up
```

2. Visit `http://localhost:8080/`
2. Visit `http://localhost:8080/` to see the website in action

**Note:**
Visit `http://localhost:9080/crawl.json` with the correct query parameters to see the crawler API


## How do I contribute to this project?
@@ -69,9 +102,9 @@ If you have any questions regarding our project, you can contact any of the mai

### Acknowledgements

1. Special thanks to [scientes](https://github.com/scientes) for setting up the basic webcrawler
1. Special thanks to [scientes](https://github.com/scientes) for allowing us to use the server to host the website

2. We would like to thank [Howstat](http://www.howstat.com/cricket/home.asp) for their amazing website with daily updates and availabilty to scrape
2. We would like to thank [espncricinfo](https://www.espncricinfo.com/) for their amazing website with daily updates and availability to scrape

If you liked our project we would really appreciate you starring this repo.

Empty file modified app/__init__.py
100644 → 100755
Empty file.
Empty file modified app/fantasy_cricket/__init__.py
100644 → 100755
Empty file.
37 changes: 0 additions & 37 deletions app/fantasy_cricket/data/flags.json

This file was deleted.

37 changes: 37 additions & 0 deletions app/fantasy_cricket/fantasy_leagues.py
@@ -0,0 +1,37 @@
"""This module defines all supported fantasy leagues and their functionality.
All leagues must inherit from `app.fantasy_cricket.team.Team`.
"""
from app.fantasy_cricket.team import Team


class Dream11(Team):
    """Dream11 league.

    Supported formats:
    * ODI
    * T20
    * TEST
    """

    name = "Dream11"

    batting_dict = {
        "runs": [1, 1, 1],
        "boundaries": [1, 1, 1],
        "sixes": [2, 2, 2],
        "50": [4, 4, 8],
        "100": [8, 8, 16],
        "duck": [-4, -3, -2],
    }

    bowling_dict = {
        "wicket": [16, 25, 25],
        "4-wicket-haul": [4, 4, 8],
        "5-wicket-haul": [8, 8, 16],
        "Maiden": [0, 8, 4],
    }

    wk_dict = {
        "Catch": [8, 8, 8],
        "Stump": [12, 12, 12],
    }
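The per-format scoring lists above would be consumed roughly like this. A minimal sketch: the index-to-format mapping (0=TEST, 1=ODI, 2=T20) is inferred from the duck penalties and is an assumption, as are the helper name `batting_points` and the shape of the `stats` dict:

```python
# Copy of the batting scores from the class above; each list holds one
# value per match format (assumed order: 0=TEST, 1=ODI, 2=T20).
batting_dict = {
    "runs": [1, 1, 1],
    "boundaries": [1, 1, 1],
    "sixes": [2, 2, 2],
    "50": [4, 4, 8],
    "100": [8, 8, 16],
    "duck": [-4, -3, -2],
}

def batting_points(stats: dict, fmt: int) -> int:
    """Sum batting points for one player in one match format."""
    return sum(batting_dict[event][fmt] * count for event, count in stats.items())

# 60 runs, 6 boundaries, 2 sixes, and a fifty in a T20 (fmt=2):
print(batting_points({"runs": 60, "boundaries": 6, "sixes": 2, "50": 1}, fmt=2))  # 78
```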
56 changes: 56 additions & 0 deletions app/fantasy_cricket/matches.py
@@ -0,0 +1,56 @@
"""
This module fetches upcoming match data for the next 2 days.
"""

from typing import List

from app.fantasy_cricket.scrapyrt_client import EspnClient


class Matches:
    """
    A class to get upcoming live match data of the next 2 days.
    """

    def __init__(self) -> None:
        self.espn = EspnClient()

    def get_upcoming_match(self):
        """
        Gets the list of upcoming matches whose squads have been announced.
        """
        matches = []
        for match in self.espn.get_upcoming_dets():
            if match["team1_squad"] != [] and match["team2_squad"] != []:
                matches.append(
                    {
                        "team1": match["team1"],
                        "team2": match["team2"],
                        "flag_team1": "https://a.espncdn.com/i/teamlogos/cricket/500/"
                        + match["team1_id"]
                        + ".png",
                        "flag_team2": "https://a.espncdn.com/i/teamlogos/cricket/500/"
                        + match["team2_id"]
                        + ".png",
                    }
                )
        return matches

    def get_squad_match_type(self, teams: List[str]):
        """
        Gets the squads and match type for the given pair of teams.
        """
        match_det = None  # stays None when no upcoming match pairs these teams
        for match in self.espn.get_upcoming_dets():
            if match["team1"] == teams[0] and match["team2"] == teams[1]:
                match_det = {
                    "team1_squad": match["team1_squad"],
                    "team2_squad": match["team2_squad"],
                    "match_type": match["match_id"],
                }
                break
        return match_det
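The squad filter in `get_upcoming_match` can be exercised standalone. A minimal sketch: the helper name `upcoming_matches` is ours, not the repo's, and the sample teams, IDs, and squads below are invented:

```python
from typing import Dict, List

LOGO = "https://a.espncdn.com/i/teamlogos/cricket/500/"

def upcoming_matches(raw: List[Dict]) -> List[Dict]:
    """Keep only matches with announced squads, mirroring the
    filtering done by Matches.get_upcoming_match()."""
    return [
        {
            "team1": m["team1"],
            "team2": m["team2"],
            "flag_team1": LOGO + m["team1_id"] + ".png",
            "flag_team2": LOGO + m["team2_id"] + ".png",
        }
        for m in raw
        if m["team1_squad"] and m["team2_squad"]
    ]

sample = [
    {"team1": "India", "team2": "Australia", "team1_id": "6", "team2_id": "2",
     "team1_squad": ["player"], "team2_squad": ["player"]},
    {"team1": "England", "team2": "Pakistan", "team1_id": "1", "team2_id": "7",
     "team1_squad": [], "team2_squad": ["player"]},  # squads not out: filtered
]
print(upcoming_matches(sample))
```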
