
Merge pull request #97 from amosproj/dev
chrisklg authored Nov 27, 2024
2 parents b112d77 + 6b06808 commit fc456be
Showing 86 changed files with 27,158 additions and 18,024 deletions.
31 changes: 26 additions & 5 deletions .env.docker.example
@@ -1,6 +1,27 @@
#Copy and rename this file to .env.docker
DATABASE_HOST="host.docker.internal"
DATABASE_PORT=5433
DATABASE_USER="postgres"
DATABASE_PASSWORD="postgres"
DATABASE_DATABASE="postgres"

#Backend
BACKEND_DATABASE_HOST="backendDatabase"
BACKEND_DATABASE_PORT=5432
BACKEND_DATABASE_USER="postgres"
BACKEND_DATABASE_PASSWORD="postgres"
BACKEND_DATABASE_DATABASE="postgres"
ANALYZER_URL="http://localhost:8000"

#Analyzer
ANALYZER_FLASK_RUN_HOST="0.0.0.0"
ANALYZER_FLASK_RUN_PORT="8000"
BACKEND_URL="http://backend:3000/api/"
ANALYZER_DATABASE_HOST="analyzerDatabase"
ANALYZER_DATABASE_PORT=5432
ANALYZER_DATABASE_USER="postgres"
ANALYZER_DATABASE_PASSWORD="postgres"
ANALYZER_DATABASE_DATABASE="postgres"

#Mailing
MAIL_HOST=smtp.example.com
MAIL_PORT=465
[email protected]
MAIL_PASSWORD=topsecret
[email protected]
[email protected],[email protected]
43 changes: 43 additions & 0 deletions .github/workflows/analyzer_test_pipeline.yml
@@ -0,0 +1,43 @@
name: Analyzer Tests

on:
push:
branches:
- dev
paths-ignore:
- 'deliverables/**'
pull_request:
branches:
- dev
paths-ignore:
- 'deliverables/**'

jobs:
test:
runs-on: ubuntu-latest

steps:
# Step 1: Checkout the code
- name: Checkout code
uses: actions/checkout@v3

# Step 2: Set up Python
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.11'

# Step 3: Install dependencies
- name: Install dependencies
run: |
npm ci
cd ./apps/analyzer/metadata_analyzer
python -m pip install --upgrade pip
pip install pipx
pipx install poetry
poetry install
# Step 4: Run tests
- name: Run tests
#working-directory: rootfolder/apps/analyzer
run: npx nx run metadata-analyzer:test
37 changes: 37 additions & 0 deletions .github/workflows/backend_test_pipeline.yml
@@ -0,0 +1,37 @@
name: Backend Tests

# Trigger on push to dev and on pull request creation, excluding "deliverables" folder
on:
push:
branches:
- dev
paths-ignore:
- 'deliverables/**'
pull_request:
branches:
- dev
paths-ignore:
- 'deliverables/**'

jobs:
test:
runs-on: ubuntu-latest

steps:
# Step 1: Checkout the code
- name: Checkout code
uses: actions/checkout@v3

# Step 2: Set up Node.js environment
- name: Set up Node.js
uses: actions/setup-node@v3
with:
node-version: '18'

# Step 3: Install dependencies
- name: Install dependencies
run: npm ci

# Step 4: Run tests
- name: Run tests
run: npx nx run metadata-analyzer-backend:test
13 changes: 13 additions & 0 deletions .github/workflows/ci_test2.yml
@@ -0,0 +1,13 @@
name: CI Test 2

on:
push:
branches:
- dev

jobs:
hello:
runs-on: ubuntu-latest
steps:
- name: Hello World
run: echo "CI Test 2"
7 changes: 6 additions & 1 deletion .gitignore
@@ -50,4 +50,9 @@ apps/*/.env
# Test results
apps/analyzer/.coverage
reports/*
coverage/*
coverage/*

# DB Dumps
*.dmp
*.sql
!00-init-roles.sql
1 change: 1 addition & 0 deletions 00-init-roles.sql
@@ -0,0 +1 @@
CREATE ROLE root WITH SUPERUSER LOGIN PASSWORD 'root';
11 changes: 8 additions & 3 deletions Dockerfile
@@ -1,9 +1,14 @@
# Container for the shared node module
FROM node:18-alpine
FROM node:18-bullseye



WORKDIR /app

COPY . .
COPY package*.json ./
#ENV NODE_ENV=development

RUN npm i -g [email protected]
RUN npm install
RUN npm i
COPY . .
#RUN npm ci
9 changes: 3 additions & 6 deletions Documentation/README.md
@@ -11,11 +11,8 @@ cd ./apps/analyzer/metadata_analyzer ; poetry install

- `npm ci`: dependency install

- copy `.env.example` file in backend and rename to `.env` (adjust database properties according to database setup if
necessary)
- copy `.env.example` file in analyzer and rename to `.env` (adjust port properties according to backend setup if
necessary)
- To insert dummy data into table backupData you can use the SQL script `dummyData.sql` in `apps/backend/src/app/utils`
- copy `.env.example` file in backend and rename to `.env` (adjust database properties according to database setup if necessary)
- copy `.env.example` file in analyzer and rename to `.env` (adjust port properties according to backend setup if necessary)

### Running the code locally:

@@ -29,7 +26,7 @@ cd ./apps/analyzer/metadata_analyzer ; poetry install
- the entity files need to be annotated with `@Entity(<table-name>)`
- append the entity file to the `entities` array in `db-config.service.ts`
- run the following command to generate a migration file:
- `nx run metadata-analyzer-backend:migrations:generate --name <migration-name>`
- `nx run metadata-analyzer-backend:migrations:generate --name <migration-name>`
- append the generated file to the `migrations` array in `db-config.service.ts`

### Running tests
41 changes: 32 additions & 9 deletions README.md
@@ -1,31 +1,54 @@
# AMOS Backup Metadata Analyzer


## Prerequisites

Make sure the following are installed on your machine:

- **Node 20**
- **Docker**
- **Docker Compose**

## Setup Instructions
## Docker Build Setup Instructions

1. **Clone the repository**:

```bash
git clone https://github.com/amosproj/amos2024ws02-backup-metadata-analyzer.git

```

2. **Change directory**:

```bash
cd ./amos2024ws02-backup-metadata-analyzer/
cd ./amos2024ws02-backup-metadata-analyzer/

```

3. **Setup .env files**:

```bash
cp .env.docker.example .env.docker
cp apps/backend/.env.example apps/backend/.env
cp .env.docker.example .env.docker

```

4. **Copy database dump into project**:

Copy the database dump .dmp file in the projects root folder and rename it to **db_dump.sql**

5. **Clean Docker node_modules**:

4. **Docker compose up**:
```bash
docker-compose --env-file .env.docker up --build
docker volume rm amos2024ws02-backup-metadata-analyzer_mono-node-modules
```

6. **Build and start Docker container**:

```bash
docker compose --env-file .env.docker up --build

```

5. **Docker compose down**:
7. **Stop Docker Container**:
```bash
docker-compose --env-file .env.docker down
docker compose --env-file .env.docker down
```
8 changes: 7 additions & 1 deletion apps/analyzer/.env.example
@@ -1,2 +1,8 @@
FLASK_RUN_HOST="localhost"
FLASK_RUN_PORT="8000"
FLASK_RUN_PORT="8000"
BACKEND_URL = "http://localhost:3000/api/"
DATABASE_HOST="localhost"
DATABASE_PORT=5432
DATABASE_USER="postgres"
DATABASE_PASSWORD="postgres"
DATABASE_DATABASE="postgres"
29 changes: 29 additions & 0 deletions apps/analyzer/Dockerfile
@@ -0,0 +1,29 @@
# Use Alpine 3.17, which supports Python 3.11
FROM node:18-alpine3.17

# Install Python 3.11 and other dependencies
RUN apk add --no-cache python3 py3-pip python3-dev gcc musl-dev libffi-dev openssl-dev bash

# Create the virtual environment
RUN python3 -m venv /app/.venv

# Set the virtual environment path
ENV PATH="/app/.venv/bin:$PATH"

# Install Poetry
RUN pip install --no-cache --upgrade pip setuptools && \
pip install poetry

# Copy the dependency files
WORKDIR /app
COPY pyproject.toml poetry.lock ./

# Install dependencies with Poetry
RUN poetry config virtualenvs.create false && \
poetry install --no-interaction --no-ansi

#Copy the remaining code
COPY . .

# Standard command to start the application
#CMD ["/app/.venv/bin/python3", "main.py" , "--host", "0.0.0.0"]
70 changes: 70 additions & 0 deletions apps/analyzer/metadata_analyzer/analyzer.py
@@ -0,0 +1,70 @@
class Analyzer:
def init(database, backend, simple_analyzer, simple_rule_based_analyzer):
Analyzer.database = database
Analyzer.backend = backend
Analyzer.simple_analyzer = simple_analyzer
Analyzer.simple_rule_based_analyzer = simple_rule_based_analyzer

def analyze():
data = list(Analyzer.database.get_results())
converted_data = []

for elem in data:
if elem.data_size is not None:
converted_data.append(Analyzer._convert_result(elem))

result = Analyzer.simple_analyzer.analyze(converted_data)

return result

# Convert a result from the database into the format used by the backend
def _convert_result(result):
return {
"id": result.uuid,
"sizeMB": result.data_size / 1_000_000,
"creationDate": result.start_time.isoformat(),
}

def update_data():
results = list(Analyzer.database.get_results())

# Batch the api calls to the backend for improved efficiency
batch = []
count = 0
for result in results:
# Only send 'full' backups
if result.fdi_type != "F":
continue

# Only send backups where the relevant data is not null
if result.data_size is None or result.start_time is None:
continue

batch.append(Analyzer._convert_result(result))
count += 1

# Send a full batch
if len(batch) == 100:
Analyzer.backend.send_backup_data_batched(batch)
batch = []

# Send the remaining results
if len(batch) > 0:
Analyzer.backend.send_backup_data_batched(batch)

return {"count": count}

def simple_rule_based_analysis(alert_limit):
data = list(Analyzer.database.get_results())
result = Analyzer.simple_rule_based_analyzer.analyze(data, alert_limit)
return result

def simple_rule_based_analysis_diff(alert_limit):
data = list(Analyzer.database.get_results())
result = Analyzer.simple_rule_based_analyzer.analyze_diff(data, alert_limit)
return result

def simple_rule_based_analysis_inc(alert_limit):
data = list(Analyzer.database.get_results())
result = Analyzer.simple_rule_based_analyzer.analyze_inc(data, alert_limit)
return result
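The loop in `update_data` accumulates converted results and flushes a batch every 100 items, with a final flush for the remainder. The same chunking logic can be sketched standalone (helper name is an assumption for illustration):

```python
def chunk_batches(items, batch_size=100):
    """Yield successive batches of at most batch_size items,
    mirroring the batching loop in Analyzer.update_data."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    # Flush whatever is left after the loop, like the final
    # send_backup_data_batched call for a partial batch.
    if batch:
        yield batch

batches = list(chunk_batches(range(250), batch_size=100))
print([len(b) for b in batches])  # → [100, 100, 50]
```

Batching reduces the number of HTTP round trips to the backend from one per result to one per 100 results.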
15 changes: 15 additions & 0 deletions apps/analyzer/metadata_analyzer/backend.py
@@ -0,0 +1,15 @@
import requests

class Backend:
def __init__(self, backend_url):
self.backend_url = backend_url

def send_backup_data_batched(self, batch):
url = self.backend_url + "backupData/batched"
r = requests.post(url, json=batch)
r.raise_for_status()

def create_alert(self, alert):
url = self.backend_url + "alerting"
r = requests.post(url, json=alert)
r.raise_for_status()
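`Backend` builds each endpoint URL by appending a path to the configured base URL, so `BACKEND_URL` must end with a trailing slash. A sketch of the resulting URLs, assuming the backend runs at `http://localhost:3000/api/` (no request is sent here; the class below is a hypothetical mirror of the URL construction only):

```python
class BackendUrls:
    """Illustrative mirror of Backend's URL construction."""

    def __init__(self, backend_url):
        # backend_url is expected to end with "/", as in BACKEND_URL above.
        self.backend_url = backend_url

    def batched_url(self):
        return self.backend_url + "backupData/batched"

    def alerting_url(self):
        return self.backend_url + "alerting"

urls = BackendUrls("http://localhost:3000/api/")
print(urls.batched_url())  # → http://localhost:3000/api/backupData/batched
```

With this convention, a missing trailing slash in `BACKEND_URL` would silently produce a wrong path such as `...apibackupData/batched`, so the `.env` examples keep the slash.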