0.3.1 - Patch (#57)
* ➕ Added utility bash script to automatically purge all trainer/exporter related files.

Signed-off-by: Bey Hao Yun <[email protected]>

* ➕ 📖 Updated CHANGELOG.rst with EPD v0.3.1-Patch.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Fix for potentially faulty default p2_train_verification.json and p3_train_verification.json.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Fix for P3 training misprints when the Validate Training and other Train GUI window features are used.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Updated .gitignore to ignore the latest versions of the local transient p3_exporter, p2_exporter, p3_trainer and p2_trainer folders.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Fix for failing GUI_CI linting.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Updated unit testing for CountingWindow GUI feature. Fix for failing GUI_CI Dynamic Analysis.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Replaced all instances of the deprecated squeezenet and imagenet_classes.txt with MaskRCNN-10 and coco_classes.txt.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔥 Removed false-negative unit test case test_setDataset_TrainWindow.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Reinforced more stringent checks for the required MSCOCO-formatted training dataset folder when the Choose Dataset GUI feature is used.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Reinforced checks on the user-provided training dataset for the conformDatasetToCOCO function call. Unit tests yet to be written.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Fix for failing GUI_CI linting.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔥 🔨 Transferred unit testing for docker pull operations in P3Trainer and P2Trainer GUI features to local unit testing for GPU-specific GUI operations.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Added a good-to-have visual indicator showing that the provided training dataset is valid.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Fix for purge_trainer_exporter bash script. Adhered to shellcheck linting for bash.

Signed-off-by: Bey Hao Yun <[email protected]>

* Added copyright to trainer/training_files/scripts/copy_op.bash and adhered to shellcheck linting standards.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Added copyright and made trainer/training_files bash scripts adhere to shellcheck linting standards.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Added copyright and made trainer/exporter_files bash scripts adhere to shellcheck linting standards.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Added sudo requirement for all docker command operations in deploy.sh and kill.sh when using the Deploy GUI feature.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Propagated the sudo requirement for docker operations and converted all print statements to the logging system.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Adhered to GUI_CI PEP8 linting standards.

Signed-off-by: Bey Hao Yun <[email protected]>

* ➕ Added localized log folder for EPD GUI.

Signed-off-by: Bey Hao Yun <[email protected]>

* 🔨 Further fix for failing GUI_CI GitHub Action.

Signed-off-by: Bey Hao Yun <[email protected]>

Signed-off-by: Bey Hao Yun <[email protected]>
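
For context, the purge utility mentioned in the first bullet is not reproduced in this diff. Below is a minimal, hedged sketch of what such a script could look like; the folder names come from the .gitignore diff further down and the image names from test_gui.py, while everything else (including the assumption that images are removed with docker rmi) is illustrative rather than taken from this commit.

#!/usr/bin/env bash
# Hypothetical sketch of the purge utility described above -- not the actual script in this commit.
set -euo pipefail

cd easy_perception_deployment/gui
# Remove the local transient trainer/exporter folders (names from the .gitignore diff).
rm -rf p2_trainer p3_trainer p2_exporter p3_exporter

# Remove trainer/exporter Docker images if present (image names from test_gui.py).
for image in cardboardcode/epd-trainer:latest cardboardcode/epd-exporter:latest; do
    sudo docker rmi "$image" 2>/dev/null || true
done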
cardboardcode authored Aug 24, 2022
1 parent fc2356f commit 9cc23ae
Showing 33 changed files with 1,068 additions and 488 deletions.
13 changes: 0 additions & 13 deletions .github/workflows/gui_ci_action.yml
@@ -30,19 +30,6 @@ jobs:
pip install pytest==6.0.1
pip install pytest-qt==3.3.0
pip install pyyaml==5.3.1
sudo apt-get install -y \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io
- name: Static Analysis
run: |
pycodestyle --show-source easy_perception_deployment/gui/main.py \
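
With the Docker installation steps removed from this workflow, the Docker-dependent trainer/exporter tests are expected to run on a local machine instead (see the commit message above). A hedged sketch of the local prerequisites follows; the package versions are copied from the workflow step above, everything else is illustrative.

# Assumed local prerequisites for the GPU-specific tests moved out of CI.
docker --version     # Docker must already be installed locally
nvidia-smi           # an NVIDIA driver is assumed for GPU-specific tests
pip install pytest==6.0.1 pytest-qt==3.3.0 pyyaml==5.3.1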
6 changes: 4 additions & 2 deletions .gitignore
@@ -2,8 +2,10 @@
easy_perception_deployment/build/*
easy_perception_deployment/log/*
easy_perception_deployment/install/*
easy_perception_deployment/gui/trainer/P2TrainFarm/*
easy_perception_deployment/gui/trainer/P3TrainFarm/*
easy_perception_deployment/gui/p2_trainer/
easy_perception_deployment/gui/p3_trainer/
easy_perception_deployment/gui/p2_exporter/
easy_perception_deployment/gui/p3_exporter/
epd_msgs/build/*
epd_msgs/log/*
epd_msgs/install/*
15 changes: 15 additions & 0 deletions easy_perception_deployment/CHANGELOG.rst
@@ -66,3 +66,18 @@ Changelog for package easy_perception_deployment
* Removed "Use Case =" label in Deploy GUI window. Lengthened Visualize/Action toggle for neater UI appearance.
* Added custom_dataset image collation with test label_list for GPU local reproducible unit-testing.
* Contributor(s): Bey Hao Yun

0.3.1 (2022-08-20)
------------------
* Added utility bash script to allow users to purge all trainer/exporter related files, folders, docker images and containers.
* Fix for potentially faulty default p3_train_verification.json and p2_train_verification.json.
* Fix for confusing GUI Training terminal misprints that occur whenever validating datasets.
* Elaborated checks for COCO-formatted training datasets when using the Generate Dataset GUI feature.
* Split GUI train button into train and export buttons for more modular operations and thus easier debugging.
* Implemented logging system for GUI.
* Shifted Validate Dataset feature to run automatically once Choose Dataset has been called.
* Renamed Validate Dataset to Validate Training.
* Combined initModel into setModel in Train.py GUI Window features.
* Removed overlooked debug statements in Counting GUI Window.
* Modified Counting GUI window to close when writeToUseCaseConfig function is called.
* Contributor(s): Bey Hao Yun
4 changes: 3 additions & 1 deletion easy_perception_deployment/config/p2_train_verification.json
@@ -1,6 +1,8 @@
{
"isTrainFarmDockerImagePulled": false,
"isTrainFarmDockerContainerCreated": false,
"isTrainDependenciesInstalled": false,
"isExporterDockerImagePulled": false,
"isExportDockerContainerCreated": false,
"isExportDependenciesInstalled": false
}
}
4 changes: 3 additions & 1 deletion easy_perception_deployment/config/p3_train_verification.json
@@ -1,6 +1,8 @@
{
"isTrainFarmDockerImagePulled": false,
"isTrainFarmDockerContainerCreated": false,
"isTrainDependenciesInstalled": false,
"isExporterDockerImagePulled": false,
"isExportDockerContainerCreated": false,
"isExportDependenciesInstalled": false
}
}
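
These verification files track which trainer/exporter setup steps have already completed, and the fix above restores their default all-false state. Purely as an illustration (jq is not part of this repository's tooling as far as this diff shows), the defaults could be reset from the shell like so, with the key names taken from the JSON above:

# Illustrative reset of every verification flag to false using jq (assumes jq is installed).
for f in easy_perception_deployment/config/p2_train_verification.json \
         easy_perception_deployment/config/p3_train_verification.json; do
    jq 'with_entries(.value = false)' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done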
20 changes: 9 additions & 11 deletions easy_perception_deployment/gui/scripts/deploy.sh
@@ -10,7 +10,7 @@ showImage=$2

# Check if Docker is installed.
# If not installed, install it.
if output=$(docker --version > /dev/null 2>&1); then
if output=$(sudo docker --version > /dev/null 2>&1); then
:
else
echo "Installing Docker..."
@@ -41,35 +41,35 @@ echo "Docker [ FOUND ]"
if [ "$useCPU" = True ] ; then
# Check if epd-foxy-base:CPU Docker Image has NOT been built.
# If true, build it.
if output=$(docker images | grep cardboardcode/epd-foxy-base | grep CPU > /dev/null 2>&1); then
if output=$(sudo docker images | grep cardboardcode/epd-foxy-base | grep CPU > /dev/null 2>&1); then
echo "epd-foxy-base:CPU Docker Image [ FOUND ]"
else
# If there is internet connection,
# Download public docker image.
wget -q --spider http://google.com
if [ $? -eq 0 ]; then
docker pull cardboardcode/epd-foxy-base:CPU
sudo docker pull cardboardcode/epd-foxy-base:CPU
# Otherwise, build locally
else
docker build --tag cardboardcode/epd-foxy-base:CPU ../../Dockerfiles/CPU/
sudo docker build --tag cardboardcode/epd-foxy-base:CPU ../../Dockerfiles/CPU/
fi
echo "epd-foxy-base:CPU Docker Image [ CREATED ]"
fi

else
# Check if epd-foxy-base:GPU Docker Image has NOT been built.
# If true, build it.
if output=$(docker images | grep cardboardcode/epd-foxy-base | grep GPU > /dev/null 2>&1); then
if output=$(sudo docker images | grep cardboardcode/epd-foxy-base | grep GPU > /dev/null 2>&1); then
echo "epd-foxy-base:GPU Docker Image [ FOUND ]"
else
# If there is internet connection,
# Download public docker image.
wget -q --spider http://google.com
if [ $? -eq 0 ]; then
docker pull cardboardcode/epd-foxy-base:GPU
sudo docker pull cardboardcode/epd-foxy-base:GPU
# Otherwise, build locally
else
docker build --tag cardboardcode/epd-foxy-base:GPU ../../Dockerfiles/GPU/
sudo docker build --tag cardboardcode/epd-foxy-base:GPU ../../Dockerfiles/GPU/
fi
echo "epd-foxy-base:GPU Docker Image [ CREATED ]"
fi
@@ -102,18 +102,16 @@ elif [[ $input == "n" ]]; then
fi

if [ "$useCPU" = True ] ; then
docker run -it --rm \
sudo docker run -it --rm \
--name epd_test_container \
-v $(pwd):/root/epd_ros2_ws/src/easy_perception_deployment \
-u 0 \
cardboardcode/epd-foxy-base:CPU \
$launch_script
else
docker run -it --rm \
sudo docker run -it --rm \
--name epd_test_container \
-v $(pwd):/root/epd_ros2_ws/src/easy_perception_deployment \
--gpus all \
-u 0 \
cardboardcode/epd-foxy-base:GPU \
$launch_script
fi
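
For reference, a hedged usage example of the updated deploy script follows. The positional-argument meanings are inferred from useCPU and showImage=$2 in the hunks above and are not documented in this diff.

# Assumed invocation of the updated deploy script (argument order inferred, not confirmed).
cd easy_perception_deployment/gui/scripts
bash deploy.sh True True     # CPU image, visualization on
bash deploy.sh False False   # GPU image, headless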
2 changes: 1 addition & 1 deletion easy_perception_deployment/gui/scripts/kill.sh
@@ -1,4 +1,4 @@
#!/usr/bin/env bash

pkill showimage
docker stop epd_test_container -t 1
sudo docker stop epd_test_container -t 1
208 changes: 6 additions & 202 deletions easy_perception_deployment/gui/test_gui.py
@@ -38,8 +38,8 @@
p2.communicate()

dict = {
"path_to_model": './data/model/squeezenet1.1-7.onnx',
"path_to_label_list": './data/label_list/imagenet_classes.txt',
"path_to_model": './data/model/MaskRCNN-10.onnx',
"path_to_label_list": './data/label_list/coco_classes.txt',
"visualizeFlag": 'visualize',
"useCPU": 'CPU'
}
@@ -151,9 +151,9 @@ def test_invalidSession_invalidUseCase_DeployWindow(qtbot):

def test_validSession_validUseCase_DeployWindow(qtbot):

local_path_to_model = './data/model/squeezenet1.1-7.onnx'
local_path_to_model = './data/model/MaskRCNN-10.onnx'
local_path_to_label_list = ('./data/label_list/' +
'imagenet_classes.txt')
'coco_classes.txt')

dict = {
"path_to_model": local_path_to_model,
@@ -357,16 +357,6 @@ def test_setLabelList_TrainWindow(qtbot):
assert widget._is_labellist_linked is True


def test_setDataset_TrainWindow(qtbot):

widget = TrainWindow(True)
qtbot.addWidget(widget)

qtbot.mouseClick(widget.dataset_button, QtCore.Qt.LeftButton)

assert widget._is_dataset_linked is True


def test_setMax_Iteration(qtbot):

widget = TrainWindow(True)
@@ -415,37 +405,7 @@ def test_setSteps_Period(qtbot):
assert widget.steps == '(100, 200, 300)'


def test_conformDatasetToCOCO_TrainWindow(qtbot):

if not os.path.exists('../data/datasets/p2p3_dummy_dataset'):
p1 = subprocess.Popen([
'mkdir',
'-p',
'../data/datasets/p2p3_dummy_dataset/train_dataset'])
p1.communicate()
p2 = subprocess.Popen([
'mkdir',
'-p',
'../data/datasets/p2p3_dummy_dataset/val_dataset'])
p2.communicate()

widget = TrainWindow(True)
qtbot.addWidget(widget)

qtbot.mouseClick(widget.generate_button, QtCore.Qt.LeftButton)

assert widget.label_train_process is not None
widget.label_train_process.kill()
assert widget.label_val_process is not None
widget.label_val_process.kill()

# Clean up test materials.
if os.path.exists('../data/datasets/p2p3_dummy_dataset'):
p3 = subprocess.Popen([
'rm',
'-r',
'../data/datasets/p2p3_dummy_dataset'])
p3.communicate()
# def test_conformDatasetToCOCO_TrainWindow(qtbot):


def test_writeToUseCaseConfig_CountingWindow(qtbot):
@@ -467,7 +427,7 @@ def test_writeToUseCaseConfig_CountingWindow(qtbot):

assert usecase_mode == 1
assert class_list[0] == 'person'
assert widget.isVisible() is True
assert widget.isVisible() is False


def test_addObject_CountingWindow():
@@ -571,159 +531,3 @@ def test_P3Trainer_Training_Config(qtbot):
assert dict['SOLVER']['CHECKPOINT_PERIOD'] == 100
assert dict['SOLVER']['TEST_PERIOD'] == 100
assert dict['SOLVER']['STEPS'] == '(100, 200, 300)'


def test_P3Trainer_pullTrainFarmDockerImage(qtbot):

path_to_dataset = 'path_to_dummy_dataset'
model_name = 'maskrcnn'
label_list = ['__ignore__', '_background_', 'teabox']
_TRAIN_DOCKER_IMG = "cardboardcode/epd-trainer:latest"

widget = TrainWindow(True)
qtbot.addWidget(widget)

widget.max_iteration = 100
widget.checkpoint_period = 100
widget.test_period = 100
widget.steps = '(100, 200, 300)'

p3_trainer = P3Trainer(
path_to_dataset,
model_name,
label_list,
100,
100,
100,
'(100, 200, 300)')

p3_trainer.pullTrainFarmDockerImage()

cmd = ["docker", "inspect", "--type=image", _TRAIN_DOCKER_IMG]

docker_inspect_process = subprocess.Popen(
cmd,
universal_newlines=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
env=None)
docker_inspect_process.communicate()

assert docker_inspect_process.returncode == 0


def test_P3Trainer_pullExporterDockerImage(qtbot):

path_to_dataset = 'path_to_dummy_dataset'
model_name = 'maskrcnn'
label_list = ['__ignore__', '_background_', 'teabox']
_EXPORT_DOCKER_IMG = "cardboardcode/epd-exporter:latest"

widget = TrainWindow(True)
qtbot.addWidget(widget)

widget.max_iteration = 100
widget.checkpoint_period = 100
widget.test_period = 100
widget.steps = '(100, 200, 300)'

p3_trainer = P3Trainer(
path_to_dataset,
model_name,
label_list,
100,
100,
100,
'(100, 200, 300)')

p3_trainer.pullExporterDockerImage()

cmd = ["docker", "inspect", "--type=image", _EXPORT_DOCKER_IMG]

docker_inspect_process = subprocess.Popen(
cmd,
universal_newlines=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
env=None)
docker_inspect_process.communicate()

assert docker_inspect_process.returncode == 0


def test_P2Trainer_pullTrainFarmDockerImage(qtbot):

path_to_dataset = 'path_to_dummy_dataset'
model_name = 'fasterrcnn'
label_list = ['__ignore__', '_background_', 'teabox']
_TRAIN_DOCKER_IMG = "cardboardcode/epd-trainer:latest"

widget = TrainWindow(True)
qtbot.addWidget(widget)

widget.max_iteration = 100
widget.checkpoint_period = 100
widget.test_period = 100
widget.steps = '(100, 200, 300)'

p2_trainer = P2Trainer(
path_to_dataset,
model_name,
label_list,
100,
100,
100,
'(100, 200, 300)')

p2_trainer.pullTrainFarmDockerImage()

cmd = ["docker", "inspect", "--type=image", _TRAIN_DOCKER_IMG]

docker_inspect_process = subprocess.Popen(
cmd,
universal_newlines=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
env=None)
docker_inspect_process.communicate()

assert docker_inspect_process.returncode == 0


def test_P2Trainer_pullExporterDockerImage(qtbot):

path_to_dataset = 'path_to_dummy_dataset'
model_name = 'fasterrcnn'
label_list = ['__ignore__', '_background_', 'teabox']
_EXPORT_DOCKER_IMG = "cardboardcode/epd-exporter:latest"

widget = TrainWindow(True)
qtbot.addWidget(widget)

widget.max_iteration = 100
widget.checkpoint_period = 100
widget.test_period = 100
widget.steps = '(100, 200, 300)'

p2_trainer = P2Trainer(
path_to_dataset,
model_name,
label_list,
100,
100,
100,
'(100, 200, 300)')

p2_trainer.pullExporterDockerImage()

cmd = ["docker", "inspect", "--type=image", _EXPORT_DOCKER_IMG]

docker_inspect_process = subprocess.Popen(
cmd,
universal_newlines=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
env=None)
docker_inspect_process.communicate()

assert docker_inspect_process.returncode == 0
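
Since the docker-pull test cases above were removed from the shared suite and are now meant to be exercised locally on a GPU machine (per the commit message), a minimal sketch of running the remaining GUI tests locally follows; it assumes the dependencies from the CI workflow are already installed and that commands are issued from the repository root.

# Illustrative local run of the GUI test suite.
cd easy_perception_deployment/gui
python3 -m pytest test_gui.py -v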