v0.4.0 (#304)
* EdgeX Transformer (#271)

* added transformer component

Signed-off-by: rodalynbarce <[email protected]>

* added edgeX test

Signed-off-by: JamesKnBr <[email protected]>

* update test

Signed-off-by: JamesKnBr <[email protected]>

* test updates

Signed-off-by: rodalynbarce <[email protected]>

* added kafka destination write format

Signed-off-by: JamesKnBr <[email protected]>

* component update

Signed-off-by: rodalynbarce <[email protected]>

* removed timestamp format

Signed-off-by: JamesKnBr <[email protected]>

* updated docs

Signed-off-by: JamesKnBr <[email protected]>

---------

Signed-off-by: rodalynbarce <[email protected]>
Signed-off-by: JamesKnBr <[email protected]>
Co-authored-by: rodalynbarce <[email protected]>

* Feature/00264 (#272)

* EdgeX Transformer (#271)

* added transformer component

Signed-off-by: rodalynbarce <[email protected]>

* added edgeX test

Signed-off-by: JamesKnBr <[email protected]>

* update test

Signed-off-by: JamesKnBr <[email protected]>

* test updates

Signed-off-by: rodalynbarce <[email protected]>

* added kafka destination write format

Signed-off-by: JamesKnBr <[email protected]>

* component update

Signed-off-by: rodalynbarce <[email protected]>

* removed timestamp format

Signed-off-by: JamesKnBr <[email protected]>

* updated docs

Signed-off-by: JamesKnBr <[email protected]>

---------

Signed-off-by: rodalynbarce <[email protected]>
Signed-off-by: JamesKnBr <[email protected]>
Co-authored-by: rodalynbarce <[email protected]>
Signed-off-by: Shivam Saxena <[email protected]>

* Adding support for MISO Sources

Signed-off-by: Shivam Saxena <[email protected]>

* Fixing sonar smells

Signed-off-by: Shivam Saxena <[email protected]>

* Fixing sonar smells

Signed-off-by: Shivam Saxena <[email protected]>

* Removing ISO Runner file

Signed-off-by: Shivam Saxena <[email protected]>

* Making ISO methods internal

Signed-off-by: Shivam Saxena <[email protected]>

---------

Signed-off-by: rodalynbarce <[email protected]>
Signed-off-by: JamesKnBr <[email protected]>
Signed-off-by: Shivam Saxena <[email protected]>
Co-authored-by: JamesKnBr <[email protected]>
Co-authored-by: rodalynbarce <[email protected]>

* Add Partition Pruning to non broadcast join PCDM Merge (#274)

* Add Partition Pruning on non broadcast join merge

Signed-off-by: GBBBAS <[email protected]>

* Update DropDuplicates

Signed-off-by: GBBBAS <[email protected]>

* Fix Drop Duplicates Logic

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: GBBBAS <[email protected]>

* update docs (#286)

* update docs

Signed-off-by: JamesKnBr <[email protected]>

* linked issues page

Signed-off-by: JamesKnBr <[email protected]>

* changed issues link

Signed-off-by: JamesKnBr <[email protected]>

---------

Signed-off-by: JamesKnBr <[email protected]>
Co-authored-by: JamesKnBr <[email protected]>

* Fix mkdocs colour bug (#291)

* update

Signed-off-by: cching95 <[email protected]>

* fix colour issue on mkdocs

Signed-off-by: cching95 <[email protected]>

* fix colour bug on homepage

Signed-off-by: cching95 <[email protected]>

---------

Signed-off-by: cching95 <[email protected]>

* Add Filter Parameter and TagName field mapping for OPC Publisher Transformer (#296)

* Additional OPC Publisher Logic

Signed-off-by: GBBBAS <[email protected]>

* Add Device ID Filter

Signed-off-by: GBBBAS <[email protected]>

* Update Docs

Signed-off-by: GBBBAS <[email protected]>

* Update to Filter Logic

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: GBBBAS <[email protected]>

* Weather Domain and Data Model (#294)

* added new overview and data model

Signed-off-by: PaveeG <[email protected]>

* adding signoff

Signed-off-by: PaveeG <[email protected]>

* added overview and data model

Signed-off-by: PaveeG <[email protected]>

* Adding signoff

Signed-off-by: PaveeG <[email protected]>

---------

Signed-off-by: PaveeG <[email protected]>

* Refactor for odbc and functions folders (#299)

* Folder Refactor

Signed-off-by: GBBBAS <[email protected]>

* Code smell fixes

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: GBBBAS <[email protected]>

* Pyspark 3.4.0 and delta-spark 2.4.0 support (#297)

* Preview Support of Pyspark 3.4.0

Signed-off-by: GBBBAS <[email protected]>

* Fix for running github actions tests

Signed-off-by: GBBBAS <[email protected]>

* Updates for delta-spark release candidate

Signed-off-by: GBBBAS <[email protected]>

* Update tests

Signed-off-by: GBBBAS <[email protected]>

* Downgrade Turbodbc to include 3.8

Signed-off-by: GBBBAS <[email protected]>

* Add Spark Connect

Signed-off-by: GBBBAS <[email protected]>

* Updates to packages

Signed-off-by: GBBBAS <[email protected]>

* Updates for release of 2.4.0

Signed-off-by: GBBBAS <[email protected]>

* Add hvac back

Signed-off-by: GBBBAS <[email protected]>

* Fixes for tests

Signed-off-by: GBBBAS <[email protected]>

* Remove Whitespace

Signed-off-by: GBBBAS <[email protected]>

* Build sdist and wheel

Signed-off-by: GBBBAS <[email protected]>

* Change micromamba Action

Signed-off-by: GBBBAS <[email protected]>

* Update Tests Workflow

Signed-off-by: GBBBAS <[email protected]>

* Code Smells

Signed-off-by: GBBBAS <[email protected]>

* Sonarqube Bug Fix

Signed-off-by: GBBBAS <[email protected]>

* Update Readme with logo

Signed-off-by: GBBBAS <[email protected]>

* Center table

Signed-off-by: GBBBAS <[email protected]>

* Update readme

Signed-off-by: GBBBAS <[email protected]>

* Folder Refactor

Signed-off-by: GBBBAS <[email protected]>

* Code smell fixes

Signed-off-by: GBBBAS <[email protected]>

* Updates for refactoring spark connector

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: GBBBAS <[email protected]>

* Add OPC UA Transformer parameters (#301)

Signed-off-by: GBBBAS <[email protected]>

* Update for Micromamba Github Actions Parameters (#302)

* Update micromamba parameters

Signed-off-by: GBBBAS <[email protected]>

* Update for Sonarqube

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: GBBBAS <[email protected]>

* Upgrade FastAPI package (#303)

Signed-off-by: GBBBAS <[email protected]>

---------

Signed-off-by: rodalynbarce <[email protected]>
Signed-off-by: JamesKnBr <[email protected]>
Signed-off-by: Shivam Saxena <[email protected]>
Signed-off-by: GBBBAS <[email protected]>
Signed-off-by: cching95 <[email protected]>
Signed-off-by: PaveeG <[email protected]>
Co-authored-by: JamesKnBr <[email protected]>
Co-authored-by: rodalynbarce <[email protected]>
Co-authored-by: IW-SS <[email protected]>
Co-authored-by: rodalynbarce <[email protected]>
Co-authored-by: JamesKnBr <[email protected]>
Co-authored-by: cching95 <[email protected]>
Co-authored-by: PaveeG <[email protected]>
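
Of the changes listed above, "Add Partition Pruning to non broadcast join PCDM Merge (#274)" is the most algorithmic. The commit message does not show the implementation, but the general technique is to derive the partition values present in the incoming batch and add them to the Delta merge condition so that only the affected partitions of the target table are scanned. Below is a minimal, hypothetical sketch of that idea; the table name, the EventDate partition column, and the merge keys are illustrative assumptions, not the actual RTDIP code.

```python
# Hypothetical sketch of a partition-pruned Delta merge for a PCDM EVENTS table.
# Table name, partition column (EventDate) and merge keys are illustrative only.
from delta.tables import DeltaTable
from pyspark.sql import DataFrame, SparkSession


def merge_pcdm_events(spark: SparkSession, source: DataFrame, table_name: str = "pcdm_events") -> None:
    # Collect the partition values present in this micro-batch.
    event_dates = [str(row["EventDate"]) for row in source.select("EventDate").distinct().collect()]
    dates_in = ", ".join(f"'{d}'" for d in event_dates)

    target = DeltaTable.forName(spark, table_name)
    (
        target.alias("target")
        .merge(
            # Drop duplicate keys in the batch before merging (the PR also touches DropDuplicates).
            source.dropDuplicates(["TagName", "EventTime"]).alias("source"),
            # Restricting target.EventDate to the batch's partition values lets Delta
            # prune untouched partitions instead of scanning the whole table when the
            # source is too large for a broadcast join.
            f"target.EventDate IN ({dates_in}) "
            "AND target.TagName = source.TagName "
            "AND target.EventTime = source.EventTime",
        )
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
```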
8 people authored May 30, 2023
1 parent ff33810 commit ea074bc
Showing 132 changed files with 2,160 additions and 412 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/develop.yml
@@ -102,7 +102,7 @@ jobs:
shell: python
- name: Build Wheel
run: |
- python -m build --wheel
+ python -m build
env:
RTDIP_SDK_NEXT_VER: ${{ steps.next_ver.outputs.rtdip_sdk_next_ver }}
- name: Upload Python wheel as artifact
6 changes: 3 additions & 3 deletions .github/workflows/release.yml
@@ -33,7 +33,7 @@ jobs:
python -m pip install --upgrade pip
pip install twine
pip install build
- python -m build --wheel
+ python -m build
env:
RTDIP_SDK_NEXT_VER: ${{ github.ref_name }}
- name: Upload Python wheel as artifact
@@ -109,10 +109,10 @@ jobs:
sudo apt install -y libboost-all-dev
- name: Install Conda environment with Micromamba
- uses: mamba-org/provision-with-micromamba@main
+ uses: mamba-org/setup-micromamba@main
with:
environment-file: environment.yml
- cache-env: true
+ cache-environment: true

- name: Deploy
run: |
12 changes: 6 additions & 6 deletions .github/workflows/sonarcloud_reusable.yml
@@ -47,9 +47,9 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.10"]
pyspark: ["3.3.2"]
delta-spark: ["2.3.0"]
python-version: ["3.11"]
pyspark: ["3.4.0"]
delta-spark: ["2.4.0"]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
@@ -74,14 +74,14 @@
echo $CONDA/bin >> $GITHUB_PATH
- name: Install Conda environment with Micromamba
- uses: mamba-org/provision-with-micromamba@main
+ uses: mamba-org/setup-micromamba@main
with:
environment-file: environment.yml
- extra-specs: |
+ create-args: >-
python=${{ matrix.python-version }}
pyspark=${{ matrix.pyspark }}
delta-spark=${{ matrix.delta-spark }}
- cache-env: true
+ cache-environment: true

- name: Test
run: |
13 changes: 8 additions & 5 deletions .github/workflows/test.yml
@@ -23,17 +23,20 @@ jobs:
run:
shell: bash -l {0}
strategy:
+ fail-fast: false
matrix:
os: [ubuntu-latest]
python-version: ["3.8", "3.9", "3.10"]
pyspark: ["3.3.0", "3.3.1", "3.3.2"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
pyspark: ["3.3.0", "3.3.1", "3.3.2", "3.4.0"]
include:
- pyspark: "3.3.0"
delta-spark: "2.2.0"
- pyspark: "3.3.1"
delta-spark: "2.3.0"
- pyspark: "3.3.2"
delta-spark: "2.3.0"
- pyspark: "3.4.0"
delta-spark: "2.4.0"
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
@@ -56,14 +59,14 @@
echo $CONDA/bin >> $GITHUB_PATH
- name: Install Conda environment with Micromamba
- uses: mamba-org/provision-with-micromamba@main
+ uses: mamba-org/setup-micromamba@main
with:
environment-file: environment.yml
- extra-specs: |
+ create-args: >-
python=${{ matrix.python-version }}
pyspark=${{ matrix.pyspark }}
delta-spark=${{ matrix.delta-spark }}
- cache-env: true
+ cache-environment: true

- name: Test
run: |
20 changes: 12 additions & 8 deletions .vscode/settings.json
@@ -4,26 +4,26 @@
"azureFunctions.projectLanguage": "Python",
"azureFunctions.projectRuntime": "~4",
"python.linting.enabled": true,
"python.linting.pylintEnabled": true,
"python.linting.pylintEnabled": false,
"python.formatting.autopep8Path": "/opt/conda/bin/autopep8",
"python.formatting.yapfPath": "/opt/conda/bin/yapf",
"python.linting.flake8Path": "/opt/conda/bin/flake8",
"python.linting.pycodestylePath": "/opt/conda/bin/pycodestyle",
"python.linting.pydocstylePath": "/opt/conda/bin/pydocstyle",
"python.linting.pylintPath": "/opt/conda/bin/pylint",
// "python.linting.pylintPath": "/opt/conda/bin/pylint",
"python.testing.pytestArgs": [
"--cov=.",
"--cov-report=xml:cov.xml",
"tests",
"-vv"
"-v"
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true,
// "python.testing.cwd": "${workspaceFolder}",
"python.testing.cwd": "${workspaceFolder}",
"python.analysis.extraPaths": ["${workspaceFolder}"],
"terminal.integrated.env.osx":{
"PYTHONPATH": "${workspaceFolder}:${env:PYTHONPATH}"
},
// "terminal.integrated.env.osx":{
// "PYTHONPATH": "${workspaceFolder}:${env:PYTHONPATH}"
// },
"terminal.integrated.env.linux":{
"PYTHONPATH": "${workspaceFolder}:${env:PYTHONPATH}"
},
@@ -33,5 +33,9 @@
"git.alwaysSignOff": true,
"githubPullRequests.ignoredPullRequestBranches": [
"develop"
- ]
+ ],
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter"
},
"python.formatting.provider": "none"
}
6 changes: 6 additions & 0 deletions README.md
@@ -1,11 +1,17 @@
# Real Time Data Ingestion Platform (RTDIP)

<p align="center"><img src=https://raw.githubusercontent.com/rtdip/core/develop/docs/getting-started/images/rtdip-horizontal-color.png alt="rtdip" width=50% height=50%/></p>

<div align="center">

| Branch | Workflow Status | Code Coverage | Vulnerabilities | Bugs |
|--------|-----------------|---------------|----------|------|
| main | [![Main](https://github.com/rtdip/core/actions/workflows/main.yml/badge.svg?branch=main)](https://github.com/rtdip/core/actions/workflows/main.yml) | [![Coverage](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=coverage&branch=main)](https://sonarcloud.io/summary/new_code?id=rtdip_core) | [![Vulnerabilities](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=vulnerabilities&branch=main)](https://sonarcloud.io/summary/new_code?id=rtdip_core) | [![Bugs](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=bugs&branch=main)](https://sonarcloud.io/summary/new_code?id=rtdip_core) |
| develop | [![Develop](https://github.com/rtdip/core/actions/workflows/develop.yml/badge.svg)](https://github.com/rtdip/core/actions/workflows/develop.yml) | [![Coverage](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=coverage&branch=develop)](https://sonarcloud.io/summary/new_code?id=rtdip_core) | [![Vulnerabilities](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=vulnerabilities&branch=develop)](https://sonarcloud.io/summary/new_code?id=rtdip_core) | [![Bugs](https://sonarcloud.io/api/project_badges/measure?project=rtdip_core&metric=bugs&branch=develop)](https://sonarcloud.io/summary/new_code?id=rtdip_core) |
| feature | [![.github/workflows/pr.yml](https://github.com/rtdip/core/actions/workflows/pr.yml/badge.svg)](https://github.com/rtdip/core/actions/workflows/pr.yml) |

+ </div>

This repository contains Real Time Data Ingestion Platform SDK functions and documentation. This README will be a developer guide to understand the repository.

## What is RTDIP SDK?
12 changes: 6 additions & 6 deletions docs/assets/extra.css
@@ -15,12 +15,12 @@
*/

:root {
- --md-primary-fg-color: #4e08c7;
- --md-primary-mg-color: #d445a3;
- --md-accent-fg-color: #bb1fa4;
- --md-primary-bg-color: white;
- --md-primary-text-slate: white;
- --md-primary-bg-slate: #2f303e;
+ --md-primary-fg-color: #4e08c7 !important;
+ --md-primary-mg-color: #d445a3 !important;
+ --md-accent-fg-color: #bb1fa4 !important;
+ --md-primary-bg-color: white !important;
+ --md-primary-text-slate: white !important;
+ --md-primary-bg-slate: #2f303e !important;
}

/* header font colour */
3 changes: 2 additions & 1 deletion docs/blog/rtdip_ingestion_pipelines.md
@@ -127,13 +127,14 @@ Sources are components that connect to source systems and extract data from them
| IoT Hub|*:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:||Q2 2023|
| Kafka|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|Q2 2023|
| Kinesis||:heavy_check_mark:|:heavy_check_mark:||:heavy_check_mark:|Q2 2023|
| IoT Core||:heavy_check_mark:|:heavy_check_mark:||:heavy_check_mark:|Q2 2023|
| SSIP PI Connector||:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|Q2 2023|
| Rest API|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|Q2 2023|
| MongoDB|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|Q3 2023|

*:heavy_check_mark: - target to deliver in the following quarter

There is currently no spark connector for IoT Core. If you know a way to add it as a source component, please raise it by creating an [issue](https://github.com/rtdip/core/blob/develop/CONTRIBUTING.md#issues-guidelines){ target="_blank" } on the GitHub repo.

### Transformers

Transformers are components that perform transformations on data. These will target certain data models and common transformations that sources or destination components require to be performed on data before it can be ingested or consumed.
31 changes: 21 additions & 10 deletions docs/domains/process_control/data_model.md
@@ -38,33 +38,45 @@ erDiagram

### Fledge OPC UA South Plugin

- [Fledge](https://www.lfedge.org/projects/fledge/) provides support for sending data between various data sources and data destinations. The mapping below is for the [OPC UA South Pugin](https://fledge-iot.readthedocs.io/en/latest/plugins/fledge-south-opcua/index.html) that can be sent to message brokers like Kafka, Azure IoT Hub etc.
+ [Fledge](https://www.lfedge.org/projects/fledge/){target=_blank} provides support for sending data between various data sources and data destinations. The mapping below is for the [OPC UA South Pugin](https://fledge-iot.readthedocs.io/en/latest/plugins/fledge-south-opcua/index.html){target=_blank} that can be sent to message brokers like Kafka, Azure IoT Hub etc.

- This mapping is performed by the [RTDIP Fledge to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/fledge_json_to_pcdm.md) and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)
+ This mapping is performed by the [RTDIP Fledge to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/fledge_opcua_json_to_pcdm.md) and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)

| From Data Model | From Field | From Type | To Data Model |To Field| To Type | Mapping Logic |
|------|----|---------|------|------|--------|-----------|
| Fledge OPC UA | Object ID | string | EVENTS| TagName | string | |
| Fledge OPC UA | EventTime | string | EVENTS| EventTime | timestamp | Converted to a timestamp |
- | | | | EVENTS| Status | string | Can be defaulted in [RTDIP Fledge to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/fledge_json_to_pcdm.md) otherwise Null |
+ | | | | EVENTS| Status | string | Can be defaulted in [RTDIP Fledge to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/fledge_opcua_json_to_pcdm.md) otherwise Null |
| Fledge OPC UA | Value | string | EVENTS | Value | dynamic | Converts Value into either a float number or string based on how it is received in the message |

### OPC Publisher

- [OPC Publisher](https://learn.microsoft.com/en-us/azure/industrial-iot/overview-what-is-opc-publisher) connects to OPC UA assets and publishes data to the Microsoft Azure Cloud's IoT Hub.
+ [OPC Publisher](https://learn.microsoft.com/en-us/azure/industrial-iot/overview-what-is-opc-publisher){target=_blank} connects to OPC UA assets and publishes data to the Microsoft Azure Cloud's IoT Hub.

- The mapping below is performed by the [RTDIP OPC Publisher to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/opc_publisher_json_to_pcdm.md) and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)
+ The mapping below is performed by the [RTDIP OPC Publisher to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/opc_publisher_opcua_json_to_pcdm.md) and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)

| From Data Model | From Field | From Type | To Data Model |To Field| To Type | Mapping Logic |
|------|----|---------|------|------|--------|-----------|
- | OPC Publisher | DisplayName | string | EVENTS| TagName | string | |
+ | OPC Publisher | DisplayName | string | EVENTS| TagName | string | From Field can be specified in Component |
| OPC Publisher | SourceTimestamp | string | EVENTS| EventTime | timestamp | Converted to a timestamp |
- | OPC Publisher | StatusCode.Symbol | string | EVENTS| Status | string | Null values can be overriden in the [RTDIP OPC Publisher to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/opc_publisher_json_to_pcdm.md) |
+ | OPC Publisher | StatusCode.Symbol | string | EVENTS| Status | string | Null values can be overriden in the [RTDIP OPC Publisher to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/opc_publisher_opcua_json_to_pcdm.md) |
| OPC Publisher | Value.Value | string | EVENTS | Value | dynamic | Converts Value into either a float number or string based on how it is received in the message |

+ ### EdgeX
+ [EdgeX](https://www.lfedge.org/projects/edgexfoundry/){target=_blank} provides support for sending data between various data sources and data destinations.

+ This mapping is performed by the [RTDIP EdgeX to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/edgex_opcua_json_to_pcdm.md) and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)

+ | From Data Model | From Field | From Type | To Data Model |To Field| To Type | Mapping Logic |
+ |------|----|---------|------|------|--------|-----------|
+ | EdgeX | deviceName | string | EVENTS| TagName | string | |
+ | EdgeX | origin | string | EVENTS| EventTime | timestamp | Converted to a timestamp |
+ | | | | EVENTS| Status | string | Can be defaulted in [RTDIP EdgeX to PCDM Component](../../sdk/code-reference/pipelines/transformers/spark/edgex_opcua_json_to_pcdm.md) otherwise Null |
+ | EdgeX | value | string | EVENTS | Value | dynamic | Converts Value into either a float number or string based on how it is received in the message |

### SSIP PI

- [SSIP PI](https://bakerhughesc3.ai/oai-solution/shell-sensor-intelligence-platform/) connects to Osisoft PI Historians and sends the data to the Cloud.
+ [SSIP PI](https://bakerhughesc3.ai/oai-solution/shell-sensor-intelligence-platform/){target=_blank} connects to Osisoft PI Historians and sends the data to the Cloud.

The mapping below is performed by the RTDIP SSIP PI to PCDM Component and can be used in an [RTDIP Ingestion Pipeline.](../../sdk/pipelines/framework.md)

@@ -73,5 +85,4 @@ The mapping below is performed by the RTDIP SSIP PI to PCDM Component and can be
| SSIP PI | TagName | string | EVENTS| TagName | string | |
| SSIP PI | EventTime | string | EVENTS| EventTime | timestamp | |
| SSIP PI | Status | string | EVENTS| Status | string | |
| SSIP PI | Value | dynamic | EVENTS | Value | dynamic | |
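
To make the EdgeX mapping above concrete, the following is a rough PySpark sketch of the same column-level logic. It is not the actual RTDIP EdgeX to PCDM component; the input schema, the assumption that origin is an epoch value in nanoseconds, and the default Status value are all illustrative.

```python
# Illustrative sketch of the EdgeX -> PCDM EVENTS mapping described in the table above.
# The input column names (deviceName, origin, value) follow the table; everything else is assumed.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def edgex_json_to_pcdm(edgex: DataFrame, default_status: str = "Good") -> DataFrame:
    return edgex.select(
        F.col("deviceName").alias("TagName"),
        # EdgeX "origin" is typically an epoch timestamp in nanoseconds; convert to a Spark timestamp.
        (F.col("origin").cast("long") / 1_000_000_000).cast("timestamp").alias("EventTime"),
        # Status is not present in the EdgeX payload, so a default is applied (otherwise Null).
        F.lit(default_status).alias("Status"),
        # "dynamic" Value: keep the numeric form where the string parses as a float,
        # otherwise fall back to the original string (a single column cannot mix types,
        # so this sketch stores both cases as strings).
        F.coalesce(F.col("value").cast("float").cast("string"), F.col("value")).alias("Value"),
    )
```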