How to evaluate Ontop

  1. Modify testcases/config-postgresql.ini or testcases/config-mysql.ini to configure Ontop
  2. Stop any MySQL or PostgreSQL instance running on your local machine
  3. Run the tests:
$ cd testcases
$ python3 -m pip install -r requirements.txt
$ python3 test.py config-mysql.ini 
$ python3 test.py config-postgresql.ini 

R2RML Implementation report

Test the capabilities of your R2RML engine with the R2RML test cases. Use the resources provided in this repository to automatically generate an EARL report with your results. Follow the configuration steps below to include your results in the R2RML implementation report.

Requirements for running the implementation report:

  • Linux-based OS
  • Docker and docker-compose
  • Python
  • Java

RDBMS coverage and properties info:

  • MySQL (port = 3306)
  • PostgreSQL (port = 5432)

Connection properties for any RDBMS are: database = r2rml, user = r2rml, password = r2rml.

For testing purposes, the mapping path is fixed: it is always test-cases/r2rml.ttl
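Putting the ports and connection properties together: the helper below is a hypothetical illustration (the function name and the SQLAlchemy-style URL format are our own, not part of the test suite) of how the fixed credentials map onto a connection string for either back-end.

```python
# Connection parameters shared by both RDBMS back-ends (from this README).
DB = {"database": "r2rml", "user": "r2rml", "password": "r2rml"}
PORTS = {"mysql": 3306, "postgresql": 5432}

def dsn(system: str, host: str = "localhost") -> str:
    """Build a SQLAlchemy-style URL for the chosen back-end (illustrative)."""
    driver = {"mysql": "mysql+pymysql", "postgresql": "postgresql+psycopg2"}[system]
    return (f"{driver}://{DB['user']}:{DB['password']}"
            f"@{host}:{PORTS[system]}/{DB['database']}")

print(dsn("mysql"))       # mysql+pymysql://r2rml:r2rml@localhost:3306/r2rml
print(dsn("postgresql"))  # postgresql+psycopg2://r2rml:r2rml@localhost:5432/r2rml
```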

Steps to include your results in the R2RML implementation report website:

We follow a decentralized approach to query and obtain the results of the R2RML parsers that want to be included in the R2RML implementation report website. In more detail, we use Walder to generate the website by querying the EARL reports provided by R2RML-parser developers. The steps to include your results in the implementation report are:

  1. Have an access point for your results (it could be an LDF server, an RDF dump, a SPARQL endpoint, etc.). As the reports are not very heavy, the easiest way to provide the results is an RDF dump uploaded to a GitHub repository (e.g., https://raw.githubusercontent.com/[YOUR-USER]/[YOUR-REPO]/main/test-cases/results.ttl). We explain how to generate the R2RML test-cases report in the next section.
  2. Fork this repository.
  3. Add the access point in the WALDER config file.
  4. Make a pull request to include the results in the website.
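Once an EARL report is published as an RDF dump, a quick way to inspect it is to tally the test outcomes. The sketch below scans N-Triples with a plain regular expression; the sample triples are invented for illustration (the `earl:outcome` / `earl:passed` terms come from the W3C EARL vocabulary), and this is not how Walder or the report scripts process the data.

```python
import re

# Invented sample N-Triples using the W3C EARL vocabulary (illustrative only).
SAMPLE = """\
_:a <http://www.w3.org/ns/earl#outcome> <http://www.w3.org/ns/earl#passed> .
_:b <http://www.w3.org/ns/earl#outcome> <http://www.w3.org/ns/earl#failed> .
_:c <http://www.w3.org/ns/earl#outcome> <http://www.w3.org/ns/earl#passed> .
"""

def outcome_counts(ntriples: str) -> dict:
    """Count earl:outcome values (passed/failed/...) in an N-Triples dump."""
    counts = {}
    for m in re.finditer(r"earl#outcome>\s+<[^>]*earl#(\w+)>", ntriples):
        counts[m.group(1)] = counts.get(m.group(1), 0) + 1
    return counts

print(outcome_counts(SAMPLE))  # {'passed': 2, 'failed': 1}
```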

Overview of the configuration steps: [figure: configuration steps]

Steps to generate the results from the R2RML test-cases:

  1. Clone or download this repository.
  2. To include the R2RML test cases in the development of your R2RML parser, you can:
    • Copy the complete test-cases folder into your repository (e.g., into a testing folder in the master branch or into a new testing branch).
    • Include the executable file(s) of your engine inside the test-cases folder.
  3. Install the script's requirements: python3 -m pip install -r test-cases/requirements.txt
  4. Modify the test-cases/config.ini file with your information. When configuring your engine, remember that the path of the mapping file is always test-cases/r2rml.ttl. For example:
[tester]
tester_name: David Chaves # tester name
tester_url: https://dchaves.oeg-upm.net/ # tester homepage
tester_contact: [email protected] # tester contact

[engine]
test_date: 2021-01-07 # engine test-date (YYYY-MM-DD)
engine_version: 3.12.5 # engine version
engine_name: Morph-RDB # engine name
engine_created: 2013-12-01 # engine date created (YYYY-MM-DD)
engine_url: https://morph.oeg.fi.upm.es/tool/morph-rdb # URL of the engine (e.g., GitHub repo)


[properties]
database_system: [mysql|postgresql] # choose only one
output_results: ./output.ttl # path to the result graph of your engine
output_format: ntriples # output format of the results from your engine
engine_command: java -jar morph-rdb.jar -p properties.properties # command to run your engine
  5. Run the script: python3 test.py config.ini
  6. Your results will appear in test-cases/results.ttl (RDF) and in test-cases/results.csv (CSV).
  7. Upload or update the obtained results at the access point you provided in the configuration step.
  8. For each new version of your engine, repeat steps 4 to 7.
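Before running the script, the config.ini fields can be sanity-checked with Python's standard configparser. The snippet below is a sketch using an inline copy of the example fields above (section and key names are taken from this README; the inline `#` comments are omitted because configparser does not strip them by default):

```python
import configparser

# Sketch: parse the sections/keys used by the example config.ini above.
CONFIG = """\
[tester]
tester_name: David Chaves

[engine]
engine_name: Morph-RDB
engine_version: 3.12.5

[properties]
database_system: mysql
output_results: ./output.ttl
output_format: ntriples
engine_command: java -jar morph-rdb.jar -p properties.properties
"""

cfg = configparser.ConfigParser()
cfg.read_string(CONFIG)

# Basic sanity checks before running test.py.
assert cfg["properties"]["database_system"] in ("mysql", "postgresql")
print(cfg["engine"]["engine_name"], cfg["properties"]["database_system"])
```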

Overview of the testing steps: [figure: testing steps]