Doc fixes & more metadata in pyproject for PyPI (#147)
* Add more pkg metadata including link to repo & docs

* Add link to repo & other minor doc fixes

* Fix properties in pyproject for poetry

* Fixes in docs to reduce sphinx warnings

The only remaining warnings are related to the metamodels docs.

* Add utilities package to index of docs
dalito authored Aug 19, 2024
1 parent 7405c9c commit 168936d
Showing 17 changed files with 137 additions and 136 deletions.
4 changes: 2 additions & 2 deletions CONTRIBUTING.md
@@ -29,7 +29,7 @@ For running tests, we use `pytest`.
## Discussion

If you run into any issues or find certain functionality not documented/explained properly then feel free to
raise a ticket in the project's [issue tracker](https://github.com/linkml/issues).
raise a ticket in the project's [issue tracker](https://github.com/linkml/schema-automator/issues).
There are issue templates to capture certain types of issues.

## First Time Contributors
@@ -73,7 +73,7 @@ with a 'Do Not Merge' label.
## How to Report a Bug

We recommend making a new ticket for each bug that you encounter while working with KGX. Please be sure to provide
sufficient context for a bug you are reporting. There are [Issue Templates](https://github.com/linkml/issues/new/choose)
sufficient context for a bug you are reporting. There are [Issue Templates](https://github.com/linkml/schema-automator/issues/new/choose)
that you can use as a starting point.

## How to Request an Enhancement
Empty file added docs/_static/.gitkeep
Empty file.
14 changes: 7 additions & 7 deletions docs/cli.rst
@@ -1,24 +1,24 @@
.. cli:
.. _cli:

Command Line
============
Command Line Interface
======================

All Schema Automator functionality is available via the ``schemauto`` command
All Schema Automator functionality is available via the ``schemauto`` command.

Preamble
--------

.. warning ::
.. warning::

Previous versions had specific commands like ``tsv2linkml`` these are now deprecated.
Instead these are now *subcommands* of the main ``schemauto`` command, and have been renamed.

.. note ::
.. note::

we follow the `CLIG <https://clig.dev/>`_ guidelines as far as possible

Main commands
---------
-------------

.. currentmodule:: schema_automator.cli

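For context on the deprecation note in the hunk above, here is a minimal sketch of the old-to-new invocation for a single TSV; the legacy ``tsv2linkml`` argument style and output redirection are assumptions, while ``generalize-csv`` and ``-o`` are taken from the examples later in these docs.

.. code-block:: bash

    # legacy standalone command (deprecated)
    tsv2linkml my_data.tsv > my_schema.yaml
    # current equivalent, as a subcommand of schemauto
    schemauto generalize-csv my_data.tsv -o my_schema.yaml
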
7 changes: 4 additions & 3 deletions docs/index.rst
@@ -1,13 +1,15 @@
LinkML Schema Automator
============================================
=======================

Schema Automator is a toolkit for bootstrapping and automatically enhancing schemas from a variety of sources.

The project is open source (BSD 3-clause license) and hosted on `GitHub <https://github.com/linkml/schema-automator>`_.

Use cases include:

1. Inferring an initial schema or data dictionary from a dataset that is a collection of TSVs
2. Automatically annotating schema elements and enumerations using the BioPortal annotator
3. Importing from a language like RDFS/OWL
3. Importing from a language like RDFS/OWL/SQL

The primary output of Schema Automator is a `LinkML Schema <https://linkml.io/linkml>`_. This can be converted to other
schema frameworks, including:
@@ -23,7 +25,6 @@ schema frameworks, including:
:maxdepth: 3
:caption: Contents:

index
introduction
install
cli
6 changes: 3 additions & 3 deletions docs/install.rst
@@ -1,8 +1,8 @@
Installation
======
============

Direct Installation
------------
-------------------

``schema-automator`` and its components require Python 3.9 or greater.

@@ -17,7 +17,7 @@ To check this works:
schemauto --help
Running via Docker
------------
------------------

You can use the `Schema Automator Docker Container <https://hub.docker.com/r/linkml/schema-automator>`_

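The collapsed portion of this file presumably carries the install command itself; as a hedged sketch, assuming the PyPI package name ``schema-automator`` (consistent with this commit's PyPI metadata work) and using the ``schemauto --help`` check shown above:

.. code-block:: bash

    # install from PyPI, then verify the CLI is on the path
    pip install schema-automator
    schemauto --help
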
14 changes: 6 additions & 8 deletions docs/introduction.rst
@@ -1,15 +1,13 @@
.. _introduction:

Introduction
=======================
============

This is a toolkit that assists with generating and enhancing schemas and data models from a variety
of sources.

The primary end target is a `LinkML <https://linkml.io>`_ schema, but the framework can be used
to generate JSON-Schema, SHACL, SQL DDL etc via the `LinkML Generator <https://linkml.io/linkml/generators>`_ framework.

All functionality is available via a :ref:`cli`. In future there will be a web-based interface.
All functionality is available via a :ref:`CLI <cli>`. In future there will be a web-based interface.
The functionality is also available by using the relevant Python :ref:`packages`.

Generalization from Instance Data
@@ -24,7 +22,7 @@ Generalizers allow you to *bootstrap* a schema by generalizing from existing data
* RDF instance graphs

Importing from alternative modeling frameworks
---------------------------------
----------------------------------------------

See :ref:`importers`

@@ -35,7 +33,7 @@
In future other frameworks will be supported

Annotating schemas
---------------------------------
------------------

See :ref:`annotators`

@@ -46,7 +44,7 @@
* Annotate using Large Language Models (LLMs)

General Utilities
---------------------------------
-----------------

See :ref:`utilitiess`
See :ref:`utilities`

10 changes: 0 additions & 10 deletions docs/metamodels/index.rst
@@ -8,16 +8,6 @@ metamodels in order to define transformations.
:maxdepth: 3
:caption: Contents:

index
cadsr/index
frictionless/index
dosdp/index
fhir/index


Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
4 changes: 1 addition & 3 deletions docs/packages/annotators.rst
@@ -1,7 +1,5 @@
.. annotators:
Annotators
=========
==========

Importers take an existing schema and *annotate* it with information

129 changes: 64 additions & 65 deletions docs/packages/generalizers.rst
@@ -1,7 +1,5 @@
.. generalizers:
Generalizers
=========
============

Generalizers take example data and *generalizes* to a schema

@@ -11,7 +9,7 @@ Generalizers take example data and *generalizes* to a schema
that *semi*-automates the creation of a new schema for you.

Generalizing from a single TSV
-----------------
------------------------------

.. code-block::
@@ -90,65 +88,9 @@ Enums will be automatically inferred:
Lowland Black Spruce:
description: Lowland Black Spruce
Chaining an annotator
-----------------

If you provide an ``--annotator`` option you can auto-annotate enums:

.. code-block::
schemauto generalize-csv \
--annotator bioportal:envo \
tests/resources/NWT_wildfires_biophysical_2016.tsv \
-o wildfire.yaml
.. code-block:: yaml
ecosystem_enum:
from_schema: https://w3id.org/MySchema
permissible_values:
Open Fen:
description: Open Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Treed Fen:
description: Treed Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Black Spruce:
description: Black Spruce
Poor Fen:
description: Poor Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Fen:
description: Fen
meaning: ENVO:00000232
Lowland:
description: Lowland
Upland:
description: Upland
meaning: ENVO:00000182
Bog:
description: Bog
meaning: ENVO:01000534
exact_mappings:
- ENVO:01000535
- ENVO:00000044
- ENVO:01001209
- ENVO:01000527
Lowland Black Spruce:
description: Lowland Black Spruce
The annotation can also be run as a separate step

See :ref:`annotators`
Generalizing from multiple TSVs
------------
-------------------------------

You can use the ``generalize-tsvs`` command to generalize from *multiple* TSVs, with
foreign key linkages auto-inferred.
@@ -217,7 +159,7 @@ slots:
range: string
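
The multi-TSV example itself is collapsed in this view. As a hedged sketch of what an invocation might look like, the file names below are placeholders and only the ``-o`` option is taken from examples elsewhere in these docs:

.. code-block:: bash

    # generalize several related TSVs into one schema
    schemauto generalize-tsvs \
      personinfo.tsv organizations.tsv \
      -o my_schema.yaml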

Generalizing from tables on the web
-----------------
-----------------------------------

You can use ``generalize-htmltable``

@@ -274,12 +216,69 @@ Will generate:
- TWAS P value
Generalizing from JSON
-----------
----------------------

tbw
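
While this section is still to be written, a hedged sketch of a JSON generalization call, assuming the CLI exposes a ``generalize-json`` subcommand analogous to the TSV ones (the subcommand name and file paths are assumptions):

.. code-block:: bash

    # infer a schema from a file of JSON records
    schemauto generalize-json my_records.json -o my_schema.yaml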

Chaining an annotator
---------------------

If you provide an ``--annotator`` option you can auto-annotate enums:

.. code-block::
schemauto generalize-csv \
--annotator bioportal:envo \
tests/resources/NWT_wildfires_biophysical_2016.tsv \
-o wildfire.yaml
.. code-block:: yaml
ecosystem_enum:
from_schema: https://w3id.org/MySchema
permissible_values:
Open Fen:
description: Open Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Treed Fen:
description: Treed Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Black Spruce:
description: Black Spruce
Poor Fen:
description: Poor Fen
meaning: ENVO:00000232
exact_mappings:
- ENVO:00000232
Fen:
description: Fen
meaning: ENVO:00000232
Lowland:
description: Lowland
Upland:
description: Upland
meaning: ENVO:00000182
Bog:
description: Bog
meaning: ENVO:01000534
exact_mappings:
- ENVO:01000535
- ENVO:00000044
- ENVO:01001209
- ENVO:01000527
Lowland Black Spruce:
description: Lowland Black Spruce
The annotation can also be run as a separate step

See :ref:`annotators`

Packages
--------
Packages for generalizing
-------------------------

.. currentmodule:: schema_automator.generalizers

16 changes: 7 additions & 9 deletions docs/packages/importers.rst
@@ -1,5 +1,3 @@
.. importers:
Importers
=========

@@ -15,7 +13,7 @@ Importers are the opposite of `Generators <https://linkml.io/linkml/generators/i
will be created.

Importing from JSON-Schema
---------
--------------------------

The ``import-json-schema`` command can be used:

@@ -24,7 +22,7 @@ The ``import-json-schema`` command can be used:
schemauto import-json-schema tests/resources/model_card.schema.json
Importing from Kwalify
---------
----------------------

The ``import-kwalify`` command can be used:

@@ -33,7 +31,7 @@ The ``import-kwalify`` command can be used:
schemauto import-kwalify tests/resources/test.kwalify.yaml
Importing from OWL
---------
------------------

You can import from a schema-style OWL ontology. This must be in functional syntax

@@ -45,7 +43,7 @@ Use robot to convert ahead of time:
schemauto import-owl schemaorg.ofn
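
The ROBOT conversion step referenced above is collapsed in this view; a hedged sketch of the two-step flow follows, with the ``robot convert`` flags treated as assumptions based on ROBOT's documented usage:

.. code-block:: bash

    # convert the OWL file to functional syntax first
    robot convert --input schemaorg.owl --output schemaorg.ofn
    # then import it as a LinkML schema
    schemauto import-owl schemaorg.ofn
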
Importing from SQL
---------
------------------

You can import a schema from a SQL database

@@ -65,7 +63,7 @@ For example, for the `RNA Central public database <https://rnacentral.org/help/p
schemauto import-sql postgresql+psycopg2://reader:[email protected]:5432/pfmegrnargs
Importing from caDSR
---------
--------------------

caDSR is an ISO-11179 compliant metadata registry. The ISO-11179 conceptual model can be mapped to LinkML. The
canonical mapping maps a CDE onto a LinkML *slot*.
@@ -79,8 +77,8 @@ NCI implements a JSON serialization of ISO-11197. You can import this JSON and c
schemauto import-cadsr "cdes/*.json"
Packages
-------
Packages for importing
----------------------

.. currentmodule:: schema_automator.importers
