Given an SDS stream and its corresponding stream of members, this processor writes everything into a supported data storage system. Currently, only MongoDB instances are supported.
SDS stream updates are stored in MongoDB collections, where the LDES server can find this information when serving requests.
An example of an SDS data stream with a predefined fragmentation strategy is shown next:
```turtle
# Member ex:sample1 exists
ex:sample1 a ex:Object;
  ex:x "2";
  ex:y "5".

# <bucketizedStream> contains this member, and this member is part of bucket <bucket2>
[] sds:stream <bucketizedStream>;
  sds:payload ex:sample1;
  sds:bucket <bucket2>.

# <bucket1> has a relation to <bucket2>
<bucket1> sds:relation [
  sds:relationType tree:GreaterThanRelation ;
  sds:relationBucket <bucket2> ;
  sds:relationValue 1;
  sds:relationPath ex:x
] .
```
With this information, the member's data is stored in a MongoDB collection, and the required relations are stored in the database as well.
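As a sketch of what ends up in the database, the member and the relation from the example above might be stored as documents shaped roughly like the ones below. The interfaces and field names here are illustrative assumptions, not the processor's actual schema:

```typescript
// Hypothetical document shapes for the stored member data and
// bucket relations. Field names are assumptions for illustration;
// the processor's real schema may differ.

interface DataDocument {
  id: string;   // member IRI
  data: string; // serialized member quads
}

interface RelationDocument {
  from: string;  // source bucket
  to: string;    // target bucket
  type: string;  // tree relation type
  value: string; // sds:relationValue
  path: string;  // sds:relationPath
}

// Assuming ex: expands to http://example.org/ in the snippet above.
const member: DataDocument = {
  id: "http://example.org/sample1",
  data: '<http://example.org/sample1> <http://example.org/x> "2" .',
};

const relation: RelationDocument = {
  from: "bucket1",
  to: "bucket2",
  type: "https://w3id.org/tree#GreaterThanRelation",
  value: "1",
  path: "http://example.org/x",
};

console.log(member.id, relation.type);
```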
As an RDF-Connect processor
This repository exposes the following RDF-Connect processors:
This processor can be used within data processing pipelines to write SDS streams into a MongoDB instance. The processor can be configured as follows:
```turtle
@prefix : <https://w3id.org/conn#>.
@prefix js: <https://w3id.org/conn/js#>.
@prefix sh: <http://www.w3.org/ns/shacl#>.

[ ] a js:Ingest;
  js:dataInput <inputDataReader>;
  js:metadataInput <inputMetadataReader>;
  js:database [
    js:url <http://myLDESView.org>;
    js:metadata "METADATA";
    js:data "DATA";
    js:index "INDEX";
  ].
```
The library exposes one function, `ingest`, which handles everything:

```typescript
async function ingest(
  data: Stream<string | RDF.Quad[]>,
  metadata: Stream<string | RDF.Quad[]>,
  database: DBConfig,
) { /* snip */ }
```
Arguments:

- `data`: a stream reader that carries data (as `string` or `RDF.Quad[]`).
- `metadata`: a stream reader that carries SDS metadata (as `string` or `RDF.Quad[]`).
- `database`: connection parameters for a reachable MongoDB instance.
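For illustration, the `DBConfig` argument plausibly mirrors the `js:database` block in the processor configuration above. The field names below are an assumption inferred from that snippet; check the library's actual type definition before relying on them:

```typescript
// Hypothetical shape of the DBConfig argument, mirroring the
// js:database block of the RDF-Connect configuration.
// Field names are assumptions, not the library's confirmed API.
interface DBConfig {
  url: string;      // MongoDB connection URL
  metadata: string; // collection holding SDS metadata
  data: string;     // collection holding member data
  index: string;    // collection holding bucket relations
}

const config: DBConfig = {
  url: "mongodb://localhost:27017/ldes",
  metadata: "METADATA",
  data: "DATA",
  index: "INDEX",
};

console.log(config.url);
```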
Arthur Vercruysse [email protected] Julián Rojas [email protected]
© Ghent University - IMEC. MIT licensed