- You must use a terminal application in order to run commands from the TriplyETL CLI. Here are some examples of terminal applications on different operating systems:
+ You must use a terminal application in order to run commands from the TriplyETL CLI. Here are some examples of terminal applications on different operating systems:
- On Windows
- Most Windows versions come with some version of PowerShell preinstalled. You can also follow these instructions by Microsoft to update to the latest version of PowerShell.
@@ -64,7 +64,7 @@ git config --global user.name "Ada Lovelace"
The TriplyETL Generator allows you to create new ETL projects in your terminal application.
-If a TriplyETL project already exists, use the [TriplyETL Runner](/triply-etl/cli/#triplyetl-runner) instead.
+If a TriplyETL project already exists, use the [TriplyETL Runner](../cli/#triplyetl-runner) instead.
In order to use TriplyETL Generator, you must have:
@@ -122,7 +122,7 @@ npx triply-etl-generator
cd my-etl
```
-4. You can now use the [TriplyETL Runner](/triply-etl/cli#triplyetl-runner) to run the ETL:
+4. You can now use the [TriplyETL Runner](../cli#triplyetl-runner) to run the ETL:
```sh
npx etl
@@ -178,7 +178,7 @@ npx etl
At this point, you should see a first TriplyETL process in your terminal application. If this is not the case, please contact [support@triply.cc](mailto:support@triply.cc) to help you out.
-Visit the [TriplyETL CLI documentation](/triply-etl/cli#triplyetl-runner) to learn more about how you can use the TriplyETL Runner. Visit the [TriplyETL CI/CD documentation](/triply-etl/maintenance#configure-cicd) to learn more about how you can automate TriplyETL runs.
+Visit the [TriplyETL CLI documentation](../cli#triplyetl-runner) to learn more about how you can use the TriplyETL Runner. Visit the [TriplyETL CI/CD documentation](../maintenance#configure-cicd) to learn more about how you can automate TriplyETL runs.
diff --git a/docs/triply-etl/maintenance/index.md b/docs/triply-etl/maintenance/index.md
index 5c8f2c52..44bcd66f 100644
--- a/docs/triply-etl/maintenance/index.md
+++ b/docs/triply-etl/maintenance/index.md
@@ -56,9 +56,9 @@ TriplyETL uses the Semantic Versioning approach: `{major}.{minor}.{patch}` The i
- Patch update
- Only the
{patch}
number has increased. This means that one or more bugs have been fixed in a backward compatible manner. You should always be able to perform a patch update without having to make any changes to your configuration.
- Minor update
- - The
{minor}
number has increased, but the {major}
number is still the same. This means that new functionality was added in a backward compatible manner. You should always be able to perform a minor update without having to make any changes to your configuration. But you may want to check the changelog to see which new functionalities were added.
+ - The
{minor}
number has increased, but the {major}
number is still the same. This means that new functionality was added in a backward compatible manner. You should always be able to perform a minor update without having to make any changes to your configuration. But you may want to check the changelog to see which new functionalities were added.
- Major update
- - The
{major}
number has increased. This means that there are incompatible changes. This means that features may have been removed, or existing features may have changed. In such cases, changes to your configuration are almost certainly necessary, and may take some time to implement. Any changes you need to make are described in the changelog.
+ - The
{major}
number has increased. This means that there are incompatible changes. This means that features may have been removed, or existing features may have changed. In such cases, changes to your configuration are almost certainly necessary, and may take some time to implement. Any changes you need to make are described in the changelog.
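The three-part version comparison described above can be sketched in plain TypeScript. This is an illustrative helper (`classifyUpdate` is a hypothetical name, not part of TriplyETL), assuming both versions are well-formed `{major}.{minor}.{patch}` strings:

```ts
// Classifies the difference between two semantic versions as a
// 'patch', 'minor', or 'major' update, checked from left to right.
function classifyUpdate(current: string, next: string): 'major' | 'minor' | 'patch' | 'none' {
  const [maj1, min1, pat1] = current.split('.').map(Number);
  const [maj2, min2, pat2] = next.split('.').map(Number);
  if (maj2 !== maj1) return 'major'; // incompatible changes: check the changelog
  if (min2 !== min1) return 'minor'; // backward compatible new functionality
  if (pat2 !== pat1) return 'patch'; // backward compatible bug fixes
  return 'none';
}
```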
### Perform the update
@@ -101,7 +101,7 @@ TriplyETL pipelines can be configured to run automatically in any CI/CD environm
### CI/CD configuration file
-The [TriplyETL Generator](/triply-etl/getting-started#triplyetl-generator) creates a basic configuration file for running TriplyETL in GitLab CI/CD. The configuration file is called `.gitlab-ci.yml`.
+The [TriplyETL Generator](../getting-started#triplyetl-generator) creates a basic configuration file for running TriplyETL in GitLab CI/CD. The configuration file is called `.gitlab-ci.yml`.
The configuration contains a list of stages:
diff --git a/docs/triply-etl/publish/index.md b/docs/triply-etl/publish/index.md
index 26c0d08c..8edd8f16 100644
--- a/docs/triply-etl/publish/index.md
+++ b/docs/triply-etl/publish/index.md
@@ -150,7 +150,7 @@ The function destination expects that source data is linked data. Copying a sour
## Using TriplyDB.js in TriplyETL
-All operations that can be performed in a TriplyDB instance can be automated with classes and methods in the [TriplyDB.js](/triplydb-js) library. This library is also used by TriplyETL in the background to implement many of the TriplyETL functionalities.
+All operations that can be performed in a TriplyDB instance can be automated with classes and methods in the [TriplyDB.js](../../triplydb-js) library. This library is also used by TriplyETL in the background to implement many of the TriplyETL functionalities.
Sometimes it is useful to use classes and methods in TriplyDB.js directly. This is done in the following way:
@@ -162,7 +162,7 @@ const etl = new Etl()
console.log((await etl.triplyDb.getInfo()).name)
```
-The above example prints the name of the TriplyDB instance. But any other [TriplyDB.js](/triplydb-js) operations can be performed. For example, the user of the current API Token can change their avatar image in TriplyDB:
+The above example prints the name of the TriplyDB instance. But any other [TriplyDB.js](../../triplydb-js) operations can be performed. For example, the user of the current API Token can change their avatar image in TriplyDB:
```ts
@@ -220,7 +220,7 @@ You can also set the `ENV` variable in the GitLab CI/CD environment. This allows
## Upload prefix declarations
-At the end of a TriplyETL script, it is common to upload the [prefix declarations](/triply-etl/declare#declarePrefix) that are configured for that pipeline.
+At the end of a TriplyETL script, it is common to upload the [prefix declarations](../declare#declarePrefix) that are configured for that pipeline.
This is often done directly before or after graphs are uploaded (function [toTriplyDb()](#remote-data-destinations)):
diff --git a/docs/triply-etl/tmp/automation.md b/docs/triply-etl/tmp/automation.md
index 54a81a2e..699c2c31 100644
--- a/docs/triply-etl/tmp/automation.md
+++ b/docs/triply-etl/tmp/automation.md
@@ -18,4 +18,4 @@ TriplyETL runs within a Gitlab CI environment ([Figure 1](#figure-1)).
The special key `$environment` denotes the DTAP environment in which the TriplyETL pipeline is running. This allows special actions to be performed based on whether the pipeline runs in `"Debug"`, `"Test"`, `"Acceptance"`, or `"Production"` mode.
-See the [DTAP documentation](/triply-etl/dtap) for more information.
+See the [DTAP documentation](../../publish/#setting-up-acceptanceproduction-runs-dtap) for more information.
diff --git a/docs/triply-etl/tmp/static-dynamic-statements.md b/docs/triply-etl/tmp/static-dynamic-statements.md
index f20728bd..09c23a99 100644
--- a/docs/triply-etl/tmp/static-dynamic-statements.md
+++ b/docs/triply-etl/tmp/static-dynamic-statements.md
@@ -13,7 +13,7 @@ We use the following Record as an example:
| Germany | 83190556 |
| Netherlands | 17650200 |
-We start with creating the prefix and term declarations (see the [Declare](/triply-etl/declare) documentation for more information):
+We start with creating the prefix and term declarations (see the [Declare](../../declare) documentation for more information):
```ts
const base = declarePrefix('https://triplydb.com/Triply/example/')
diff --git a/docs/triply-etl/transform/index.md b/docs/triply-etl/transform/index.md
index 099e90e2..4a71310a 100644
--- a/docs/triply-etl/transform/index.md
+++ b/docs/triply-etl/transform/index.md
@@ -20,7 +20,7 @@ graph LR
tdb[(Triple Store)]
```
-If you do not have a stream of records yet, read the documentation for the [**Extract** step](/triply-etl/extract) first.
+If you do not have a stream of records yet, read the documentation for the [**Extract** step](../extract) first.
Once you have a stream of records, the following transformations are typically needed:
- Values need to be mapped onto a prepared list of IRIs or literals (e.g. from country names to country-denoting IRIs).
@@ -31,8 +31,8 @@ Once you have a stream of records, the following transformations are typically n
TriplyETL supports the following transformation approaches:
-- 2A. [**RATT**](/triply-etl/transform/ratt) transformations are a set of commonly used transformation functions that are developed and maintained by Triply.
-- 2B. [**TypeScript**](/triply-etl/transform/typescript) can be used to write new customer transformations.
+- 2A. [**RATT**](ratt) transformations are a set of commonly used transformation functions that are developed and maintained by Triply.
+- 2B. [**TypeScript**](typescript) can be used to write new custom transformations.
@@ -40,4 +40,4 @@ TriplyETL supports the following transformation approaches:
The Transform step results in a cleaned and enriched record. The following link documents how you can use the record to make linked data assertions:
-- [3. **Assert**](/triply-etl/assert) uses data from the Record to generate linked data in the Internal Store.
+- [3. **Assert**](../assert) uses data from the Record to generate linked data in the Internal Store.
diff --git a/docs/triply-etl/transform/ratt/index.md b/docs/triply-etl/transform/ratt/index.md
index 316e1cb4..ca7ff8e5 100644
--- a/docs/triply-etl/transform/ratt/index.md
+++ b/docs/triply-etl/transform/ratt/index.md
@@ -4,7 +4,7 @@
RATT transformations are a core set of functions that are commonly used to change the content of TriplyETL Records.
-RATT transformations started out as [TypeScript transformations](/triply-etl/transform/typescript) that turned out to be useful in a wide variety of TriplyETL pipelines. Triply maintains this core set of transformation functions to allow new ETLs to make use of off-the-shelf functionality that has proven useful in the past.
+RATT transformations started out as [TypeScript transformations](../typescript) that turned out to be useful in a wide variety of TriplyETL pipelines. Triply maintains this core set of transformation functions to allow new ETLs to make use of off-the-shelf functionality that has proven useful in the past.
## Overview
@@ -47,7 +47,7 @@ Creates an IRI based on the specified IRI prefix and the hash calculated over th
### Parameters
- `prefix` An IRI, or a key that contains an IRI value.
-- `content` A key that contains a string value, or a string value specified with function [str()](/triply-etl/assert/ratt#function-str).
+- `content` A key that contains a string value, or a string value specified with function `str()`.
- `key` A new key where the created hashed IRI is stored.
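The idea behind a hashed IRI can be illustrated in plain TypeScript. This is a sketch of the general technique, not the TriplyETL implementation (the hash algorithm used by the library may differ; `hashedIri` is a hypothetical name):

```ts
import { createHash } from 'node:crypto';

// A hash is calculated over the content and appended to the IRI prefix,
// so identical content always yields the same IRI.
function hashedIri(prefix: string, content: string): string {
  const hash = createHash('sha256').update(content).digest('hex');
  return prefix + hash;
}
```

Because the hash is deterministic, running the pipeline twice over the same content produces the same IRI, which keeps re-runs idempotent.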
### Use cases
@@ -211,7 +211,7 @@ This transformation can be used in the following two ways:
### See also
-If the created IRI is used exactly once, it is often better to use inline function [iri()](/triply-etl/assert/ratt/term#function-iri) instead.
+If the created IRI is used exactly once, it is often better to use inline function [iri()](../../assert/ratt/term#function-iri) instead.
### Example: Prefix declaration and local name
@@ -242,7 +242,7 @@ graph LR
johndoe(person:johndoe):::data
```
-The following snippet makes the same assertion, but uses assertion [iri()](/triply-etl/assert/ratt/term#function-iri) instead of transformation `addIri()`:
+The following snippet makes the same assertion, but uses assertion [iri()](../../assert/ratt/term#function-iri) instead of transformation `addIri()`:
```ts
triple(iri(prefix.person, 'username'), a, sdo.Person),
@@ -270,7 +270,7 @@ graph LR
johndoe(https://example.com/id/person/johndoe):::data
```
-The following snippet uses assertion [iri()](/triply-etl/assert/ratt/term#function-iri) instead of transformation `addIri()`:
+The following snippet uses assertion [iri()](../../assert/ratt/term#function-iri) instead of transformation `addIri()`:
```ts
triple(iri('https://example.com/id/person/johndoe'), a, sdo.Person),
@@ -292,19 +292,19 @@ This transformation can be used in the following 3 ways:
This transformation is typically used when:
-- The same literal occurs in two or more statement assertions (function [triple()](/triply-etl/assert/ratt/statement#function-triple) or [quad()](/triply-etl/assert/ratt/statement#function-quad)). This avoids having to specify the same literal multiple times using function [literal()](/triply-etl/assert/ratt/term#function-literal).
+- The same literal occurs in two or more statement assertions (function [triple()](../../assert/ratt/statement#function-triple) or [quad()](../../assert/ratt/statement#function-quad)). This avoids having to specify the same literal multiple times using function [literal()](../../assert/ratt/term#function-literal).
- The datatype or language tag is derived from the source data record.
### Parameters
-- `content` A key that contains a string value, or a string specified with function [str()](/triply-etl/assert/ratt/term#function-str).
+- `content` A key that contains a string value, or a string specified with function `str()`.
- `datatype` Optionally, a key that stores an IRI or a static IRI.
-- `languageTag` Optionally, a language tag from the [`lang`](/triply-etl/declare#language-declarations) object, or a key that stores such a language tag.
+- `languageTag` Optionally, a language tag from the [`lang`](../../declare#language-declarations) object, or a key that stores such a language tag.
- `key` A new key where the created literal is stored.
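The shape of the term that ends up under `key` can be sketched in plain TypeScript. This is an illustration of the RDF literal model, not the library's internal representation (`makeLiteral` and the `Literal` type are hypothetical names):

```ts
// An RDF literal combines a lexical form with either a datatype IRI
// or a language tag, but never both.
type Literal = { value: string; datatype?: string; languageTag?: string };

function makeLiteral(content: string, datatype?: string, languageTag?: string): Literal {
  if (datatype && languageTag) {
    throw new Error('A literal has a datatype or a language tag, not both.');
  }
  return { value: content, datatype, languageTag };
}
```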
### See also
-If the created literal is used exactly once, it is often better to use the inline function [literal()](/triply-etl/assert/ratt/term#function-literal) instead.
+If the created literal is used exactly once, it is often better to use the inline function [literal()](../../assert/ratt/term#function-literal) instead.
### Example: Typed literal
@@ -326,7 +326,7 @@ This makes the following linked data assertion:
book:123 sdo:dateCreated '2022-30-01'^^xsd:date.
```
-Notice that the same linked data could have been asserted with the following use the the [literal} assertion middleware:
+Notice that the same linked data could have been asserted with the following use of the [literal()](../../assert/ratt/term#function-literal) assertion middleware:
```ts
fromJson([{ id: '123', date: '2022-01-30' }]),
@@ -381,7 +381,7 @@ This results in the following linked data assertion:
city:London skos:prefLabel 'London'@en-gb.
```
-Notice that the same linked data could have been asserted with the following use the the [literal} assertion middleware:
+Notice that the same linked data could have been asserted with the following use of the [literal()](../../assert/ratt/term#function-literal) assertion middleware:
```ts
fromJson([{ name: 'London' }]),
@@ -462,7 +462,7 @@ The snippet includes the prefix declarations to illustrate that the path of the
const base = 'https://example.com/'
const prefix = {
feature: declarePrefix(base('id/feature/')),
- skolem: declarePrefix(base('.well-known/genid/'),
+ skolem: declarePrefix(base('.well-known/genid/')),
}
// Etc
@@ -655,7 +655,7 @@ An optionally specified separator is placed in between every two consecutive str
### Parameters
-- `content` An array of key that contain a string and/or strings specified with assertion [str()](/triply-etl/assert/ratt#function-str).
+- `content` An array of keys that contain string values, and/or strings specified with assertion [str()](../../assert/ratt#function-str).
- `separator` Optionally, the string that is placed between every two consecutive string values.
- `key` A new key where the concatenated string is stored.
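The concatenation behavior can be sketched in plain TypeScript (this is an illustration of the semantics, not the TriplyETL implementation; `concatStrings` is a hypothetical name):

```ts
// Joins the string values in order, placing the optional separator
// between every two consecutive values.
function concatStrings(content: string[], separator = ''): string {
  return content.join(separator);
}
```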
@@ -809,7 +809,7 @@ Transforms GeoJSON objects to their corresponding Well-Known Text (WKT) serializ
### Parameters
- `content` A key that stores a GeoJSON object.
-- `crs` Optionally, an IRI that denotes a Coordinate Reference System (CRS). You can use IRIs from the [`epsg`](/triply-etl/declare#geospatial-declarations) object. If absent, uses [https://epsg.io/4326](EPSG:4326/WGS84) as the CRS.
+- `crs` Optionally, an IRI that denotes a Coordinate Reference System (CRS). You can use IRIs from the [`epsg`](../../declare#geospatial-declarations) object. If absent, uses [EPSG:4326/WGS84](https://epsg.io/4326) as the CRS.
- `key` A new key where the WKT serialization string is stored.
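The relation between a GeoJSON geometry and its WKT serialization can be illustrated for the simplest case, a point. This is a minimal sketch, not the library implementation (`pointToWkt` is a hypothetical name; the `<crs-iri> POINT (...)` form follows the GeoSPARQL convention for non-default CRSes):

```ts
type GeoJsonPoint = { type: 'Point'; coordinates: [number, number] };

// Serializes a GeoJSON Point as WKT, optionally prefixed with a CRS IRI.
function pointToWkt(geo: GeoJsonPoint, crs?: string): string {
  const wkt = `POINT (${geo.coordinates[0]} ${geo.coordinates[1]})`;
  return crs ? `<${crs}> ${wkt}` : wkt;
}
```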
### GeoJSON and Well-Known Text (WKT)
@@ -982,7 +982,7 @@ Older data formats sometimes use uppercase letters for header names or codes. Th
### Example
-The following snppet starts out with header values that use uppercase characters exclusively. The `lowerCase` transformation is used to create lowercase names that can be used to create property IRIs.
+The following snippet starts out with header values that use uppercase characters exclusively. The `lowerCase` transformation is used to create lowercase names that can be used to create property IRIs.
```ts
fromJson([
@@ -1155,7 +1155,7 @@ Performs a regular expression replacement to the given input string, and stores
### Parameters
-- `content` A key that contains a string value, or a static string specified with assertion [str()](/triply-etl/assert/ratt/term#function-str).
+- `content` A key that contains a string value, or a static string specified with assertion [str()](../../assert/ratt/term#function-str).
- `from` A [JavaScript Regular Expression](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions).
- `to` Optionally, a string that replaces potential matches of the Regular Expression (`from`). Use `$1`, `$2`, etc. to insert matches. If absent, the empty string is used.
- `key` A new key where the result of the replacement is stored.
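The replacement behavior maps directly onto JavaScript's regular-expression replacement, including the `$1`, `$2` capture-group references. A plain-TypeScript sketch (an illustration, not the TriplyETL implementation; `replaceMatches` is a hypothetical name):

```ts
// Applies a regular-expression replacement; when `to` is absent,
// matches are replaced with the empty string.
function replaceMatches(content: string, from: RegExp, to = ''): string {
  return content.replace(from, to);
}
```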
@@ -1205,14 +1205,14 @@ This transformation removes any trailing whitespace that remains after the strin
### Use cases
The transformation is used when:
-- Tablular source data encodes multiple values inside singular cells. (Such concatenated storage inside cells is a data quality issue, because the table format cannot guarantee that the separator character does not (accidentally) occur inside individual values inside a cell. Tree-shaped source formats are able to store multiple values for the same key reliably, e.g. JSON and XML.)
+- Tabular source data encodes multiple values inside singular cells. (Such concatenated storage inside cells is a data quality issue, because the table format cannot guarantee that the separator character does not (accidentally) occur inside individual values inside a cell. Tree-shaped source formats are able to store multiple values for the same key reliably, e.g. JSON and XML.)
- Source data contains complex string values that can be decomposed into stand-alone components with distinct meaning.
### Parameters
-- `content` A key that stores a string, or a string specified with assertion [str()](/triply-etl/assert/ratt#str).
+- `content` A key that stores a string, or a string specified with assertion [str()](../../assert/ratt#str).
- `separator` A string or a regular expression that is used to separate the content.
-- `key` A new key where the array of splitted strings is stored.
+- `key` A new key where the array of split strings is stored.
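The splitting behavior, including the whitespace trimming mentioned above, can be sketched in plain TypeScript (an illustration, not the TriplyETL implementation; `splitValues` is a hypothetical name):

```ts
// Decomposes a cell that stores multiple separator-delimited values
// into an array, trimming surrounding whitespace from each value.
function splitValues(content: string, separator: string | RegExp): string[] {
  return content.split(separator).map(value => value.trim());
}
```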
### Example: Multiple values in singular table cells
@@ -1317,7 +1317,7 @@ result in a new key.
### Parameters
-- `content` A key that stores a string value, or a string specified with assertion [str()](/triply-etl/assert/ratt/term#function-str).
+- `content` A key that stores a string value, or a string specified with assertion [str()](../../assert/ratt/term#function-str).
- `start` The index of the first character that is included in the substring. The first character has index 0.
- `end` Optionally, the index of the first character that is excluded from the substring. If absent, the substring ends at the end of the source string.
- `key` The new key in which the substring is stored.
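The index semantics (inclusive `start`, exclusive optional `end`) match JavaScript's `String.prototype.slice`. A plain-TypeScript sketch (an illustration, not the TriplyETL implementation; `takeSubstring` is a hypothetical name):

```ts
// Returns the substring from `start` (inclusive, first character is
// index 0) up to `end` (exclusive); without `end`, runs to the end.
function takeSubstring(content: string, start: number, end?: number): string {
  return content.slice(start, end);
}
```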
@@ -1456,7 +1456,7 @@ This transformation is used when string values must be mapped onto literals with
The datatype IRIs that could apply are specified in a list. The specified datatype IRIs are tried out from left to right. The first datatype IRI that results in a valid literal is chosen.
-- `content` A key that contains a string value, or a string value specified with assertion [str()](/triply-etl/assert/ratt/term#function-str).
+- `content` A key that contains a string value, or a string value specified with assertion [str()](../../assert/ratt/term#function-str).
- `datatypes` An array of two or more datatype IRIs.
- `key` A new key where the created literal is stored.
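The left-to-right selection logic can be sketched in plain TypeScript. This is an illustration of the idea, not the library's validation machinery (the `checks` table and `pickDatatype` are hypothetical, with deliberately simplified validity tests):

```ts
// Each datatype IRI has a validity check; datatypes are tried from
// left to right and the first one that accepts the string wins.
const checks: Record<string, (s: string) => boolean> = {
  'http://www.w3.org/2001/XMLSchema#date': s => /^\d{4}-\d{2}-\d{2}$/.test(s),
  'http://www.w3.org/2001/XMLSchema#integer': s => /^[+-]?\d+$/.test(s),
  'http://www.w3.org/2001/XMLSchema#string': () => true,
};

function pickDatatype(content: string, datatypes: string[]): string | undefined {
  return datatypes.find(dt => checks[dt]?.(content) ?? false);
}
```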
@@ -1486,7 +1486,7 @@ If we do not want to emit errors for string values that cannot satisfy any of
### See also
-You only need to use `tryLiteral()` if the datatype IRI varies from record to record. If the datatype IRI is the same for every record, then the regular assertion function [literal()](/triply-etl/assert/ratt/term#function-literal) should be used instead.
+You only need to use `tryLiteral()` if the datatype IRI varies from record to record. If the datatype IRI is the same for every record, then the regular assertion function [literal()](../../assert/ratt/term#function-literal) should be used instead.
diff --git a/docs/triply-etl/transform/typescript/index.md b/docs/triply-etl/transform/typescript/index.md
index 109d0e29..6d3fef6f 100644
--- a/docs/triply-etl/transform/typescript/index.md
+++ b/docs/triply-etl/transform/typescript/index.md
@@ -7,7 +7,7 @@ path: "/docs/triply-etl/transform/typescript"
# TypeScript
-The vast majority of ETLs can be written with the core set of [RATT Transformations](/triply-etl/transform/ratt). But sometimes a custom transformation is necessary that cannot be handled by this core set. For such circumstances, TriplyETL allows a custom TypeScript function to be written.
+The vast majority of ETLs can be written with the core set of [RATT Transformations](../ratt). But sometimes a custom transformation is necessary that cannot be handled by this core set. For such circumstances, TriplyETL allows a custom TypeScript function to be written.
Notice that the use of a custom TypeScript function should be somewhat uncommon. The vast majority of real-world transformations should be supported by the core set of RATT Transformations.
@@ -64,7 +64,7 @@ This function emits an error if `NEW_KEY` already exists in the current Record.
Notice that it is bad practice to use `custom.add()` for adding a new entry that is based on exactly one existing entry. In such cases, use function `custom.copy()` instead.
### Example: Numeric calculations
-Suppose the source data contains a numeric balance and a numeirc rate. We can use function `custom.add()` to calculate the interest and store it in a new key:
+Suppose the source data contains a numeric balance and a numeric rate. We can use function `custom.add()` to calculate the interest and store it in a new key:
```ts
import { Etl, fromJson } from '@triplyetl/etl/generic'
@@ -176,7 +176,7 @@ Notice that the values for the `balance` keys were changed.
### Example: Cast numeric data
-Some source data formats are unable to represent numeric data. A good example are the [CSV](/triply-etl/extract/formats#extractor-fromcsv) and [TSV](/triply-etl/extract/formats#extractor-fromtsv) formats, where every cell value is represented as a string.
+Some source data formats are unable to represent numeric data. Good examples are the [CSV](../../extract/formats#extractor-fromcsv) and [TSV](../../extract/formats#extractor-fromtsv) formats, where every cell value is represented as a string.
If such a source data format that cannot represent numeric data is used, it is often useful to explicitly cast string values to numbers.
@@ -210,12 +210,12 @@ After `custom.change()` has been applied, the record looks as follows:
| Italy | null |
| Netherlands | 17650200 |
-Notice that strings that encode a number are correctly transformed, and non-empty strings that do not encode a number are transformed to `null`. Most of the time, this is the behaviour that you want in a linked data pipeline.
+Notice that strings that encode a number are correctly transformed, and non-empty strings that do not encode a number are transformed to `null`. Most of the time, this is the behavior that you want in a linked data pipeline.
-Also notice that the empty string is cast to the number zero. Most of the time, this is *not* what you want. If you want to prevent this transformation from happening, and you almost certainly do, you must process the source data conditionally, using [control structures](/triply-etl/control).
+Also notice that the empty string is cast to the number zero. Most of the time, this is *not* what you want. If you want to prevent this transformation from happening, and you almost certainly do, you must process the source data conditionally, using [control structures](../../control).
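The pitfall described above follows from JavaScript's own casting rules: `Number('')` evaluates to `0`. A guarded cast, sketched in plain TypeScript (an illustration of the conditional handling, not a TriplyETL function; `castNumber` is a hypothetical name):

```ts
// Casts a string to a number, but maps the empty string and
// non-numeric strings to null instead of relying on Number('') === 0.
function castNumber(value: string): number | null {
  if (value.trim() === '') return null; // avoid the Number('') === 0 surprise
  const n = Number(value);
  return Number.isNaN(n) ? null : n;
}
```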
### Example: Variant type
diff --git a/docs/triply-etl/validate/index.md b/docs/triply-etl/validate/index.md
index 26cc2fc6..0f359d20 100644
--- a/docs/triply-etl/validate/index.md
+++ b/docs/triply-etl/validate/index.md
@@ -31,7 +31,7 @@ Triply believes that *every* ETL should include the Validate step to ensure that
TriplyETL supports two approaches for validating linked data:
-- 5A. [**Graph Comparison**](/triply-etl/validate/graph-comparison) uses one or more manually created 'gold records'. Graph comparison ensures that these records are transformed in the intended way by the ETL pipeline.
-- 5B. [**SHACL Validation**](/triply-etl/validate/shacl) uses a generic data model. SHACL Validation ensures that each individual record is processed in accordance with the generic data model.
+- 5A. [**Graph Comparison**](../../triply-etl/validate/graph-comparison) uses one or more manually created 'gold records'. Graph comparison ensures that these records are transformed in the intended way by the ETL pipeline.
+- 5B. [**SHACL Validation**](../../triply-etl/validate/shacl) uses a generic data model. SHACL Validation ensures that each individual record is processed in accordance with the generic data model.
Notice that it is possible to combine these two approaches in the same ETL pipeline: you can use graph comparison to test for specific conformities, and use SHACL to test for generic conformities.
diff --git a/docs/triply-etl/validate/shacl/index.md b/docs/triply-etl/validate/shacl/index.md
index df18c754..5f0ac940 100644
--- a/docs/triply-etl/validate/shacl/index.md
+++ b/docs/triply-etl/validate/shacl/index.md
@@ -15,7 +15,7 @@ This page documents how SHACL is used to validate linked data in the internal st
SHACL Validation can be used when the following preconditions are met:
1. A data model that uses SHACL constraints.
-2. Some data must be asserted in the internal store. If your internal store is still empty, you can read [the Assert documentation](/triply-etl/assert) on how to add assertions to that store.
+2. Some data must be asserted in the internal store. If your internal store is still empty, you can read [the Assert documentation](../../assert) on how to add assertions to that store.
The function for running SHACL Validation is imported as follows:
@@ -86,7 +86,7 @@ In our example we are using the following source data that records the age of a
}
```
-In our example the data source is [inline JSON](/triply-etl/extract/types#inline-json), but notice that any source format could have been used:
+In our example the data source is [inline JSON](../../extract/formats/#extractor-fromjson), but notice that any source format could have been used:
```ts
fromJson([{ age: 'twelve', id: '1' }]),
@@ -124,7 +124,7 @@ This Information Model specifies that instances of class `foaf:Person` must have
### Step 4: Transformation
-We now have source data (Step 1), and a fair intuition about our target data (Step 2), and an Information Model (Step 3). We can automate the mapping from source to target data with an [Assertion](/triply-etl/assert):
+We now have source data (Step 1), and a fair intuition about our target data (Step 2), and an Information Model (Step 3). We can automate the mapping from source to target data with an [Assertion](../../assert):
```ts
etl.use(
@@ -172,18 +172,21 @@ shp:Person_age
```
Notice the following details:
+
- We enforce a Closed World Semantics (CWA) in our Information Models with the `sh:closed` property. If a property is not explicitly specified in our Information Model, it is not allowed to be used with instance data.
+
- We create IRIs in the dedicated `shp:` namespace for nodes in the Information Model.
+
- Elements in our Information Model are always in a one-to-one correspondence with elements in our Knowledge Model:
- - Node shapes such as `shp:Person` relate to a specific class such as `foaf:Person`.
- - Property shapes such as `shp:Person_age` relate to a specific property such as `foaf:age`.
+ - Node shapes such as `shp:Person` relate to a specific class such as `foaf:Person`.
+ - Property shapes such as `shp:Person_age` relate to a specific property such as `foaf:age`.
### Step 6: Use the `validate()` function
TriplyETL has a dedicated function that can be used to automatically enforce Information Models such as the one expressed in Step 5.
-Since the Information Model is relatively small, it can be specified in-line using the [string source type](/triply-etl/extract/types/#strings). Larger models will probably be stored in a separate file or in a TriplyDB graph or asset.
+Since the Information Model is relatively small, it can be specified in-line using the [string source type](../../extract/types/#strings). Larger models will probably be stored in a separate file or in a TriplyDB graph or asset.
```ts
validate(Source.string(`
@@ -287,7 +290,7 @@ In this case, changing the data in the source system seem the most logical. Aft
#### Option 2: Change the transformation and/or assertions
-Alternatively, it is possible to transform English words that denote numbers to their corresponding numeric values. Since people can get up to one hundred years old, or even older, there are many words that we must consider and transform. This can be done with the [`translateAll()` transformation](/triply-etl/transform/ratt#function-translateall):
+Alternatively, it is possible to transform English words that denote numbers to their corresponding numeric values. Since people can get up to one hundred years old, or even older, there are many words that we must consider and transform. This can be done with the [`translateAll()` transformation](../../transform/ratt#function-translateall):
```ts
translateAll({
diff --git a/docs/triplydb-js/account/index.md b/docs/triplydb-js/account/index.md
index 94640676..1c50c6ed 100644
--- a/docs/triplydb-js/account/index.md
+++ b/docs/triplydb-js/account/index.md
@@ -2,11 +2,11 @@
# Account
-Instances of the `Account` class denote TriplyDB accounts. Accounts can be either organizations ([`Organization`](/triplydb-js/organization#organization)) or users ([`User`](/triplydb-js/user#user)).
+Instances of the `Account` class denote TriplyDB accounts. Accounts can be either organizations ([`Organization`](../organization#organization)) or users ([`User`](../user#user)).
Account objects are obtained by calling the following method:
-- [`App.getAccount(name?: string)`](/triplydb-js/app#appgetaccountname-string)
+- [`App.getAccount(name?: string)`](../app#appgetaccountname-string)
## Account.addDataset(name: string, metadata?: object)
@@ -97,7 +97,7 @@ const dataset = await account.addDataset('iris', {
### See also
-This method returns a dataset object. See the [Dataset](/triplydb-js/dataset#dataset) section for an overview of the methods that can be called on such objects.
+This method returns a dataset object. See the [Dataset](../dataset#dataset) section for an overview of the methods that can be called on such objects.
## Account.addQuery(name: string, metadata: object)
@@ -116,10 +116,10 @@ Adds a new SPARQL query to the account.
The SPARQL query string (e.g., 'select * { ?s ?p ?o }').