Adds TOC to all pages.

mightymax committed Oct 16, 2023
1 parent d74786c commit 016c395
Showing 65 changed files with 148 additions and 23 deletions.
4 changes: 2 additions & 2 deletions docs/generics/JSON-LD-frames.md
Original file line number Diff line number Diff line change
@@ -3,6 +3,8 @@ title: "JSON-LD Framing"
path: "/docs/jsonld-frames"
---

[TOC]

# Why JSON-LD Framing?

SPARQL Construct and SPARQL Describe queries can return results in the JSON-LD format. Here is an example:
@@ -182,5 +184,3 @@ This can also be accessed by the generated API Link above the SPARQL editor.
Copying and pasting the generated link will direct you to a page where you can view the script:

![](json-ld-in-api.png)


2 changes: 2 additions & 0 deletions docs/generics/api-token.md
@@ -3,6 +3,8 @@ title: "API Token"
path: "/docs/api-token"
---

[TOC]

Applications (see [TriplyDB.js](/triplydb-js)) and pipelines (see [TriplyETL](/triply-etl)) often require access rights to interact with TriplyDB instances. Specifically, reading non-public data and writing any (public or non-public) data requires setting an API token. The token ensures that only users that are specifically authorized for certain datasets are able to access and/or modify those datasets.

The following steps must be performed in order to create an API token:
2 changes: 2 additions & 0 deletions docs/generics/sparql-pagination.md
@@ -3,6 +3,8 @@ title: "SPARQL pagination"
path: "/docs/pagination"
---

[TOC]

This page explains how to retrieve all results from a SPARQL query using pagination.

SPARQL queries can often return more than 10.000 results, but due to limitations the result set will only consist of the first 10.000 results. To retrieve more than 10.000 results you can use pagination. TriplyDB supports two methods to retrieve all results from a SPARQL query: pagination with the saved query API, or pagination with TriplyDB.js.
2 changes: 2 additions & 0 deletions docs/hello-world/index.md
@@ -4,6 +4,8 @@ path: "/docs/hello"
draft: true
---

[TOC]

![](joshua-earle-234740.jpg) _Photo by
[Joshua Earle](https://unsplash.com/@joshuaearle) via
[Unsplash](https://unsplash.com/@joshuaearle?photo=-87JyMb9ZfU)_
2 changes: 2 additions & 0 deletions docs/index.md
@@ -1,3 +1,5 @@
[TOC]

<img src="https://triplydb.com/imgs/logos/logo-lg.svg?v=0" style="height: 100px;"/>

# Welcome to the Triply Documentation
2 changes: 2 additions & 0 deletions docs/triply-api/index.md
@@ -3,6 +3,8 @@ title: "Triply API"
path: "/docs/triply-api"
---

[TOC]

Each Triply instance has a fully RESTful API. All functionality, from managing
the Triply instance to working with your data, is done through the API. This
document describes the general setup of the API, contact
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/admin-settings-pages/index.md
@@ -1,3 +1,5 @@
[TOC]

# Admin settings Pages

You can use the console to perform administrator tasks. The administrator tasks are performed within the admin settings page. The admin settings pages are accessible by clicking on the user menu in the top-right corner and selecting the “Admin settings” menu item. You must have administrator privileges to access these pages and perform administrator tasks.
@@ -324,4 +326,4 @@ TriplyDB supports two types of mapping rules:

<dt>Regex</dt>
<dd>Regular Expression rules trigger when a resource matches a Regular Expression.</dd>
</dl>
</dl>
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/data-stories/index.md
@@ -1,3 +1,5 @@
[TOC]

# Data stories

A TriplyDB data story is a way of communicating information about your linked data along with explanatory text while also being able to integrate query results.
@@ -45,4 +47,4 @@ Before you know it, you will have created your first data story. Congratulations
1. You can simply share the URL in TriplyDB.
2. You can embed the Data Story on your own webpage. Scroll all the way to the end of your Data Story and click the “</> Embed” button. This brings up a code snippet that you can copy/paste into your own HTML web page.

![Dialog for embedding a Data Story](data-stories-embed.png)
![Dialog for embedding a Data Story](data-stories-embed.png)
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/exporting-data/index.md
@@ -1,3 +1,5 @@
[TOC]

# Exporting Data

This section explains how a user of TriplyDB can export linked data stored in the triple store.
@@ -28,4 +30,4 @@ The process of extracting the compressed file is the same for exporting graphs a

![Extract the compressed file](extract.png)

Select the file that should be extracted. In this case select "pokemon.trig" and click on "Extract". In the following step choose the location where the extracted file should be stored.
Select the file that should be extracted. In this case select "pokemon.trig" and click on "Extract". In the following step choose the location where the extracted file should be stored.
2 changes: 2 additions & 0 deletions docs/triply-db-getting-started/index.md
@@ -3,6 +3,8 @@ title: "TriplyDB"
path: "/docs/triply-db-getting-started"
---

[TOC]

# Introduction

TriplyDB allows you to store, publish, and use linked data Knowledge
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/publishing-data/index.md
@@ -1,3 +1,5 @@
[TOC]

# Publishing data

With TriplyDB you can easily make your data available to the outside world.
@@ -111,4 +113,4 @@ https://gitlab.example.com/api/v4/projects/<project_id>/trigger/pipeline?token=<
```
When your webhook is created and active, you can see every occasion the webhook was called in the webhook trigger history.

![Active webhook trigger history](webhook_trigger_history.png)
![Active webhook trigger history](webhook_trigger_history.png)
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/reference/index.md
@@ -1,3 +1,5 @@
[TOC]

# Reference

## Access levels
@@ -138,4 +140,4 @@ Haskell (`haskell`), Java (`java`), JavaScript (`javascript`), LaTeX
(`latex`), Makefile (`makefile`), Markdown (`markdown`), Objective C
(`objectivec`), Pascal (`pascal`), Perl (`perl`), Powershell
(`powershell`), Prolog (`prolog`), Regular Expression (`regex`), Ruby
(`ruby`), Scala (`scala`), SQL (`sql`), Yaml (`yaml`).
(`ruby`), Scala (`scala`), SQL (`sql`), Yaml (`yaml`).
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/saved-queries/index.md
@@ -1,3 +1,5 @@
[TOC]

# Saved Queries

A Saved Query is a versioned SPARQL query with its own URL. Using this URL,
@@ -211,4 +213,4 @@ Users can specify additional metadata inside the query string, by using the GRLC
#+ frequency: hourly
```

See the [Triply API documentation](/triply-api#queries) for how to retrieve query metadata, including how to retrieve GRLC annotations.
See the [Triply API documentation](/triply-api#queries) for how to retrieve query metadata, including how to retrieve GRLC annotations.
4 changes: 3 additions & 1 deletion docs/triply-db-getting-started/uploading-data/index.md
@@ -1,3 +1,5 @@
[TOC]

# Uploading Data

This section explains how to create a linked dataset in TriplyDB.
@@ -124,4 +126,4 @@ standards-compliant tools start using the data.

Not all data can be stored as RDF data. For example images and video
files use a binary format. Such files can also be stored in TriplyDB
and can be integrated into the Knowledge Graph.
and can be integrated into the Knowledge Graph.
2 changes: 2 additions & 0 deletions docs/triply-db-getting-started/viewing-data/index.md
@@ -1,3 +1,5 @@
[TOC]

# Viewing Data

TriplyDB offers several ways to explore your datasets.
2 changes: 2 additions & 0 deletions docs/triply-etl/assert/index.md
@@ -1,3 +1,5 @@
[TOC]

# Assert

The **Assert** step uses data from the Record to add linked data to the Internal Store.
2 changes: 2 additions & 0 deletions docs/triply-etl/assert/json-ld/index.md
@@ -1,3 +1,5 @@
[TOC]

# JSON-LD

The JSON-LD standard includes the following algorithms that allow linked data to be added to the internal store:
2 changes: 2 additions & 0 deletions docs/triply-etl/assert/ratt/statement.md
@@ -1,3 +1,5 @@
[TOC]

# RATT Statement Assertions

[TOC]
2 changes: 2 additions & 0 deletions docs/triply-etl/assert/ratt/term.md
@@ -1,3 +1,5 @@
[TOC]

# RATT Term Assertions

The following term assertions are available:
2 changes: 2 additions & 0 deletions docs/triply-etl/changelog/index.md
@@ -1,3 +1,5 @@
[TOC]

# Changelog

The current version of TriplyETL is **3.0.0**
2 changes: 2 additions & 0 deletions docs/triply-etl/cli/index.md
@@ -1,3 +1,5 @@
[TOC]

# Command Line Interface (CLI)

TriplyETL allows you to manually perform various tasks in a terminal application (a Command-Line Interface or CLI).
2 changes: 2 additions & 0 deletions docs/triply-etl/control/index.md
@@ -1,3 +1,5 @@
[TOC]

# Control Structures

This page documents how you can use control structures in your ETL configuration.
2 changes: 2 additions & 0 deletions docs/triply-etl/debug/index.md
@@ -1,3 +1,5 @@
[TOC]

# Debug

TriplyETL includes functions that can be used during debugging. These debug functions allow you to inspect in detail how data flows through your pipeline. This helps you find problems more quickly and determine whether data is handled correctly by your TriplyETL configuration.
2 changes: 2 additions & 0 deletions docs/triply-etl/declare/index.md
@@ -1,3 +1,5 @@
[TOC]

# Declare

Declarations introduce constants that you can use in the rest of your ETL configuration.
2 changes: 2 additions & 0 deletions docs/triply-etl/enrich/index.md
@@ -1,3 +1,5 @@
[TOC]

# Enrich

The **Enrich** step uses linked data that is asserted in the Internal Store to derive new linked data.
2 changes: 2 additions & 0 deletions docs/triply-etl/enrich/shacl/index.md
@@ -1,3 +1,5 @@
[TOC]

# SHACL Rules

SHACL Rules allow new data to be asserted based on existing data patterns. This makes them a great approach for data enrichment. Since SHACL Rules can be defined as part of the data model, it is one of the best approaches for creating and maintaining business rules in complex domains.
2 changes: 2 additions & 0 deletions docs/triply-etl/enrich/sparql/index.md
@@ -1,3 +1,5 @@
[TOC]

# SPARQL Update

SPARQL Update is a powerful feature that allows you to modify and enrich linked data in the internal store. With SPARQL Update, you can generate new linked data based on existing linked data, thereby enhancing the content of the store.
2 changes: 2 additions & 0 deletions docs/triply-etl/extract/formats/index.md
@@ -1,3 +1,5 @@
[TOC]

# Data Formats

## Overview
2 changes: 2 additions & 0 deletions docs/triply-etl/extract/index.md
@@ -1,3 +1,5 @@
[TOC]

# Extract

The **Extract** step is the first step in any TriplyETL pipeline. It extracts data in different formats and from different source types. Examples of data formats are 'Microsoft Excel' and 'JSON'. Examples of source types are 'file' or 'URL'. Source data are represented in a uniform Record.
2 changes: 2 additions & 0 deletions docs/triply-etl/extract/record/index.md
@@ -1,3 +1,5 @@
[TOC]

# Record

When a TriplyETL is connected to one or more data sources, a stream of **Records** will be generated. Records use a generic representation that is independent of the format used in the data sources.
2 changes: 2 additions & 0 deletions docs/triply-etl/extract/types/index.md
@@ -1,3 +1,5 @@
[TOC]

# Source types
## Overview

2 changes: 2 additions & 0 deletions docs/triply-etl/getting-started/index.md
@@ -1,3 +1,5 @@
[TOC]

# Getting Started

This page helps you to get started with TriplyETL. You can get started with TriplyETL in any of the following ways:
2 changes: 2 additions & 0 deletions docs/triply-etl/index.md
@@ -1,3 +1,5 @@
[TOC]

# TriplyETL Overview
TriplyETL allows you to create and maintain production-grade linked data pipelines.

2 changes: 2 additions & 0 deletions docs/triply-etl/maintenance/index.md
@@ -1,3 +1,5 @@
[TOC]

# Maintenance

Once a TriplyETL repository is configured, it goes into maintenance mode. TriplyETL contains specific functionality to support maintenance.
2 changes: 2 additions & 0 deletions docs/triply-etl/publish/index.md
@@ -1,3 +1,5 @@
[TOC]

# TriplyETL: Publish

The **Publish** step makes the linked data that is produced by the TriplyETL pipeline available in a Triple Store for use by others.
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/automation.md
@@ -3,6 +3,8 @@ title: "TriplyETL: Automation"
path: "/docs/triply-etl/automation"
---

[TOC]

TriplyETL runs within a Gitlab CI environment ([Figure 1](#figure-1)).

<figure id="figure-1">
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/ci-cd.md
@@ -1,3 +1,5 @@
[TOC]

This document explains how to maintain an ETL that runs in the GitLab CI.

# How to create a TriplyETL CI pipeline?
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/context.md
@@ -1,3 +1,5 @@
[TOC]

## Configuring the Context <!-- {#context} -->

The TriplyETL Context is specified when the `Etl` object is instantiated. This often appears towards the start of a pipeline script. The TriplyETL Context allows the following things to be specified:
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/copy.md
@@ -1,3 +1,5 @@
[TOC]

# Copy an existing entry over to a new entry <!-- {#copy} -->

Copying is the act of creating a new thing that is based on a specific existing thing.
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/faq.md
@@ -1,3 +1,5 @@
[TOC]

# FAQ

### Why does my pipeline schedule only run an `install` job?
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/getting-started.md
@@ -1,3 +1,5 @@
[TOC]

## Transforming RDF data

If you have RDF data that does not need to be transformed, see [copying source data](https://triply.cc/docs/ratt-working-with-ratt#function-direct-copying-of-source-data-to-destination).
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/main-loop.md
@@ -1,3 +1,5 @@
[TOC]

# The main loop

The following code snippet shows the main TriplyETL loop. Every TriplyETL pipeline consists of such a loop.
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/source-destination.md
@@ -1,3 +1,5 @@
[TOC]

##### An easier way to configure graph names and prefixes

Instead of setting the graph name and the prefixes for every ETL, you can use functions for their generation:
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/static-dynamic-statements.md
@@ -1,3 +1,5 @@
[TOC]

# Create dynamic statements

Dynamic statements are statements that are based on some aspect of the source data.
2 changes: 2 additions & 0 deletions docs/triply-etl/tmp/tmp.md
@@ -1,3 +1,5 @@
[TOC]

## Create statements {#create-statements}

After source data is connected and transformed, the RATT Record is ready to be transformed to linked data. Linked data statements are assertions or factual statements that consist of 3 terms (triple) or 4 terms (quadruples).
2 changes: 2 additions & 0 deletions docs/triply-etl/transform/index.md
@@ -1,3 +1,5 @@
[TOC]

# Transform

The transform step makes changes to the Record:
5 changes: 2 additions & 3 deletions docs/triply-etl/transform/ratt/index.md
@@ -1,3 +1,5 @@
[TOC]

# Transform: RATT

RATT transformations are a core set of functions that are commonly used to change the content of TriplyETL Records.
@@ -1518,6 +1520,3 @@ This results in the following linked data assertion:
```ttl
city:Amsterdam geonames:countryCode "NL"
```
2 changes: 2 additions & 0 deletions docs/triply-etl/transform/typescript/index.md
@@ -3,6 +3,8 @@ title: "2. Transform: TypeScript"
path: "/docs/triply-etl/transform/typescript"
---

[TOC]

The vast majority of ETLs can be written with the core set of [RATT Transformations](/triply-etl/transform/ratt). But sometimes a custom transformation is necessary that cannot be handled by this core set. For such circumstances, TriplyETL allows a custom TypeScript function to be written.

Notice that the use of a custom TypeScript function should be somewhat uncommon. The vast majority of real-world transformations should be supported by the core set of RATT Transformations.
2 changes: 2 additions & 0 deletions docs/triply-etl/validate/graph-comparison/index.md
@@ -3,6 +3,8 @@ title: "5. Validate: Graph Comparison"
path: "/docs/triply-etl/validate/graph-comparison"
---

[TOC]

Graph comparison is an approach for validating that the data produced by your TriplyETL pipeline is correct.

For a limited set of key records, the linked data that is generated by the TriplyETL pipeline is compared to graphs that were created by hand. This comparison must follow certain rules that are laid out in the RDF standards.
2 changes: 2 additions & 0 deletions docs/triply-etl/validate/index.md
@@ -3,6 +3,8 @@ title: "5. TriplyETL: Validate"
path: "/docs/triply-etl/validate"
---

[TOC]

The **Validate** step ensures that the linked data a pipeline produces conforms to the requirements specified in the data model.

```mermaid