Merge remote-tracking branch 'upstream/master' into nls/missing-params

smith committed Sep 14, 2020
2 parents 7786faf + 0c3a8c5 commit 839eb95
Showing 146 changed files with 3,622 additions and 1,483 deletions.
211 changes: 211 additions & 0 deletions packages/kbn-telemetry-tools/GUIDELINE.md
@@ -0,0 +1,211 @@
# Collector Schema Guideline

Table of contents:
- [Collector Schema Guideline](#collector-schema-guideline)
- [Adding schema to your collector](#adding-schema-to-your-collector)
- [1. Update the telemetryrc file](#1-update-the-telemetryrc-file)
- [2. Type the `fetch` function](#2-type-the-fetch-function)
- [3. Add a `schema` field](#3-add-a-schema-field)
- [4. Run the telemetry check](#4-run-the-telemetry-check)
- [5. Update the stored json files](#5-update-the-stored-json-files)
- [Updating the collector schema](#updating-the-collector-schema)
- [Writing the schema](#writing-the-schema)
- [Basics](#basics)
- [Allowed types](#allowed-types)
- [Dealing with arrays](#dealing-with-arrays)
- [Schema Restrictions](#schema-restrictions)
- [Root of schema can only be an object](#root-of-schema-can-only-be-an-object)


## Adding schema to your collector

To add a `schema` to your collector, follow these steps until the telemetry check passes.
To find out which step is needed next, simply run the telemetry check with the path of your collector:

```
node scripts/telemetry_check.js --path=<relative_path_to_collector>.ts
```

### 1. Update the telemetryrc file

Make sure your collector is not excluded in the `telemetryrc.json` files (one located at the root of the Kibana project, and another one in the `x-pack` dir).

```
[
  {
    ...
    "exclude": [
      "<path_to_my_collector>"
    ]
  }
]
```

Note that the check will fail if the collector specified in `--path` is excluded.

### 2. Type the `fetch` function
1. Make sure the return value of the `fetch` function is typed.

The function `makeUsageCollector` accepts a generic type parameter for the return type of the `fetch` function.

```
interface Usage {
  someStat: number;
}

usageCollection.makeUsageCollector<Usage>({
  fetch: async () => {
    return {
      someStat: 3,
    };
  },
  ...
})
```

The generic type passed to `makeUsageCollector` will automatically unwrap the `Promise` to check for the resolved type.
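As a hedged illustration of that unwrapping (using a stand-in `makeCollector` helper rather than the real `makeUsageCollector`, which lives in the usage collection plugin):

```typescript
// Illustrative stand-in for makeUsageCollector: the generic T describes
// the *resolved* value of fetch, even when fetch returns a Promise.
interface Usage {
  someStat: number;
}

function makeCollector<T>(opts: { fetch: () => T | Promise<T> }) {
  return opts;
}

const collector = makeCollector<Usage>({
  // Returns Promise<Usage>; the Promise is unwrapped for type checking.
  fetch: async () => ({ someStat: 3 }),
});
```

Returning `{ someStat: 'a string' }` from `fetch` above would fail to compile, because the resolved value no longer matches `Usage`.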

### 3. Add a `schema` field

Add a `schema` field to your collector. After you pass the return type of the `fetch` function to the `makeUsageCollector` generic parameter, it will automatically figure out the correct type of the schema based on that provided type.


```
interface Usage {
  someStat: number;
}

usageCollection.makeUsageCollector<Usage>({
  schema: {
    someStat: {
      type: 'long',
    },
  },
  ...
})
```

For full details on writing the `schema` object, check the [Writing the schema](#writing-the-schema) section.

### 4. Run the telemetry check

To make sure your changes pass the telemetry check you can run the following:

```
node scripts/telemetry_check.js --ignore-stored-json --path=<relative_path_to_collector>.ts
```

### 5. Update the stored json files

The `--fix` flag will automatically update the persisted JSON files used by the telemetry team.

```
node scripts/telemetry_check.js --fix
```

Note that any updates to the stored JSON files will require a review by the kibana-telemetry team to help us update the telemetry cluster mappings and ensure your changes adhere to our best practices.


## Updating the collector schema

Simply update the `fetch` function to start returning the updated fields back to our cluster. Then update the `schema` to accommodate these changes.

Once you've made the changes to both the `fetch` function and the `schema` field, run the following command:

```
node scripts/telemetry_check.js --fix
```

The `--fix` flag will automatically update the persisted JSON files used by the telemetry team. Note that any updates to the stored JSON files will require a review by the kibana-telemetry team to help us update the telemetry cluster mappings and ensure your changes adhere to our best practices.


## Writing the schema

We've designed the schema object to closely resemble the Elasticsearch mapping object to reduce cognitive complexity.

### Basics

The function `makeUsageCollector` will automatically translate the returned `Usage` fetch type to the `schema` object. This way you'll have the TypeScript type checker helping you write the correct corresponding schema.

```
interface Usage {
  someStat: number;
}

usageCollection.makeUsageCollector<Usage>({
  schema: {
    someStat: {
      type: 'long',
    },
  },
  ...
})
```


### Allowed types

Any field property in the schema accepts a `type` field. By default the type is `object`, which accepts nested properties under it. Currently we accept the following property types:

```
type AllowedSchemaTypes =
  | 'keyword'
  | 'text'
  | 'number'
  | 'boolean'
  | 'long'
  | 'date'
  | 'float';
```
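To sketch the default `object` behavior described above (the field names here are invented for illustration): a field with no explicit `type` is treated as an object and may nest further typed properties.

```typescript
// Illustrative schema shape: `counts` has no `type`, so it acts as an
// object whose nested fields carry their own types.
const schema = {
  counts: {
    total: { type: 'long' },
    byKind: {
      dashboards: { type: 'long' },
      visualizations: { type: 'long' },
    },
  },
} as const;
```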


### Dealing with arrays

You can optionally define a property to be an array by setting `isArray` to `true`. Note that the `isArray` property is not currently required.


```
interface Usage {
  arrayOfStrings: string[];
  arrayOfObjects: { key: string; value: number }[];
}

usageCollection.makeUsageCollector<Usage>({
  fetch: () => {
    return {
      arrayOfStrings: ['item_one', 'item_two'],
      arrayOfObjects: [
        { key: 'key_one', value: 13 },
      ],
    };
  },
  schema: {
    arrayOfStrings: {
      type: 'keyword',
      isArray: true,
    },
    arrayOfObjects: {
      isArray: true,
      key: {
        type: 'keyword',
      },
      value: {
        type: 'long',
      },
    },
  },
  ...
})
```

Be careful when adding arrays of objects, due to the limitation in correlating the properties inside those objects inside Kibana. It is advised to look for an alternative schema based on your use case.
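One possible alternative (illustrative only, not an official recommendation): when the set of keys is small and fixed, flatten the array of `{ key, value }` objects into named properties, so each value maps to its own queryable field.

```typescript
// Instead of arrayOfObjects: [{ key: 'key_one', value: 13 }, ...],
// report a fixed set of named counters (names invented for illustration).
interface Usage {
  counts: {
    key_one: number;
    key_two: number;
  };
}

const fetch = (): Usage => ({
  counts: { key_one: 13, key_two: 7 },
});

const schema = {
  counts: {
    key_one: { type: 'long' },
    key_two: { type: 'long' },
  },
} as const;
```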


## Schema Restrictions

We have enforced some restrictions on the schema object to adhere to our telemetry best practices. These practices are derived from the usability of the sent data in our telemetry cluster.


### Root of schema can only be an object

The root of the schema can only be an object. Currently any property must be nested inside the main schema object.
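A quick illustrative sketch of this restriction (field names invented):

```typescript
// Allowed: the root is an object whose properties describe the fields.
const validSchema = {
  someStat: { type: 'long' },
} as const;

// Not allowed: a typed value directly at the root, e.g.
// const invalidSchema = { type: 'long' };
```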
27 changes: 26 additions & 1 deletion packages/kbn-telemetry-tools/src/cli/run_telemetry_check.ts
@@ -35,7 +35,7 @@ import {

export function runTelemetryCheck() {
run(
async ({ flags: { fix = false, path }, log }) => {
async ({ flags: { fix = false, 'ignore-stored-json': ignoreStoredJson, path }, log }) => {
if (typeof fix !== 'boolean') {
throw createFailError(`${chalk.white.bgRed(' TELEMETRY ERROR ')} --fix can't have a value`);
}
@@ -50,6 +50,14 @@ export function runTelemetryCheck() {
);
}

if (fix && typeof ignoreStoredJson !== 'undefined') {
throw createFailError(
`${chalk.white.bgRed(
' TELEMETRY ERROR '
)} --fix is incompatible with --ignore-stored-json flag.`
);
}

const list = new Listr([
{
title: 'Checking .telemetryrc.json files',
@@ -59,11 +67,28 @@
title: 'Extracting Collectors',
task: (context) => new Listr(extractCollectorsTask(context, path), { exitOnError: true }),
},
{
enabled: () => typeof path !== 'undefined',
title: 'Checking collectors in --path are not excluded',
task: ({ roots }: TaskContext) => {
const totalCollections = roots.reduce((acc, root) => {
return acc + (root.parsedCollections?.length || 0);
}, 0);
const collectorsInPath = Array.isArray(path) ? path.length : 1;

if (totalCollections !== collectorsInPath) {
throw new Error(
'Collector specified in `path` is excluded; Check the telemetryrc.json files.'
);
}
},
},
{
title: 'Checking Compatible collector.schema with collector.fetch type',
task: (context) => new Listr(checkCompatibleTypesTask(context), { exitOnError: true }),
},
{
enabled: (_) => !!ignoreStoredJson,
title: 'Checking Matching collector.schema against stored json files',
task: (context) =>
new Listr(checkMatchingSchemasTask(context, !fix), { exitOnError: true }),
@@ -41,7 +41,7 @@ export const boolean: ExpressionTypeDefinition<'boolean', boolean> = {
},
datatable: (value): Datatable => ({
type: 'datatable',
meta: {},
columns: [{ id: 'value', name: 'value', meta: { type: name } }],
rows: [{ value }],
}),
@@ -52,7 +52,10 @@ export type DatatableRow = Record<string, any>;
export interface DatatableColumnMeta {
type: DatatableColumnType;
field?: string;
index?: string;
params?: SerializableState;
source?: string;
sourceParams?: SerializableState;
}
/**
* This type represents the shape of a column in a `Datatable`.
@@ -63,17 +66,11 @@ export interface DatatableColumn {
meta: DatatableColumnMeta;
}

export interface DatatableMeta {
type?: string;
source?: string;
}

/**
* A `Datatable` in Canvas is a unique structure that represents tabulated data.
*/
export interface Datatable {
type: typeof name;
meta?: DatatableMeta;
columns: DatatableColumn[];
rows: DatatableRow[];
}
@@ -73,7 +73,6 @@ export const num: ExpressionTypeDefinition<'num', ExpressionValueNum> = {
},
datatable: ({ value }): Datatable => ({
type: 'datatable',
meta: {},
columns: [{ id: 'value', name: 'value', meta: { type: 'number' } }],
rows: [{ value }],
}),
@@ -55,7 +55,6 @@ export const number: ExpressionTypeDefinition<typeof name, number> = {
},
datatable: (value): Datatable => ({
type: 'datatable',
meta: {},
columns: [{ id: 'value', name: 'value', meta: { type: 'number' } }],
rows: [{ value }],
}),
@@ -40,7 +40,6 @@ export const string: ExpressionTypeDefinition<typeof name, string> = {
},
datatable: (value): Datatable => ({
type: 'datatable',
meta: {},
columns: [{ id: 'value', name: 'value', meta: { type: 'string' } }],
rows: [{ value }],
}),
@@ -0,0 +1,39 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/

import React, { FunctionComponent } from 'react';

import { i18n } from '@kbn/i18n';

import { documentationService } from '../../../../../services/documentation';
import { getFieldConfig } from '../../../lib';
import { UseField, Field } from '../../../shared_imports';
import { EditFieldFormRow } from '../fields/edit_field';

interface Props {
defaultToggleValue: boolean;
}

export const IgnoreAboveParameter: FunctionComponent<Props> = ({ defaultToggleValue }) => (
<EditFieldFormRow
title={i18n.translate('xpack.idxMgmt.mappingsEditor.ignoreAboveFieldTitle', {
defaultMessage: 'Set length limit',
})}
description={i18n.translate('xpack.idxMgmt.mappingsEditor.ignoreAboveFieldDescription', {
defaultMessage:
'Strings longer than this value will not be indexed. This is useful for protecting against Lucene’s term character-length limit of 8,191 UTF-8 characters.',
})}
docLink={{
text: i18n.translate('xpack.idxMgmt.mappingsEditor.ignoreAboveDocLinkText', {
defaultMessage: 'Ignore above documentation',
}),
href: documentationService.getIgnoreAboveLink(),
}}
defaultToggleValue={defaultToggleValue}
>
<UseField path="ignore_above" config={getFieldConfig('ignore_above')} component={Field} />
</EditFieldFormRow>
);
@@ -69,6 +69,8 @@ export * from './other_type_name_parameter';

export * from './other_type_json_parameter';

export * from './ignore_above_parameter';

export const PARAMETER_SERIALIZERS = [relationsSerializer, dynamicSerializer];

export const PARAMETER_DESERIALIZERS = [relationsDeserializer, dynamicDeserializer];
