- **High-Noise Environments**: For calls from locations with significant background noise, like factories or busy streets, these rules can be adjusted to better distinguish between speech and ambient sounds, improving the overall conversation quality.
- **Elderly or Speech-Impaired Users**: The endpointing rules can be customized to accommodate slower speech patterns or frequent pauses, ensuring the assistant doesn't interrupt prematurely.


The rules and plans described below are:

- `AssistantCustomEndpointingRule`
- `CustomerCustomEndpointingRule`
- `BothCustomEndpointingRule`
- `TranscriptionEndpointingPlan`

#### AssistantCustomEndpointingRule

This rule adjusts the endpointing timeout based on the content of the assistant's last message before the customer starts speaking. It's part of the `startSpeakingPlan` configuration.

The `AssistantCustomEndpointingRule` is a JSON object that defines a rule for setting an endpointing timeout based on the last assistant message before the customer starts speaking. Here's a breakdown of its properties:
- **type**: A string that must be "assistant". It indicates that the rule is based on the last assistant message.
- **regex**: A string representing a regular expression pattern to match against the assistant's message.
- **regexOptions**: An array of options for the regex match. Defaults to an empty array.
- **timeoutSeconds**: A number representing the endpointing timeout in seconds if the rule is matched. It must be between 0 and 15 seconds.

##### Usage Flow
1. The assistant speaks.
2. The customer starts speaking.
3. The customer's transcription is received.
4. The rule is evaluated on the last assistant message.
5. If the message matches the regex, the endpointing timeout is set to `timeoutSeconds`.

##### Example Use Cases
- For yes/no questions like "Are you interested in a loan?", you can set a shorter timeout.
- For questions where the customer may need to pause, like "What's my account number?", you can set a longer timeout.
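Below is a minimal sketch of these two cases as plain configuration objects. The field names follow the property list above; the `customEndpointingRules` array name and its placement inside `startSpeakingPlan` are assumptions for illustration, and the regex patterns and timeout values are examples only.

```typescript
// Sketch: two assistant-based rules, assuming they are supplied via a
// `customEndpointingRules` array inside the startSpeakingPlan configuration.
const startSpeakingPlan = {
  customEndpointingRules: [
    {
      // Shorter timeout after a yes/no question from the assistant.
      type: "assistant",
      regex: "Are you interested in a loan\\?",
      regexOptions: [],    // no special regex options (default)
      timeoutSeconds: 0.5, // a bare "yes"/"no" needs little waiting
    },
    {
      // Longer timeout after a question whose answer may include pauses.
      type: "assistant",
      regex: "account number",
      regexOptions: [],
      timeoutSeconds: 3,   // give the customer time to recall the digits
    },
  ],
};
```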

#### CustomerCustomEndpointingRule

This rule defines custom conditions for determining when the customer has finished speaking. It helps the assistant accurately detect the end of a customer's utterance.

The `CustomerCustomEndpointingRule` is a JSON object that defines a rule for setting an endpointing timeout based on the current customer message as they are speaking. Here's a breakdown of its properties:

- **type**: A string that must be "customer". It indicates that the rule is based on the current customer message.
- **regex**: A string representing a regular expression pattern to match against the customer's message.
- **regexOptions**: An array of options for the regex match. Defaults to an empty array.
- **timeoutSeconds**: A number representing the endpointing timeout in seconds if the rule is matched. It must be between 0 and 15 seconds.

##### Usage Flow
1. The assistant speaks.
2. The customer starts speaking.
3. The customer's transcription is received.
4. The rule is evaluated on the current customer transcription.
5. If the message matches the `regex`, the endpointing timeout is set to `timeoutSeconds`.

##### Example Use Case
- If you want to wait longer while the customer is speaking numbers, you can set a longer timeout.

This rule allows for dynamic adjustment of the endpointing timeout based on the content of the customer's message, providing flexibility in handling different types of customer responses.
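As a sketch, a customer-based rule that holds off while the caller is speaking digits might look like the following (the pattern and timeout are illustrative, and the rule would sit in the same hypothetical `customEndpointingRules` array shown above):

```typescript
// Sketch: wait longer while the current customer transcript contains digits,
// e.g. while an account number is being read out.
const customerNumberRule = {
  type: "customer",
  regex: "\\d{2,}",    // two or more consecutive digits in the live transcript
  regexOptions: [],
  timeoutSeconds: 4,   // hold off so the number is not cut short mid-sequence
};
```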

#### BothCustomEndpointingRule

This rule combines both assistant and customer speech patterns to determine the optimal moment for the assistant to begin speaking. It aims to create a more natural conversational flow.

The `BothCustomEndpointingRule` is a JSON object that defines a rule for setting an endpointing timeout based on both the last assistant message and the current customer message as they are speaking. Here's a breakdown of its properties:

- **type**: A string that must be "both". It indicates that the rule is based on both the last assistant message and the current customer message.
- **assistantRegex**: A string representing a regular expression pattern to match against the assistant's message.
- **assistantRegexOptions**: An array of options for the assistant's message regex match. Defaults to an empty array.
- **customerRegex**: A string representing a regular expression pattern to match against the customer's message.
- **customerRegexOptions**: An array of options for the customer's message regex match. Defaults to an empty array.
- **timeoutSeconds**: A number representing the endpointing timeout in seconds if the rule is matched. It must be between 0 and 15 seconds.

##### Usage Flow
1. The assistant speaks.
2. The customer starts speaking.
3. The customer's transcription is received.
4. The rule is evaluated on both the last assistant message and the current customer transcription.
5. If the assistant message matches `assistantRegex` AND the customer message matches `customerRegex`, the endpointing timeout is set to `timeoutSeconds`.

##### Example Use Case
- If you want to wait longer while the customer is speaking numbers, you can set a longer timeout.
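A sketch of such a rule, with illustrative patterns and timeout values, might look like this:

```typescript
// Sketch: wait longer only when the assistant just asked for a number AND the
// customer's current transcript already contains digits.
const bothRule = {
  type: "both",
  assistantRegex: "account number",  // last assistant message asked for a number
  assistantRegexOptions: [],
  customerRegex: "\\d+",             // customer is currently speaking digits
  customerRegexOptions: [],
  timeoutSeconds: 5,
};
```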

#### TranscriptionEndpointingPlan

This plan provides detailed control over how the transcription affects the assistant's speaking behavior. It includes parameters such as:
- **onPunctuationSeconds**: Wait time after detecting punctuation in the transcription.
- **onNoPunctuationSeconds**: Wait time when no punctuation is detected.
- **onNumberSeconds**: Wait time after detecting a number in the transcription.
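As an illustrative sketch (the values below are examples only, and how this plan nests alongside the endpointing rules is an assumption):

```typescript
// Sketch: wait times, in seconds, keyed off what the live transcription contains.
const transcriptionEndpointingPlan = {
  onPunctuationSeconds: 0.1,   // punctuation usually marks a finished thought
  onNoPunctuationSeconds: 1.5, // no punctuation: wait longer before responding
  onNumberSeconds: 0.5,        // numbers are often spoken with pauses between digits
};
```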


### Best Practices