diff --git a/README.md b/README.md
index 6d35575..ae2efd2 100644
--- a/README.md
+++ b/README.md
@@ -7,3 +7,8 @@
### Development
`mintlify dev`
+
+
+### Errors
+
+If you encounter an issue, try changing the "openapi" value in mint.json from "https://api.vapi.ai/api-json" to "./api.json".
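+
+For example, after saving https://api.vapi.ai/api-json locally as `api.json` in the repo root, the relevant key in `mint.json` would look like this (excerpt only; the rest of the file stays unchanged):
+
+```json
+{
+  "openapi": "./api.json"
+}
+```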
\ No newline at end of file
diff --git a/blocks.mdx b/blocks.mdx
new file mode 100644
index 0000000..706a2ae
--- /dev/null
+++ b/blocks.mdx
@@ -0,0 +1,58 @@
+---
+title: "Introduction"
+sidebarTitle: "Introduction"
+description: "Breaking down bot conversations into smaller, more manageable prompts"
+---
+
+
+We're currently running a beta for **Blocks**, an upcoming feature from [Vapi.ai](http://vapi.ai/) aimed at improving bot conversations. The problem we've noticed is that single LLM prompts are prone to hallucinations and unreliable tool calls, and they can’t handle complex, many-step instructions.
+
+**By breaking the conversation into smaller, more manageable prompts**, we can guarantee the bot will do this, then that, or, if this happens, do that instead. It’s like having a checklist for conversations: less room for error, more room for getting things right.
+
+
+Here’s an example: for a food ordering assistant, the prompt might look like this.
+
+**Example Prompt**
+
+```text
+[Identity]
+You are a friendly and efficient assistant for a food truck that serves burgers, fries, and drinks.
+
+[Task]
+1. Greet the customer warmly and inquire about their main order.
+2. Offer suggestions for the main order if needed.
+3. If they choose a burger, suggest upgrading to a combo with fries and a drink, offering clear options (e.g., regular or special fries, different drink choices).
+4. Confirm the entire order to ensure accuracy.
+5. Suggest any additional items like desserts or sauces.
+6. Thank the customer and let them know when their order will be ready.
+```
+
+<Frame>
+  <img src="/static/images/blocks/food-order-steps.png" />
+</Frame>
+
+There are three core types of Blocks: [Conversation](https://api.vapi.ai/api#:~:text=ConversationBlock), [Tool-call](https://api.vapi.ai/api#:~:text=ToolCallBlock), and [Workflow](https://api.vapi.ai/api#:~:text=WorkflowBlock). Each type serves a different role in shaping how your assistant engages with users.
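+
+To make that concrete, here is a rough, illustrative sketch of a Conversation block. The field names mirror the example on the Steps page; the values are made up for illustration, so treat the [API reference](https://api.vapi.ai/api#/Blocks/BlockController_create) as the source of truth.
+
+```json
+{
+  "type": "conversation",
+  "name": "ask_for_order",
+  "instruction": "Greet the customer and ask what they'd like to order.",
+  "outputSchema": {
+    "type": "object",
+    "required": ["orders"],
+    "properties": {
+      "orders": {
+        "type": "string",
+        "description": "The customer's order, e.g., 'burger with fries'"
+      }
+    }
+  }
+}
+```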
+
+
+<Note>
+  Blocks is currently in beta. We're excited to have you try this new feature and welcome your [feedback](https://discord.com/invite/pUFNcf2WmH) as we continue to refine and improve the experience.
+</Note>
+
+## Advanced Concepts
+
+<CardGroup cols={2}>
+  <Card title="Steps" href="/blocks/steps">
+    Learn how to structure the flow of your conversation
+  </Card>
+  <Card title="Block Types" href="/blocks/block-types">
+    Explore the different block types and how to use them
+  </Card>
+</CardGroup>
\ No newline at end of file
diff --git a/blocks/block-types.mdx b/blocks/block-types.mdx
new file mode 100644
index 0000000..a0d5a87
--- /dev/null
+++ b/blocks/block-types.mdx
@@ -0,0 +1,17 @@
+---
+title: "Block Types"
+sidebarTitle: "Block Types"
+description: "Building the Logic and Actions for Each Step in Your Conversation "
+---
+
+[**Blocks**](https://api.vapi.ai/api#/Blocks/BlockController_create) are the functional units within a Step, defining what action happens at each stage of a conversation. Each Step can contain only one Block, and there are three main types of Blocks, each designed to handle different aspects of conversation flow.
+
+<Note>
+  Blocks is currently in beta. We're excited to have you try this new feature and welcome your [feedback](https://discord.com/invite/pUFNcf2WmH) as we continue to refine and improve the experience.
+</Note>
+
+#### Types
+
+- [**Conversation:**](https://api.vapi.ai/api#:~:text=ConversationBlock) This block type manages interactions between the assistant and the user. A conversation block is used when the assistant needs to ask the user for specific information, such as contact details or preferences.
+- [**Tool-call:**](https://api.vapi.ai/api#:~:text=ToolCallBlock) This block allows the assistant to make external tool calls.
+- [**Workflow:**](https://api.vapi.ai/api#:~:text=WorkflowBlock) This block type enables the creation of subflows, which are smaller sets of steps executed within a Block. It can contain an array of steps (`steps[]`) and uses an `inputSchema` to define the data needed to initiate the workflow, along with an `outputSchema` to handle the data returned after completing the subflow. Workflow blocks are ideal for organizing complex processes or reusing workflows across different parts of the conversation.
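+
+#### Example
+
+Here is a rough sketch of a Workflow block that wraps a single Conversation step. It follows the `steps[]`, `inputSchema`, and `outputSchema` shape described above; the names and any fields beyond those are illustrative assumptions, so consult the [API reference](https://api.vapi.ai/api#/Blocks/BlockController_create) for the authoritative schema.
+
+```json
+{
+  "type": "workflow",
+  "name": "collect_delivery_details",
+  "inputSchema": {
+    "type": "object",
+    "required": ["name"],
+    "properties": {
+      "name": { "type": "string", "description": "The customer's name" }
+    }
+  },
+  "steps": [
+    {
+      "type": "handoff",
+      "name": "ask_for_address",
+      "block": {
+        "type": "conversation",
+        "name": "get_address",
+        "instruction": "Ask the customer for their delivery address.",
+        "outputSchema": {
+          "type": "object",
+          "required": ["address"],
+          "properties": {
+            "address": { "type": "string", "description": "The customer's delivery address" }
+          }
+        }
+      }
+    }
+  ],
+  "outputSchema": {
+    "type": "object",
+    "required": ["address"],
+    "properties": {
+      "address": { "type": "string", "description": "The customer's delivery address" }
+    }
+  }
+}
+```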
\ No newline at end of file
diff --git a/blocks/steps.mdx b/blocks/steps.mdx
new file mode 100644
index 0000000..8de62e9
--- /dev/null
+++ b/blocks/steps.mdx
@@ -0,0 +1,69 @@
+---
+title: "Steps"
+sidebarTitle: "Steps"
+description: "Building and Controlling Conversation Flow for Your Assistants"
+---
+
+[**Steps**](https://api.vapi.ai/api#:~:text=HandoffStep) are the core building blocks that dictate how conversations progress in a bot interaction. Each Step represents a distinct point in the conversation where the bot performs an action, gathers information, or decides where to go next. Think of Steps as checkpoints in a conversation that guide the flow, manage user inputs, and determine outcomes.
+
+
+ Blocks is currently in beta. We're excited to have you try this new feature and welcome your [feedback](https://discord.com/invite/pUFNcf2WmH) as we continue to refine and improve the experience.
+
+
+#### Features
+
+- **Output:** The data or response expected from the step, as outlined in the block's `outputSchema`.
+- **Input:** The data necessary for the step to execute, defined in the block's `inputSchema`.
+- [**Destinations:**](https://api.vapi.ai/api#:~:text=StepDestination) Where the conversation goes after the step completes. The next step can follow a simple linear progression or be chosen based on criteria such as conditions or rules set within the Step. This enables dynamic decision-making, allowing the assistant to pick the next Step depending on what happens during the conversation (e.g., user input, a specific value, or a condition being met).
+
+#### Example
+
+```json
+{
+ "type": "handoff",
+ "name": "get_user_order",
+ "input": {
+ "name": "John Doe",
+ "email": "johndoe@example.com"
+ },
+ "destinations": [
+ {
+ "type": "step",
+ "stepName": "confirm_order",
+ "conditions": [
+ {
+ "type": "model-based",
+ "instruction": "If the user has provided an order"
+ }
+ ]
+ }
+ ],
+ "block": {
+ "name": "ask_for_order",
+ "type": "conversation",
+ "inputSchema": {
+ "type": "object",
+ "required": ["name", "email"],
+ "properties": {
+ "name": { "type": "string", "description": "The customer's name" },
+ "email": { "type": "string", "description": "The customer's email" }
+ }
+ },
+ "instruction": "Greet the customer and ask for their name and email. Then ask them what they'd like to order.",
+ "outputSchema": {
+ "type": "object",
+ "required": ["orders", "name"],
+ "properties": {
+ "orders": {
+ "type": "string",
+ "description": "The customer's order, e.g., 'burger with fries'"
+ },
+ "name": {
+ "type": "string",
+ "description": "The customer's name"
+ }
+ }
+ }
+ }
+}
+```
\ No newline at end of file
diff --git a/mint.json b/mint.json
index c5d9a83..faf6328 100644
--- a/mint.json
+++ b/mint.json
@@ -178,6 +178,14 @@
"assistants/background-messages"
]
},
+ {
+ "group": "Blocks",
+ "pages": [
+ "blocks",
+ "blocks/steps",
+ "blocks/block-types"
+ ]
+ },
{
"group": "Server URL",
"pages": [
@@ -540,5 +548,5 @@
"measurementId": "G-G6EN8MLZLK"
}
},
- "openapi": "https://api.vapi.ai/api-json"
+ "openapi": "./api.json"
}
diff --git a/static/images/blocks/food-order-steps.png b/static/images/blocks/food-order-steps.png
new file mode 100644
index 0000000..1f6f03d
Binary files /dev/null and b/static/images/blocks/food-order-steps.png differ