# Events

The events service in Flows provides a powerful way for blocks to communicate with each other through asynchronous event-based messaging. This allows for building complex workflows where different blocks can react to events emitted by other blocks.

Events in Flows are messages that can be emitted by one block and received by others. The events system is built around the block component model, where:

 - Blocks can define inputs (to receive events)
 - Blocks can define outputs (to emit events)
 - Events flow through a socket, from the output of one block to the connected inputs of another block

This creates a flexible system for building event-driven workflows.

## Emitting events
Events are emitted using the `events.emit` function, through outputs defined in a block’s component. This section covers how to define such outputs and how to use the emission function.

### Defining outputs
Before you emit an event, you need to define the output in your block’s schema. The output definition specifies the name, description, and type of events that will be emitted. Types can be simple (like “string”, “number”, “boolean”) or complex, typed using JSON Schema.


```
import { AppBlock } from "@slflows/sdk/v1";

export const sourceBlock: AppBlock = {
  name: "Event Source",
  outputs: {
    default: {
      name: "Record ID",
      description: "Emitted when a new record is created",
      type: "string",
    },
  },
};
```

### The `events.emit` function
The only way to emit events is through the `emit` function:



```
import { events } from "@slflows/sdk/v1";
events.emit(event: any, options?: EmitOptions): Promise<void>
```

**Arguments:**

 - `event`: The event payload (can be any serializable object)
 - `options`: Optional configuration for the event

**Options object:**


```
interface EmitOptions {
  outputKey?: string; // Key of the output to emit through
  parentEventId?: string; // ID of a parent event (for event chaining)
  secondaryParentEventIds?: string[]; // IDs of secondary parent events
  echo?: boolean; // Whether to echo the event back to the emitter
  complete?: string; // pendingId to complete when emitting
}
```

Most blocks react to events by emitting one or more events on one or more outputs. Some blocks, however, emit events from other triggers: incoming [HTTP requests](./http), [schedules](./scheduling), internal timers, [internal messages](./messaging), or GUI actions such as re-emitting a previous event.

### Basic usage
To emit a simple event:



```
await events.emit({ message: "Hello, World!" });
```

To emit on a specific output:


```
await events.emit(
  { status: "completed", data: result },
  { outputKey: "taskCompleted" },
);
```

**Mind the key**

When using `outputKey`, the key must match one of the outputs defined in your block’s component; providing an invalid `outputKey` throws an error. You don’t need to specify `outputKey` if your block has only one output.
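To make the multi-output case concrete, here is a hedged sketch of a block that defines two outputs and picks one at emit time. The block name, output keys, and the `isValid` check are illustrative assumptions, not part of the SDK:

```typescript
import { AppBlock, events } from "@slflows/sdk/v1";

// Hypothetical block with two outputs; it routes each incoming record to
// either the "valid" or "invalid" output.
export const validatorBlock: AppBlock = {
  name: "Record Validator",
  outputs: {
    valid: { name: "Valid record", type: "object" },
    invalid: { name: "Invalid record", type: "object" },
  },
  inputs: {
    default: {
      config: {
        record: { name: "Record", type: "object", required: true },
      },
      onEvent: async ({ event }) => {
        const record = event.inputConfig.record;
        // `isValid` stands in for whatever check the block performs
        const isValid = record != null && record.id != null;
        // Choose the output key at emit time based on the result
        await events.emit(
          { record },
          { outputKey: isValid ? "valid" : "invalid" },
        );
      },
    },
  },
};
```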

### Event parentage
#### Automatic parent assignment
When emitting an event directly from within an `onEvent` handler, the current event being handled is **automatically** assigned as the parent:


```
{
  inputs: {
    default: {
      onEvent: async (input) => {
        // ...process input.event...

        // The emitted event will automatically have the current event as its
        // parent. No need to explicitly set parentEventId here.
        await events.emit({ processedData });
      },
    },
  },
}
```

#### Manual parent assignment
However, in cases where you need to emit an event outside the direct flow of an event handler (such as in a [timer handler](./scheduling) or [HTTP callback](./http)), you must explicitly specify the parent event ID. Otherwise, the emitted event will not have any parentage information, meaning downstream blocks will not have access to data carried by upstream events.


```
{
  onTimer: async (input) => {
    const inputPayload = input.timer.payload;
    const { operationId, parentEventId } = inputPayload;

    // Check status of remote operation
    const status = await remoteApi.checkOperationStatus(operationId);

    if (status === "completed") {
      // Operation complete - get results
      const result = await remoteApi.getOperationResult(operationId);

      // Emit with explicit parent ID since we're in a timer handler
      await events.emit(
        { result },
        { parentEventId }, // Must be explicitly set here
      );
    } else if (status === "in_progress") {
      // Still processing, set another timer with the same payload
      await timers.block.set(30, inputPayload);
    } else {
      console.error("Operation failed or timed out");
    }
  },
}
```

## Receiving events
To receive events, blocks must define at least one input in their component and implement an `onEvent` handler for it.

### Configuring block inputs
In this example, we define a block with an input that listens for events. The input has a configuration parameter, `someValue`. It is required, meaning it must be filled out, either statically by the user or dynamically by referring to data from an upstream block.


```
import { AppBlock } from "@slflows/sdk/v1";

export const myBlock: AppBlock = {
  name: "Event Receiver",
  inputs: {
    default: {
      config: {
        someValue: {
          name: "Some value",
          type: "string",
          required: true,
        },
      },
      onEvent: async ({ app, block, event }) => {
        // In addition to app and block config, the event payload is
        // available here:
        const { id, inputConfig } = event;
        console.log(
          `Received event ${id} with some value: `,
          inputConfig.someValue,
        );
      },
    },
  },
};
```

## Advanced topics
### Event Echo
When an event is processed by a block that already processed one of its ancestors, we call the past event an *echo*. The echo feature provides a powerful way to automatically access previous events in a workflow that were emitted by the same block.

#### How Echo Works
The echo system automatically identifies the most recent event in an event’s ancestry that was emitted by the current block. It makes this previous event available as part of the input, eliminating the need to manually track relationships between events.

For example, with an HTTP Endpoint block:

 1. When a request comes in, the block emits an event (the outgoing event)
 1. Later, when a response event comes back to the block (the incoming event)
 1. The echo system automatically includes the original request event in the incoming event’s data

This solves a common challenge in request-response patterns: matching responses to their original requests without requiring manual tracking.

#### Using Echo
To enable echo for an emitted event:


```
await events.emit(
  { status: "requestInitiated", requestData: data },
  { echo: true }, // Enable echo for this event
);
```

When a block later receives an event that has one of its own events in its ancestry, the original event will be available in the input:


```
export const myBlock: AppBlock = {
  name: "Echo Receiver",
  inputs: {
    default: {
      config: {
        someValue: {
          name: "Some value",
          type: "string",
          required: true,
        },
      },
      onEvent: async ({ app, block, event }) => {
        const originalEvent = event.echo;

        // We can now access data from the original event
        const { requestId } = originalEvent.body.requestData;
        const outputId = originalEvent.outputId;

        console.log(
          `Received response for request ${requestId} originally sent on output ${outputId}`,
        );
      },
    },
  },
};
```

The echo contains additional metadata:

 - The original event body
 - The ID of the output on which the event was emitted

#### Benefits and Use Cases
Echo is particularly valuable for:

 - HTTP request/response patterns
 - Multi-step workflows where a block needs to match responses with original requests
 - Any scenario where maintaining context across multiple events is important

By automatically tracking event relationships, echo eliminates the need for developers to implement manual tracking systems and simplifies the user experience when building workflows. An excellent example of `echo` is the [Subroutine Definition](https://github.com/spacelift-io/flows-app-utilities/blob/main/blocks/subroutineDefinition.ts) block from the core Utilities app.

**`echo` is not automatic!**

Blocks need to explicitly request the `echo` capability in the `events.emit` options to receive echo data, as it requires additional processing.

### Secondary Parents
Events can have multiple parent relationships through secondary parents:


```
await events.emit(
  { status: "merged", result: mergedData },
  {
    outputKey: "dataMerged",
    parentEventId: primaryParentId,
    secondaryParentEventIds: [secondaryParent1Id, secondaryParent2Id],
  },
);
```

#### Purpose of secondary parents
Secondary parents serve a specialized purpose in the Flows event system. They are:

 - **Visual lineage tracking**: Secondary parents are primarily used for visualizing event relationships in the Event History view, showing which other events influenced or contributed to the current event.
 - **Documentation of relationships**: They help document complex event flows where multiple upstream events contribute to a single downstream event.
 - **No data availability to downstream blocks**: Unlike primary parents, secondary parent event data is NOT made available to downstream blocks that receive the event.

#### When to use secondary parents
Secondary parents are most useful in scenarios such as:

 - **Merge operations**: When a block combines data from multiple sources or events
 - **Aggregation patterns**: When collecting multiple events before producing a summary event
 - **Multi-input transformations**: When an operation depends on multiple input events
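As a hedged sketch of the merge case, suppose a handler has already collected the contributing events in a local `bufferedEvents` array (an illustrative assumption; how a block accumulates events is up to its implementation). The first event becomes the primary parent, so its data remains available downstream, while the rest are recorded as lineage only:

```typescript
// `bufferedEvents` is assumed to hold previously collected events,
// each with an `id` and a `body`.
const [first, ...rest] = bufferedEvents;

await events.emit(
  { merged: bufferedEvents.map((e) => e.body) },
  {
    parentEventId: first.id, // primary parent: data flows downstream
    secondaryParentEventIds: rest.map((e) => e.id), // lineage only
  },
);
```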

### Pending Events
Pending events represent future events that are expected to occur, providing visibility into asynchronous processing. They transform events from point-in-time occurrences into observable processes with duration, showing users what’s happening during long-running operations.

#### Purpose
Pending events address the observability gap that occurs when a block processes an event asynchronously. They show:

 - What’s happening with the event
 - When a response might be expected
 - Which output will emit the response
 - Preview data that will be included

#### Creating

```
const pendingId = await events.createPending({
  event?: Record<string, any>,           // Predicted event payload
  outputKey?: string,                    // Target output
  parentEventId?: string,                // Auto-populated in handlers
  secondaryParentEventIds?: string[],
  statusDescription?: string,            // User-readable status
});
```
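For example, a block starting a long-running export might create a pending event like this (the payload, output key, and status text are illustrative; the output key assumes the block defines a matching output):

```typescript
const pendingId = await events.createPending({
  event: { exportId },                 // illustrative predicted payload
  outputKey: "exportFinished",         // assumes this output exists on the block
  statusDescription: "Exporting records...",
});
```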

#### Updating

```
await events.updatePending(pendingId, {
  event?: Record<string, any>,
  outputKey?: string,
  statusDescription?: string,
  // ... other fields
});
```
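For example, to refresh the user-visible status as work progresses (the status text is illustrative):

```typescript
await events.updatePending(pendingId, {
  statusDescription: "Exported 500 of 1200 records...",
});
```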

#### Completing
Two approaches:


```
// 1. Complete with current data
await events.completePending(pendingId);

// 2. Complete while emitting
await events.emit(eventData, { complete: pendingId });
```

#### Canceling


```
await events.cancelPending(pendingId, "Reason for cancellation");
```

#### Timer integration
Link timers to pending events (see [Scheduling documentation](./scheduling) for more on timers):


```
await timers.block.set(seconds, {
  pendingEventId: pendingId,
  description: "Waiting...",
});
```

The pending event is then available in the timer handler:


```
onTimer: async ({ timer }) => {
  await events.completePending(timer.pendingEvent.id);
}
```

#### Example: Simple delay

```
{
  inputs: {
    default: {
      onEvent: async ({ event }) => {
        const pendingId = await events.createPending({
          event: { startedAt: Date.now() },
          statusDescription: "Sleeping...",
        });

        await timers.block.set(event.inputConfig.seconds, {
          pendingEventId: pendingId,
        });
      },
    },
  },
  onTimer: async ({ timer }) => {
    await events.completePending(timer.pendingEvent.id);
  },
}
```