Events

The events service in Flows provides a powerful way for blocks to communicate with each other through asynchronous event-based messaging. This allows for building complex workflows where different blocks can react to events emitted by other blocks.

Events in Flows are messages that can be emitted by one block and received by others. The events system is built around the block component model, where:

  • Blocks can define inputs (to receive events)
  • Blocks can define outputs (to emit events)
  • Events flow through a socket, from the output of one block to the connected inputs of another block

This creates a flexible system for building event-driven workflows.
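
As a minimal sketch (the block and its behavior are illustrative, not part of the SDK), a block that both receives and emits events could look like this:

import { AppBlock, events } from "@slflows/sdk/v1";

// Illustrative block: receives events on its default input and reacts by
// emitting a new event on its default output, which connected downstream
// blocks will receive.
export const relayBlock: AppBlock = {
  name: "Relay",
  inputs: {
    default: {
      onEvent: async ({ event }) => {
        // React to the incoming event by emitting the ID of the event
        // that was just handled.
        await events.emit(event.id);
      },
    },
  },
  outputs: {
    default: {
      name: "Relayed event ID",
      description: "Emitted whenever an event arrives on the input",
      type: "string",
    },
  },
};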

Events are emitted using the events.emit function, through outputs defined in a block’s component. This section covers how to define such outputs and how to use the emission function.

Before you emit an event, you need to define the output in your block’s schema. The output definition specifies the name, description, and type of events that will be emitted. Types can be simple (like “string”, “number”, “boolean”) or complex, typed using JSON Schema.

import { AppBlock } from "@slflows/sdk/v1";

export const sourceBlock: AppBlock = {
  name: "Event Source",
  outputs: {
    default: {
      name: "Record ID",
      description: "Emitted when a new record is created",
      type: "string",
    },
  },
};
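
For structured payloads, the type can be expressed as a JSON Schema instead of a simple type name. A sketch, with an illustrative output key and schema (not taken verbatim from the SDK):

import { AppBlock } from "@slflows/sdk/v1";

export const recordBlock: AppBlock = {
  name: "Record Source",
  outputs: {
    recordCreated: {
      name: "Record created",
      description: "Emitted when a new record is created",
      // Complex event payloads can be described with a JSON Schema.
      type: {
        type: "object",
        properties: {
          id: { type: "string" },
          createdAt: { type: "string", format: "date-time" },
        },
        required: ["id"],
      },
    },
  },
};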

The only way to emit events is through the emit function:

import { events } from "@slflows/sdk/v1";
events.emit(event: any, options?: EmitOptions): Promise<void>

Arguments:

  • event: The event payload (can be any serializable object)
  • options: Optional configuration for the event

Options object:

interface EmitOptions {
  outputKey?: string; // Key of the output to emit through
  parentEventId?: string; // ID of a parent event (for event chaining)
  secondaryParentEventIds?: string[]; // IDs of secondary parent events
  echo?: boolean; // Whether to echo the event back to the emitter
  complete?: string; // pendingId to complete when emitting
}

Most blocks will react to events by emitting one or more events on one or more outputs. However, some blocks emit events from other triggers, such as incoming HTTP requests, schedules, internal timers, internal messages, or user actions in the GUI (for example, re-emitting a previous event).

To emit a simple event:

await events.emit({ message: "Hello, World!" });

To emit on a specific output:

await events.emit(
  { status: "completed", data: result },
  { outputKey: "taskCompleted" },
);
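
A block can also emit several events, on the same or different outputs, while handling a single trigger. For instance (the output keys below are illustrative):

// Emit related events on two different outputs for the same trigger.
await events.emit(
  { status: "completed", data: result },
  { outputKey: "success" },
);
await events.emit(
  { action: "taskProcessed", at: new Date().toISOString() },
  { outputKey: "audit" },
);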

When emitting an event directly from within an onEvent handler, the current event being handled is automatically assigned as the parent:

{
  inputs: {
    default: {
      onEvent: async (input) => {
        // ...process input.event...

        // The emitted event will automatically have the current event as its
        // parent. No need to explicitly set parentEventId here.
        await events.emit({ processedData });
      },
    },
  },
}

However, in cases where you need to emit an event outside the direct flow of an event handler (such as in a timer handler or HTTP callback), you must explicitly specify the parent event ID. Otherwise, the emitted event will not have any parentage information, meaning downstream blocks will not have access to data carried by upstream events.

{
  onTimer: async (input) => {
    const inputPayload = input.timer.payload;
    const { operationId, parentEventId } = inputPayload;

    // Check status of remote operation
    const status = await remoteApi.checkOperationStatus(operationId);

    if (status === "completed") {
      // Operation complete - get results
      const result = await remoteApi.getOperationResult(operationId);

      // Emit with explicit parent ID since we're in a timer handler
      await events.emit(
        { result },
        { parentEventId }, // Must be explicitly set here
      );
    } else if (status === "in_progress") {
      // Still processing, set another timer with the same payload
      await timers.set(30, inputPayload);
    } else {
      console.error("Operation failed or timed out");
    }
  },
}
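
For context, the timer above would typically be armed from an onEvent handler when the remote operation is started, so that the payload carries both the operation ID and the triggering event's ID. A possible shape of that handler, assuming the same illustrative remoteApi and timer signature as above (see the Scheduling documentation for the timers API):

{
  inputs: {
    default: {
      onEvent: async ({ event }) => {
        // Start the long-running remote operation (illustrative API).
        const { operationId } = await remoteApi.startOperation(event.inputConfig);

        // Poll in 30 seconds; carry the operation ID and the current
        // event's ID so the timer handler can emit with the correct parent.
        await timers.set(30, {
          operationId,
          parentEventId: event.id,
        });
      },
    },
  },
}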

To receive events, blocks must define at least one input in their component and implement an onEvent handler for it.

In this example, we define a block with an input that listens for events. The input has a configuration parameter someValue. It is required, meaning that it has to be filled out - either statically by the user, or dynamically by referring to data from one of the upstream blocks.

import { AppBlock } from "@slflows/sdk/v1";

export const myBlock: AppBlock = {
  name: "Event Receiver",
  inputs: {
    default: {
      config: {
        someValue: {
          name: "Some value",
          type: "string",
          required: true,
        },
      },
      onEvent: async ({ app, block, event }) => {
        // In addition to app and block config, the event payload is available
        // here:
        const { id, inputConfig } = event;
        console.log(
          `Received event ${id} with some value: `,
          inputConfig.someValue,
        );
      },
    },
  },
};

When a block processes an event whose ancestry includes an event that the same block emitted earlier, that earlier event is called an echo. The echo feature provides a way to automatically access previous events in a workflow that were emitted by the same block.

The echo system automatically identifies the most recent event in an event’s ancestry that was emitted by the current block. It makes this previous event available as part of the input, eliminating the need to manually track relationships between events.

For example, with an HTTP Endpoint block:

  1. When a request comes in, the block emits an event (the outgoing event)
  2. Later, a response event comes back to the block (the incoming event)
  3. The echo system automatically includes the original request event in the incoming event’s data

This solves a common challenge in request-response patterns: matching responses to their original requests without requiring manual tracking.

To enable echo for an emitted event:

await events.emit(
  { status: "requestInitiated", requestData: data },
  { echo: true }, // Enable echo for this event
);

When a block later receives an event that has one of its own events in its ancestry, the original event will be available in the input:

import { AppBlock } from "@slflows/sdk/v1";

export const myBlock: AppBlock = {
  name: "Echo Receiver",
  inputs: {
    default: {
      config: {
        someValue: {
          name: "Some value",
          type: "string",
          required: true,
        },
      },
      onEvent: async ({ app, block, event }) => {
        const originalEvent = event.echo;

        // We can now access data from the original event
        const { requestId } = originalEvent.body.requestData;
        const outputId = originalEvent.outputId;

        console.log(
          `Received response for request ${requestId} originally sent on output ${outputId}`,
        );
      },
    },
  },
};

The echo contains additional metadata:

  • The original event body
  • The ID of the output on which the event was emitted

Echo is particularly valuable for:

  • HTTP request/response patterns
  • Multi-step workflows where a block needs to match responses with original requests
  • Any scenario where maintaining context across multiple events is important

By automatically tracking event relationships, echo eliminates the need for developers to implement manual tracking systems and simplifies the user experience when building workflows. An excellent example of echo is the Subroutine Definition block from the core Utilities app.

Events can have multiple parent relationships through secondary parents:

await events.emit(
  { status: "merged", result: mergedData },
  {
    outputKey: "dataMerged",
    parentEventId: primaryParentId,
    secondaryParentEventIds: [secondaryParent1Id, secondaryParent2Id],
  },
);

Secondary parents serve a specialized purpose in the Flows event system. They are:

  • Visual lineage tracking: Secondary parents are primarily used for visualizing event relationships in the Event History view, showing which other events influenced or contributed to the current event.

  • Documentation of relationships: They help document complex event flows where multiple upstream events contribute to a single downstream event.

  • No data availability to downstream blocks: Unlike primary parents, secondary parent event data is NOT made available to downstream blocks that receive the event.

Secondary parents are most useful in scenarios such as:

  • Merge operations: When a block combines data from multiple sources or events (sketched after this list)
  • Aggregation patterns: When collecting multiple events before producing a summary event
  • Multi-input transformations: When an operation depends on multiple input events
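
As a sketch of the merge case (the in-memory buffer and the event.body accessor are illustrative assumptions; a real block would typically persist such state through the platform rather than a module variable):

import { events } from "@slflows/sdk/v1";

// Illustrative buffer of contributing events, kept in a module variable
// only to keep the sketch short.
const buffered: { id: string; body: Record<string, any> }[] = [];

{
  inputs: {
    default: {
      onEvent: async ({ event }) => {
        buffered.push({ id: event.id, body: event.body });

        // Once both expected events have arrived, emit one merged event.
        if (buffered.length === 2) {
          const [first, second] = buffered;
          await events.emit(
            { status: "merged", result: { ...first.body, ...second.body } },
            {
              outputKey: "dataMerged",
              // The triggering event is assigned as primary parent
              // automatically inside onEvent; the earlier contributing
              // event is recorded as a secondary parent for lineage.
              secondaryParentEventIds: [first.id],
            },
          );
          buffered.length = 0;
        }
      },
    },
  },
}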

Pending events represent future events that are expected to occur, providing visibility into asynchronous processing. They transform events from point-in-time occurrences into observable processes with duration, showing users what’s happening during long-running operations.

Pending events address the observability gap that occurs when a block processes an event asynchronously. They show:

  • What’s happening with the event
  • When a response might be expected
  • Which output will emit the response
  • Preview data that will be included
To create a pending event, and later update it:

const pendingId = await events.createPending({
  event?: Record<string, any>, // Predicted event payload
  outputKey?: string, // Target output
  parentEventId?: string, // Auto-populated in handlers
  secondaryParentEventIds?: string[],
  statusDescription?: string, // User-readable status
});

await events.updatePending(pendingId, {
  event?: Record<string, any>,
  outputKey?: string,
  statusDescription?: string,
  // ... other fields
});
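
For example, a long-running operation might refresh the user-visible status as it progresses (the status text below is illustrative):

// Keep the pending event's status description current while work continues.
await events.updatePending(pendingId, {
  statusDescription: "Exported 250 of 1,000 records...",
});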

There are two ways to complete a pending event:

// 1. Complete with the current pending data
await events.completePending(pendingId);

// 2. Complete while emitting a new event
await events.emit(eventData, { complete: pendingId });

A pending event can also be cancelled with a reason:

await events.cancelPending(pendingId, "Reason for cancellation");

Link timers to pending events (see Scheduling documentation for more on timers):

await timers.block.set(seconds, {
  pendingEventId: pendingId,
  description: "Waiting...",
});

The pending event is then available in the timer handler:

onTimer: async ({ timer }) => {
  await events.completePending(timer.pendingEvent.id);
}
Putting these together, here is a complete block that creates a pending event when an event arrives and completes it when its timer fires:

{
  inputs: {
    default: {
      onEvent: async ({ event }) => {
        const pendingId = await events.createPending({
          event: { startedAt: Date.now() },
          statusDescription: "Sleeping...",
        });

        await timers.block.set(event.inputConfig.seconds, {
          pendingEventId: pendingId,
        });
      },
    },
  },
  onTimer: async ({ timer }) => {
    await events.completePending(timer.pendingEvent.id);
  },
}