Flow
`type: "io.kestra.plugin.core.trigger.Flow"`
Trigger a flow in response to a state change in one or more other flows.
You can trigger a flow as soon as another flow ends. This allows you to add implicit dependencies between multiple flows, which can often be managed by different teams.
If you don't provide any conditions, the flow will be triggered for EVERY execution of EVERY flow on your instance.
Examples
Trigger the `transform` flow after the `extract` flow finishes successfully. The `extract` flow generates a `last_ingested_date` output that is passed to the `transform` flow as an input. Here is the `extract` flow:
```yaml
id: extract
namespace: company.team

tasks:
  - id: final_date
    type: io.kestra.plugin.core.debug.Return
    format: "{{ execution.startDate | dateAdd(-2, 'DAYS') | date('yyyy-MM-dd') }}"

outputs:
  - id: last_ingested_date
    type: STRING
    value: "{{ outputs.final_date.value }}"
```
Below is the `transform` flow, triggered in response to the `extract` flow's successful completion.
```yaml
id: transform
namespace: company.team

inputs:
  - id: last_ingested_date
    type: STRING
    defaults: "2025-01-01"

variables:
  result: |
    Ingestion done in {{ trigger.executionId }}.
    Now transforming data up to {{ inputs.last_ingested_date }}

tasks:
  - id: run_transform
    type: io.kestra.plugin.core.debug.Return
    format: "{{ render(vars.result) }}"
  - id: log
    type: io.kestra.plugin.core.log.Log
    message: "{{ render(vars.result) }}"

triggers:
  - id: run_after_extract
    type: io.kestra.plugin.core.trigger.Flow
    inputs:
      last_ingested_date: "{{ trigger.outputs.last_ingested_date }}"
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
      - type: io.kestra.plugin.core.condition.ExecutionStatusCondition
        in:
          - SUCCESS
```
Properties
conditions
- Type: array
- SubType: Condition
- Dynamic: ❌
- Required: ❌
List of conditions used to limit when the flow trigger creates an execution.
inputs
- Type: object
- Dynamic: ❌
- Required: ❌
Pass the upstream flow's outputs to the inputs of the current flow.
The inputs allow you to pass a data object or a file to the downstream flow, as long as those outputs are defined at the flow level in the upstream flow.
Make sure that the inputs and task outputs defined in this Flow trigger match the outputs of the upstream flow. Otherwise, the downstream flow execution will not be created. If that happens, go to the Logs tab on the Flow page to understand the error.
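As a condensed sketch of that pairing (the flow-level output id `report_date` and the trigger id are illustrative, not part of the example above), the upstream flow declares a flow-level output and the downstream trigger maps it to an input of the same name:

```yaml
# Upstream flow fragment: declare the output at the flow level.
outputs:
  - id: report_date
    type: STRING
    value: "{{ outputs.some_task.value }}"

# Downstream flow fragment: map that output to an input of the same name.
triggers:
  - id: on_upstream_success
    type: io.kestra.plugin.core.trigger.Flow
    inputs:
      report_date: "{{ trigger.outputs.report_date }}"
```

If the names do not match on both sides, the downstream execution is not created.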
states
- Type: array
- SubType: string
- Dynamic: ❓
- Required: ❌
- Default:
[SUCCESS, WARNING, FAILED, KILLED, CANCELLED, RETRIED, SKIPPED]
List of execution states that will be evaluated by the trigger.
By default, only executions in a terminal state will be evaluated. Any `ExecutionStatusCondition`-type condition will be evaluated after the list of `states`. The trigger will be evaluated for each state change of matching executions. Keep in mind that if a flow has two `Pause` tasks, the execution will transition from `PAUSED` to a `RUNNING` state twice, once for each `Pause` task. A Flow trigger listening to the `PAUSED` state will be evaluated twice in this case.
Note that a Flow trigger cannot react to the `CREATED` state.
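For instance, a trigger that only reacts to failed or killed executions of the upstream flow could be sketched like this (the trigger id `on_upstream_failure` is illustrative):

```yaml
triggers:
  - id: on_upstream_failure
    type: io.kestra.plugin.core.trigger.Flow
    # Override the default terminal-state list to react only to these states.
    states:
      - FAILED
      - KILLED
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
```

Restricting `states` this way narrows evaluation before any `ExecutionStatusCondition` is checked.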
stopAfter
- Type: array
- SubType: string
- Dynamic: ❌
- Required: ❌
List of execution states after which a trigger should be stopped (a.k.a. disabled).
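As a sketch, the trigger below would be disabled after an execution it created ends in the `FAILED` state (the trigger id is illustrative):

```yaml
triggers:
  - id: run_after_extract
    type: io.kestra.plugin.core.trigger.Flow
    # Disable this trigger once a triggered execution reaches FAILED.
    stopAfter:
      - FAILED
```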
Outputs
executionId
- Type: string
- Required: ✔️
The execution ID that triggered the current flow.
flowId
- Type: string
- Required: ✔️
The flow ID whose execution triggered the current flow.
flowRevision
- Type: integer
- Required: ✔️
The flow revision that triggered the current flow.
namespace
- Type: string
- Required: ✔️
The namespace of the flow that triggered the current flow.
state
- Type: string
- Required: ✔️
- Possible Values:
CREATED
RUNNING
PAUSED
RESTARTED
KILLING
SUCCESS
WARNING
FAILED
KILLED
CANCELLED
QUEUED
RETRYING
RETRIED
SKIPPED
The execution state.
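The outputs above are exposed in the triggered flow under the `trigger` variable. A minimal sketch of logging them (the task id is illustrative):

```yaml
tasks:
  - id: log_trigger_context
    type: io.kestra.plugin.core.log.Log
    message: |
      Triggered by execution {{ trigger.executionId }}
      of flow {{ trigger.namespace }}.{{ trigger.flowId }} (revision {{ trigger.flowRevision }})
      in state {{ trigger.state }}
```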