Workflow Builder
A full visual automation engine — drag-and-drop ReactFlow canvas, 50+ integration nodes, AI-powered steps, cross-node variables, versioning with undo/redo, error handling with dead letter queues, and workflow templates.
Overview#
The Workflow Builder is the core automation engine in Mithrilis. It replaces manual, repetitive logistics processes with visual automations that your team can build, test, and deploy without writing code.
Built on ReactFlow, the builder gives you a full drag-and-drop canvas where you wire together triggers, logic, and actions into end-to-end workflows. Whether you're routing carrier alerts to Slack, classifying exceptions with AI, or syncing shipment data across systems — the workflow builder handles it.
Canvas and Node System#
The canvas is the workspace where you design your automation. It behaves like a whiteboard: pan, zoom, and drag nodes around to organize your workflow visually.
Node Palette#
The left sidebar contains a searchable palette of all available nodes, organized into two categories:
- Input nodes — Triggers that start your workflow (webhooks, cron schedules, email)
- Output nodes — Actions that perform work (send Slack messages, make HTTP requests, query databases, run AI models)
Drag a node from the palette onto the canvas to add it, then connect nodes by drawing edges between their input and output ports.
Node Execution#
You can test individual nodes without running the entire workflow. Click any non-trigger node to execute it in isolation:
- The node border changes color based on the execution result — blue while running, green on success, red on failure
- A status indicator transitions from grey (pending) to blue (running) to the final state
- Execution results are shown inline so you can inspect the output before connecting downstream nodes
Test as you build
Run individual nodes frequently as you build your workflow. This catches configuration errors early and lets you verify that each step produces the data you expect before wiring everything together.
Consistent Design#
All integration nodes share a unified visual style with provider logos, making it easy to identify what each step does at a glance. The canvas maintains a clean aesthetic regardless of how complex your workflow gets.
Triggers#
Every workflow starts with a trigger — the event that kicks off execution. Three trigger types are supported, each configurable directly in the editor.
Webhook#
A POST-only HTTP endpoint that Mithrilis generates for your workflow. Point any external system at this URL to trigger the workflow when events occur. For simplicity, the endpoint performs no request verification — just send a POST request with a JSON body.
Common use case: Carrier tracking webhooks that fire when a shipment status changes.
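Because the endpoint accepts a plain JSON POST, any system that can make an HTTP request can trigger the workflow. A minimal sketch in TypeScript, where the URL and payload field names are hypothetical (copy the real URL from the trigger's configuration panel):

```typescript
// Hypothetical webhook URL: copy the real one from the trigger's config panel.
const WEBHOOK_URL = "https://example.com/hooks/wf_abc123";

// A carrier status-change event; field names here are illustrative.
const event = {
  carrier_name: "Acme Freight",
  tracking_number: "1Z999AA10123456784",
  status: "DELAYED",
};

// The endpoint is POST-only and expects a JSON body; no auth headers required.
async function fireWebhook(): Promise<number> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
  return res.status;
}
```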
Cron Schedule#
Time-based execution for workflows that need to run on a recurring schedule. Configure using standard cron syntax or a visual scheduler.
Common use case: A nightly workflow that checks for shipments missing proof-of-delivery and creates exceptions for any that are overdue.
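For reference, the nightly check described above maps to a one-line cron expression; the sketch below just unpacks standard cron syntax (whether you type it or use the visual scheduler, the schedule is the same five fields):

```typescript
// "0 2 * * *" = run at 02:00 every day.
// Cron fields, left to right: minute, hour, day-of-month, month, day-of-week.
const nightly = "0 2 * * *";

const [minute, hour, dayOfMonth, month, dayOfWeek] = nightly.split(" ");
// minute "0" and hour "2" pin the run to 02:00; the three "*" fields
// mean every day of the month, every month, any day of the week.
```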
Email#
An SMTP/IMAP trigger that watches for incoming emails and starts the workflow when a new message arrives. Includes intent classification to filter emails by type before processing.
Common use case: Parsing carrier rate confirmation emails and extracting structured data into your system.
Trigger limits
Each workflow can have exactly one trigger. If you need the same automation to respond to multiple event types, create separate workflows for each trigger and have them call a shared sub-workflow via webhook.
Variables#
The variable system is what makes workflows powerful — it lets data flow from one node to the next, so each step can use the output of previous steps.
How Variables Work#
When a node executes, its output becomes available as a variable that any downstream node can reference. For example, if a webhook trigger receives carrier data, the AI classification node can reference webhook.body.carrier_name in its prompt.
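Conceptually, a reference like webhook.body.carrier_name is a dotted path into the upstream node's output. A minimal sketch of that lookup — illustrative only, not the builder's internals:

```typescript
// Illustrative sketch of dotted-path variable resolution.
type NodeOutputs = Record<string, unknown>;

function resolveVariable(outputs: NodeOutputs, path: string): unknown {
  return path.split(".").reduce<unknown>(
    (value, key) =>
      value && typeof value === "object"
        ? (value as Record<string, unknown>)[key]
        : undefined,
    outputs,
  );
}

// Output of an upstream webhook trigger node (sample data).
const outputs: NodeOutputs = {
  webhook: { body: { carrier_name: "Acme Freight", status: "DELAYED" } },
};

resolveVariable(outputs, "webhook.body.carrier_name"); // → "Acme Freight"
```

Missing paths resolve to undefined rather than throwing, which is the behavior you want when a branch of the workflow hasn't produced data yet.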
Variable Picker#
Variables are inserted via a picker UI that appears when you're editing any textarea in a node configuration panel:
- Collapsible tree — Variables are organized by source node, with a "collapse all" button for complex workflows
- Live preview — See the actual message data flowing through each variable, not just the variable name
- Non-blocking UI — The picker appears outside the node menu so it doesn't obstruct the configuration panel
AI Nodes#
AI-powered steps bring intelligence into your automations. Built on the Vercel AI SDK with OpenRouter integration, AI nodes let you classify, extract, summarize, and transform data using large language models.
Configuration#
- Model selector — Choose from OpenAI (GPT-4o, GPT-4o-mini), Anthropic (Claude Sonnet, Claude Haiku), and other providers. Each model displays its provider's logo for easy identification.
- JSON output mode — When enabled, the AI returns structured JSON instead of free text, making it easy to use the output in downstream conditional logic
- System prompt — Define the AI's behavior and output format using a system prompt with variable interpolation
Common use case: Classify incoming carrier exceptions by severity and category, then route critical ones to Slack while batching lower-priority ones into a daily digest email.
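To make JSON output mode concrete, here is a sketch of a system prompt with an interpolated variable and the kind of structured result a downstream conditional can branch on. The schema, field names, and interpolation syntax are assumptions for illustration — you define the real output shape in your own system prompt:

```typescript
// Illustrative schema; the real shape is whatever your system prompt specifies.
interface ExceptionClassification {
  severity: "critical" | "high" | "low";
  category: string;
}

// A system prompt interpolating a variable from the webhook trigger
// (interpolation syntax shown here is illustrative).
const systemPrompt = `Classify the carrier exception below.
Respond with JSON: {"severity": "critical" | "high" | "low", "category": string}.
Exception: {{webhook.body.exception_text}}`;

// Sample model output with JSON output mode enabled.
const raw = '{"severity": "critical", "category": "missed_pickup"}';
const result: ExceptionClassification = JSON.parse(raw);

// Downstream conditional logic can branch directly on the parsed fields.
const destination =
  result.severity === "critical" ? "#ops-critical" : "daily digest";
```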
Conditional Logic#
The conditional node supports Switch Map logic — a redesigned branching UI that lets you route data down different paths based on conditions.
Define multiple branches, each with its own condition expression. The workflow follows the first branch whose condition evaluates to true. A default "else" branch catches anything that doesn't match.
Example: Route exceptions to different Slack channels based on severity:
- If severity == "critical" → #ops-critical
- If severity == "high" → #ops-alerts
- Else → #ops-general
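In code terms, first-match branching behaves like the sketch below — illustrative, not the conditional node's actual implementation:

```typescript
// A branch pairs a condition with a routing target.
interface Branch<T> {
  when: (input: T) => boolean;
  target: string;
}

// Check branches in order; the first matching branch wins.
function route<T>(input: T, branches: Branch<T>[], fallback: string): string {
  for (const b of branches) {
    if (b.when(input)) return b.target;
  }
  return fallback; // the default "else" branch
}

const bySeverity: Branch<{ severity: string }>[] = [
  { when: (e) => e.severity === "critical", target: "#ops-critical" },
  { when: (e) => e.severity === "high", target: "#ops-alerts" },
];

route({ severity: "high" }, bySeverity, "#ops-general"); // → "#ops-alerts"
```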
Versioning and Deployment#
Version History#
Every save creates a new version. The floating header shows which version you're editing in real time, and you can browse the full version history to see how the workflow has evolved.
Draft vs. Published#
Workflows exist in one of two states:
- Draft — Only executes via the Test button. Safe for experimentation.
- Published — Runs automatically when the trigger fires. This is the live version.
This separation means you can iterate on a workflow without affecting production traffic. When you're ready, hit Publish to make the latest draft the active version.
Other Features#
- Undo/redo — Full canvas operation history with keyboard shortcuts (Cmd+Z / Cmd+Shift+Z)
- Workflow templates — Start from pre-built templates for common automation patterns (carrier alert routing, exception triage, daily digest)
- Duplicate prevention — The system prevents creating workflows with duplicate names
Before you publish
Always test your workflow thoroughly in draft mode before publishing. Use the Test button with realistic sample data to verify every branch and error condition.
Error Handling#
A dedicated error handling system ensures failed workflows don't go unnoticed.
Dead Letter Queue#
Failed workflow executions are captured in a dead letter queue (DLQ). From the DLQ UI, you can:
- Inspect the full execution trace to see which node failed and why
- Replay failed executions after fixing the underlying issue
- Discard executions that are no longer relevant
Notifications#
- Error badges — Visible in the workflow list and editor header when a workflow has recent failures
- Toast notifications — In-app alerts when a running workflow encounters an error
- Email notifications — Automatic failure emails sent when a workflow errors out in production
- Credit handling — Credits are not deducted for executions that fail to complete
Shipment Upserter#
A workflow-wide toggle (enabled by default) that automatically upserts shipment data as workflows process logistics events. When a workflow handles a carrier webhook, the shipment upserter ensures the associated shipment record in Mithrilis is created or updated with the latest data.
You can disable the upserter per-workflow from the floating header system menu — useful for workflows that process non-shipment data.
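The upsert itself follows standard create-or-update semantics: insert the record if it's new, otherwise merge the latest fields over the existing ones. A sketch under assumed field names, keyed here by tracking number:

```typescript
// Illustrative shipment record; real field names in Mithrilis may differ.
interface Shipment {
  trackingNumber: string;
  status?: string;
  eta?: string;
}

// Create the record if it's new, otherwise merge the latest fields over it.
function upsertShipment(store: Map<string, Shipment>, incoming: Shipment): Shipment {
  const existing = store.get(incoming.trackingNumber);
  const merged = existing ? { ...existing, ...incoming } : { ...incoming };
  store.set(incoming.trackingNumber, merged);
  return merged;
}

const store = new Map<string, Shipment>();
upsertShipment(store, { trackingNumber: "T1", eta: "2025-06-01" });
upsertShipment(store, { trackingNumber: "T1", status: "DELIVERED" });
// One record for T1: status updated, earlier eta preserved.
```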
Getting Started#
Create a new workflow
Click New Workflow from the Workflows page. Give it a descriptive name (e.g., "Late Pickup Alert") and choose a trigger type.
Configure the trigger
Set up your trigger. For a webhook, copy the generated URL and configure your carrier or TMS to send events to it. For cron, set the schedule. For email, connect your mailbox (SMTP/IMAP) credentials.
Add processing nodes
Drag nodes from the palette onto the canvas. Connect them in the order you want data to flow. Use the variable picker to reference upstream data in each node's configuration.
Test and publish
Click Test Run to execute the workflow with sample data. Inspect the results at each node. When everything looks right, click Publish to make it live.