What is Flow Core?

Nadoo Flow Core is a flexible workflow orchestration framework with multi-backend support designed for production use. It’s the open-source foundation of the Nadoo AI platform.
Current Version: v0.1.0 (Beta). Flow Core is actively maintained and available as open source.

Key Features

The features below reflect the actual implementation in the source code:

  • Type-Safe - Full Pydantic v2 support with runtime type validation and automatic serialization
  • Async-First - Built on asyncio for high-performance concurrent execution
  • Minimal Dependencies - Only 2 core dependencies: Pydantic and typing-extensions
  • Composable - Modular nodes with clear interfaces using the pipe operator (|)
  • Observable - Built-in execution tracking, callbacks, and streaming support
  • Flexible - Conditional branching, parallel execution, and dynamic routing
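To see what pipe-operator composition means in practice, here is a minimal, self-contained sketch of the technique. It is a hypothetical illustration, not Flow Core's actual implementation; the `Node` and `Chain` classes exist only in this example:

```python
import asyncio

class Node:
    """Minimal composable node: subclasses implement run()."""
    async def run(self, data):
        raise NotImplementedError

    def __or__(self, other):
        # `a | b` builds a chain that feeds a's output into b
        return Chain(self, other)

class Chain(Node):
    def __init__(self, first, second):
        self.first, self.second = first, second

    async def run(self, data):
        return await self.second.run(await self.first.run(data))

class Upper(Node):
    async def run(self, data):
        return {"text": data["text"].upper()}

class Reverse(Node):
    async def run(self, data):
        return {"text": data["text"][::-1]}

print(asyncio.run((Upper() | Reverse()).run({"text": "hello"})))  # {'text': 'OLLEH'}
```

Overloading `__or__` is all it takes to make `|` act as a composition operator, which is why the resulting workflow definitions read left to right.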

Core Components

The framework consists of 14 main modules comprising roughly 6,500 lines of production code:

Essential Modules

| Module | Purpose | Key Classes |
| --- | --- | --- |
| base.py | Core types and execution engine | WorkflowExecutor, NodeContext, IStepNode |
| chain.py | Fluent composition API | ChainableNode, NodeChain, FunctionNode |
| parallel.py | Concurrent execution | ParallelNode, FanOutFanInNode |
| callbacks.py | Observability system | CallbackManager, BaseCallbackHandler |
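The observability pattern behind a callback manager can be sketched generically. This is an illustrative stand-in, not the real `CallbackManager` from callbacks.py; the handler signature here is an assumption:

```python
import asyncio

class CallbackManager:
    """Fan lifecycle events out to registered handlers (sketch)."""
    def __init__(self):
        self.handlers = []

    def register(self, handler):
        self.handlers.append(handler)

    async def emit(self, event, payload):
        # Every handler sees every event, in registration order
        for handler in self.handlers:
            await handler(event, payload)

events = []

async def recorder(event, payload):
    events.append((event, payload))

async def run_step(manager, name):
    await manager.emit("node_start", name)
    # ... the node body would execute here ...
    await manager.emit("node_end", name)

manager = CallbackManager()
manager.register(recorder)
asyncio.run(run_step(manager, "uppercase"))
print(events)  # [('node_start', 'uppercase'), ('node_end', 'uppercase')]
```

Because handlers are plain coroutines, the same mechanism can drive logging, metrics, or UI progress updates without the nodes knowing who is listening.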

Advanced Features

| Module | Purpose | Use Case |
| --- | --- | --- |
| resilience.py | Retry and fallback patterns | Handle transient failures |
| streaming.py | Real-time event streaming | UI updates, progress tracking |
| memory.py | Chat history management | Conversational AI |
| caching.py | Response caching | Performance optimization |
| rate_limiting.py | Token bucket algorithm | API quota management |
| batch.py | Batch processing | Bulk operations |
| tools.py | Tool management | LLM function calling |
| parsers.py | Output parsing | Structured extraction |
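The token bucket algorithm named in the table is simple enough to sketch in a few lines. This is a generic illustration of the technique, not the code in rate_limiting.py:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: `rate` tokens/sec, burst up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost=1.0):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# Burst of 5 is allowed immediately; further calls wait for refill
bucket = TokenBucket(rate=0.5, capacity=5)
results = [bucket.allow() for _ in range(7)]
print(results)
```

The capacity bounds the burst size while the rate bounds the sustained throughput, which is exactly the shape API quota limits usually take.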

Quick Example

Here’s a simple workflow demonstrating the core concepts:

```python
import asyncio

from nadoo_flow import ChainableNode, FunctionNode, NodeResult

# Define a custom node
class UppercaseNode(ChainableNode):
    async def execute(self, node_context, workflow_context):
        text = node_context.input_data.get("text", "")
        return NodeResult(
            success=True,
            output={"text": text.upper()}
        )

# Chain nodes using the pipe operator
workflow = (
    UppercaseNode()
    | FunctionNode(lambda x: {"text": x["text"][::-1]})
)

# Execute (run() is a coroutine, so drive it from an event loop)
async def main():
    result = await workflow.run({"text": "hello"})
    print(result)  # Output: {'text': 'OLLEH'}

asyncio.run(main())
```

Multi-Backend Architecture

Note: Currently only the native backend is implemented. LangGraph, CrewAI, and A2A backends are planned for future releases.
Flow Core uses a protocol-based design that allows swapping execution backends:
```python
from nadoo_flow.backends import BackendRegistry

# Use the native backend (currently the only available option)
backend = BackendRegistry.create("native")

# Future: other backends will be available
# backend = BackendRegistry.create("langgraph")  # Coming soon
# backend = BackendRegistry.create("crewai")     # Coming soon
```
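The protocol-based registry design mentioned above can be sketched generically. The names below (`Backend`, `NativeBackend`, `Registry`) are hypothetical and only illustrate the pattern, not Flow Core's actual `BackendRegistry`:

```python
from typing import Callable, Dict, Protocol

class Backend(Protocol):
    """Structural interface: anything with execute() qualifies."""
    def execute(self, steps: list) -> object: ...

class NativeBackend:
    def execute(self, steps):
        # Run each step in order, threading the value through
        value = None
        for step in steps:
            value = step(value)
        return value

class Registry:
    _factories: Dict[str, Callable[[], Backend]] = {}

    @classmethod
    def register(cls, name, factory):
        cls._factories[name] = factory

    @classmethod
    def create(cls, name) -> Backend:
        try:
            return cls._factories[name]()
        except KeyError:
            raise ValueError(f"unknown backend: {name}") from None

Registry.register("native", NativeBackend)
backend = Registry.create("native")
print(backend.execute([lambda _: 2, lambda x: x * 3]))  # 6
```

Because `Backend` is a `typing.Protocol`, new backends need no inheritance from a base class; registering a factory under a new name is enough to make it swappable.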

Comparison with LangChain

| Aspect | Nadoo Flow Core | LangChain/LangGraph |
| --- | --- | --- |
| Dependencies | 2 (minimal) | 50+ (heavy) |
| API Style | run() / stream() | ainvoke() / astream() |
| Chaining | Pipe operator + Graph | LCEL (complex) |
| Target Audience | Platform builders | AI researchers |
| Learning Curve | Gentle | Steep |
| Customization | Build your own | Use pre-built |

When to Use Flow Core?

✅ Flow Core is Perfect For:

  • Building no-code/low-code platforms - Full control over workflow execution
  • Custom AI agent development - Create exactly what you need
  • Production systems - Minimal dependencies reduce security risks
  • Microservices - Lightweight footprint perfect for containers
  • Educational projects - Clear abstractions easy to understand

❌ Consider Alternatives If:

  • You need pre-built integrations (200+ in LangChain)
  • You want to prototype quickly with standard patterns
  • You prefer batteries-included frameworks
  • You’re building research experiments rather than production systems

Installation

```bash
# Basic installation
pip install nadoo-flow-core

# With optional features
pip install "nadoo-flow-core[cel]"  # CEL expression support
```

Project Structure

The complete source code is organized as:
```text
packages/nadoo-flow-core/
├── src/nadoo_flow/          # Main source code
│   ├── __init__.py          # Public API exports
│   ├── base.py              # Core classes (20KB)
│   ├── chain.py             # Chaining API (6KB)
│   ├── parallel.py          # Parallel execution (19KB)
│   ├── callbacks.py         # Observability (18KB)
│   ├── memory.py            # Chat history (11KB)
│   ├── streaming.py         # Event streaming (11KB)
│   ├── resilience.py        # Retry/fallback (12KB)
│   ├── backends/            # Multi-backend support
│   └── ...                  # Other modules
├── examples/                # 7 working examples
├── tests/                   # Comprehensive test suite
└── pyproject.toml           # Project configuration
```

Available Examples

Learn from working code in the examples directory:
  1. Basic Chat Bot - Chat with memory management
  2. Streaming Chat - Real-time token streaming
  3. Parallel Search - Concurrent data fetching
  4. RAG Pipeline - Retrieval-augmented generation
  5. Advanced Features - Retry, fallback, caching
  6. Human-in-the-Loop - Workflow interruption and approval
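The human-in-the-loop pattern from the last example (pausing a workflow until a person approves) can be sketched with a plain `asyncio.Event`. This is a generic illustration of the technique, not the example's actual code:

```python
import asyncio

async def workflow(approval: asyncio.Event, log: list):
    log.append("drafted")
    # Pause here until a human (another task) grants approval
    await approval.wait()
    log.append("published")

async def main():
    approval = asyncio.Event()
    log = []
    task = asyncio.create_task(workflow(approval, log))
    await asyncio.sleep(0)          # let the workflow reach the approval gate
    log.append("human approved")    # a reviewer signs off
    approval.set()
    await task
    return log

print(asyncio.run(main()))  # ['drafted', 'human approved', 'published']
```

The workflow task suspends at the gate without blocking the event loop, so other work (including the approval UI itself) keeps running in the meantime.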

Development Status

✅ Implemented

  • Core workflow engine
  • Chaining API
  • Parallel execution
  • Streaming support
  • Memory management
  • Resilience patterns
  • Native backend
🚧 In Progress

  • Additional backends (LangGraph, CrewAI)
  • CEL expression evaluation
  • Visual debugger
  • Performance optimizations
📋 Planned

  • Distributed execution
  • Workflow versioning
  • Advanced monitoring
  • Cloud deployment tools
