ComfyUI AgentAZAll — Persistent Memory & Multi-Agent Workflows


20 custom nodes that give ComfyUI agents persistent memory, cross-agent messaging, multi-turn chat, and LLM tool use — all running 100% locally.

No vector database. No embedding model. No cloud dependency. Memories survive restarts. Agents message each other through a file-based mailbox. Works with any OpenAI-compatible LLM endpoint: Ollama, LM Studio, llama.cpp, vLLM.


Quick Install

Option A — Git clone (recommended):

cd ComfyUI/custom_nodes
git clone https://github.com/cronos3k/comfyui-agentazall.git
# restart ComfyUI — 20 nodes appear under "AgentAZAll/"

Option B — ComfyUI Manager: Search "AgentAZAll" in the Manager and click Install.

Dependency: pip install "agentazall>=1.0.13" — quote the version spec so your shell doesn't treat >= as a redirect (auto-installs PyNaCl for Ed25519 signing)


7 Demo Workflows Included

Ready-to-load JSON files in workflows/. Drag & drop into ComfyUI.

1. Character Design Studio

Consistent character creation for Stable Diffusion. The art director agent remembers every character it has designed and cross-references them for visual coherence across prompts.

Setup → Recall("character") → TextCombine → LLM → ToolParser → TextDisplay
                                ↑
                          SystemPrompt (art director persona)

Nodes: 7 · Key feature: Cross-run memory — design Elara in run 1, and when you design Thorne in run 2 the agent automatically references Elara's visual style.


2. Novel Crafter (Dual-Agent)

Two-agent writing pipeline. The Writer drafts chapters, stores plot outlines, and sends drafts to the Editor via Relay for constructive feedback.

┌─ WRITER (writer@localhost) ─────────────────────────┐
│ Setup → Note → Recall → TextCombine → LLM →        │
│ ToolParser → TextDisplay + Relay ──── ✉ ──────────┐ │
└─────────────────────────────────────────────────────┘ │
┌─ EDITOR (editor@localhost) ──────────────────────┐   │
│ Setup → Inbox ← ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ┘   │
│ SystemPrompt → LLM → TextDisplay                │
└──────────────────────────────────────────────────┘

Nodes: 16 · Agents: 2 · Key feature: Cross-agent messaging via file-based Relay.


3. Story → Video Pipeline

Break a story into visual scenes for animation. The director generates scripts + SD prompts, then BatchSplitter separates each scene for downstream image generation.

Nodes: 12 · Key feature: BatchSplitter splits on ## Scene headers, outputting individual scenes.
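The header-based split can be approximated in a few lines of Python (an illustrative sketch of the assumed logic, not the node's actual source):

```python
def split_scenes(text: str, delimiter: str = "## Scene") -> list[str]:
    """Split a script on scene headers, keeping each header with its body."""
    parts = text.split(delimiter)
    # parts[0] is any preamble before the first header; keep only real scenes
    return [delimiter + p for p in parts[1:]]


script = "Intro notes\n## Scene 1\nA moonlit forest.\n## Scene 2\nA ruined castle.\n"
scenes = split_scenes(script)
# Each element is now a self-contained scene ready for downstream image prompts.
```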


4. Multi-Turn Chat

Persistent conversation that survives ComfyUI queue resets. History is saved to disk as JSON and reloaded automatically — true multi-turn sessions.

ChatHistory(load+add user msg) → PromptTemplate → LLM → ToolParser
                                                           ↓
                                          TextDisplay + ChatHistory(save assistant msg)

Nodes: 8 · Key feature: Two ChatHistory nodes create a load→respond→save loop.
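The load→respond→save loop amounts to reading and rewriting a single JSON file between queue runs. A sketch of the assumed mechanics (function names are hypothetical; the on-disk location follows the memory layout described below):

```python
import json
from pathlib import Path


def load_history(path: Path) -> list[dict]:
    """Load prior turns, or start fresh if no history file exists yet."""
    if path.exists():
        return json.loads(path.read_text(encoding="utf-8"))
    return []


def append_turn(path: Path, role: str, content: str) -> list[dict]:
    """Add one message and persist the full history back to disk."""
    history = load_history(path)
    history.append({"role": role, "content": content})
    path.write_text(json.dumps(history, indent=2), encoding="utf-8")
    return history
```

Because the file is rewritten on every turn, the conversation survives both queue resets and full ComfyUI restarts.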


5. Agent Debate

Two LLM agents face off on a topic. Agent A argues FOR and sends its argument via Relay. Agent B reads the inbox and writes a rebuttal.

Nodes: 14 · Agents: 2 · Key feature: Real message passing — no shared variables.


6. World Builder

Incrementally build a game world. Each queue run adds a location, character, or item — all stored in memory and cross-referenced for consistency.

Nodes: 8 · Key feature: PromptTemplate with {a}=type {b}=name {c}=style variables.
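The {a}/{b}/{c} substitution boils down to simple placeholder replacement. A sketch under assumed semantics (hypothetical helper, not the node's source — e.g. whether unfilled placeholders are kept or blanked may differ):

```python
def fill_template(template: str, a: str = "", b: str = "", c: str = "", d: str = "") -> str:
    """Substitute {a}..{d} placeholders; other braces are left untouched."""
    for key, val in (("a", a), ("b", b), ("c", c), ("d", d)):
        template = template.replace("{" + key + "}", val)
    return template


prompt = fill_template("Add a {a} named {b} in {c} style",
                       a="location", b="Ironhold", c="grimdark")
```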


7. RAG Pipeline

Store documents with Remember, retrieve with Recall, answer with LLM, and route errors gracefully with ConditionalRouter.

Remember(doc) → Recall(query) → PromptTemplate → LLM → ConditionalRouter
                                                            ├─ Error Display
                                                            └─ Success Display

Nodes: 9 · Key feature: ConditionalRouter splits on "LLM_ERROR" for graceful error handling.
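The routing step is a keyword/regex match that sends the text down exactly one branch. An illustrative sketch of the assumed behavior (not the node's actual implementation):

```python
import re


def route(text: str, pattern: str = "LLM_ERROR") -> tuple[str, str]:
    """Return (matched, unmatched); the text flows to exactly one output."""
    if re.search(pattern, text):
        return text, ""   # matched branch → error display
    return "", text       # unmatched branch → success display
```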


All 20 Nodes

Config (3)

| Node | Description |
|---|---|
| AZAll Setup | Initialize agent config — address, mailbox directory |
| AZAll Who Am I | Get or set agent identity description |
| AZAll Current Task | Get or set current task status |

Memory (3)

| Node | Description |
|---|---|
| AZAll Remember | Store a persistent memory with searchable title |
| AZAll Recall | Search memories by keyword across all dates |
| AZAll Note | Read/write named notes (plot outlines, character sheets) |

Messaging (5)

| Node | Description |
|---|---|
| AZAll Send Message | Send a message to another agent |
| AZAll Inbox | List all received messages |
| AZAll Read Message | Read a specific message by ID |
| AZAll Relay | Deliver outbox messages to recipient inboxes (local) |
| AZAll Chat History | Persistent multi-turn conversation (JSON on disk) |

Integration (3)

| Node | Description |
|---|---|
| AZAll System Prompt | Assemble rich context (identity + memories + inbox + tools) |
| AZAll Tool Parser | Parse [TOOL:...]...[/TOOL] from LLM output and execute |
| AZAll File Browser | Show agent's mailbox directory as a tree |
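The [TOOL:...]...[/TOOL] convention lends itself to a simple regex pass. A sketch of the assumed extraction step (execution of the parsed calls is omitted, and the exact grammar may differ from the node's):

```python
import re

TOOL_RE = re.compile(r"\[TOOL:(?P<name>[\w.-]+)\](?P<args>.*?)\[/TOOL\]", re.DOTALL)


def parse_tool_calls(llm_output: str) -> list[tuple[str, str]]:
    """Extract (tool_name, argument_text) pairs from raw LLM output."""
    return [(m.group("name"), m.group("args").strip())
            for m in TOOL_RE.finditer(llm_output)]
```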

LLM (1)

| Node | Description |
|---|---|
| AZAll LLM | Call any OpenAI-compatible API (Ollama, LM Studio, llama.cpp, vLLM) |

Utility (5)

| Node | Description |
|---|---|
| AZAll Text Display | Display text output in the workflow |
| AZAll Text Combine | Join two text inputs with a configurable separator |
| AZAll Prompt Template | {a}, {b}, {c}, {d} variable substitution |
| AZAll Conditional Router | Route text by keyword/regex → matched vs. unmatched |
| AZAll Batch Splitter | Split text by delimiter → first section + all sections + count |

How Memory Works

Everything is stored as plain text files — inspectable, portable, and git-friendly:

agentazall_data/
  mailboxes/
    art-director@localhost/
      2026-03-14/
        inbox/
          msg-abc123.txt       ← "From: editor@localhost | Review notes"
        outbox/
          msg-def456.txt       ← pending delivery
        sent/
          msg-def456.txt       ← delivered by Relay
        remember/
          character-elara.txt  ← "Fire mage, flowing red hair, arcane tattoos..."
          character-thorne.txt
        notes/
          plot-outline.txt
          chat-history/
            my-chat.json       ← multi-turn conversation
  • Memories persist between queue runs and ComfyUI restarts
  • Two agents in the same workflow can message each other via Relay
  • Connect a transport (FTP, Email, HTTPS) for cross-machine agent communication
  • Ed25519 signing for message authentication
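Because memories are just text files under this layout, a Remember operation reduces to a slugged write into the dated folder. An illustrative sketch (the directory names follow the tree above, but the slugging rules and function name are assumptions):

```python
import re
from datetime import date
from pathlib import Path


def remember(root: Path, agent: str, title: str, body: str) -> Path:
    """Write a memory as a plain-text file under today's dated mailbox folder."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    mem_dir = root / "mailboxes" / agent / date.today().isoformat() / "remember"
    mem_dir.mkdir(parents=True, exist_ok=True)
    path = mem_dir / f"{slug}.txt"
    path.write_text(body, encoding="utf-8")
    return path
```

A keyword Recall would then be a walk over `remember/` directories across all dates, matching on filename or contents.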

Requirements

  • Python 3.10+
  • agentazall>=1.0.13
  • Any LLM endpoint for the LLM node (recommended: ollama serve or llama-server)
  • ComfyUI 0.3.x


License

AGPL-3.0 — see LICENSE for details.