State Transforms¶
S factories return dict transforms that compose with >> as zero-cost workflow nodes. They manipulate session state between agent steps without making LLM calls.
flowchart TB
INPUT["<b>Input state</b><br>{ web, scholar, _debug }"] --> DROP
DROP["S.drop('_debug')<br>→ { web, scholar }"] --> MERGE
MERGE["S.merge('web','scholar', into='research')<br>→ { web, scholar, research }"] --> DEFAULT
DEFAULT["S.default(confidence=0.0)<br>→ { web, scholar, research, confidence }"] --> RENAME
RENAME["S.rename(research='input')<br>→ { web, scholar, input, confidence }"] --> PICK
PICK["S.pick('input','confidence')<br>→ { input, confidence }"] --> OUTPUT
OUTPUT["<b>Output state</b><br>{ input, confidence }"]
style INPUT fill:#FFF3E0,stroke:#E65100,color:#1A1A1A
style OUTPUT fill:#ecfdf5,stroke:#10b981,color:#1A1A1A
style DROP fill:#fef2f2,stroke:#e94560,color:#1A1A1A
style MERGE fill:#e0f2fe,stroke:#0ea5e9,color:#1A1A1A
style DEFAULT fill:#FFF8E1,stroke:#f59e0b,color:#1A1A1A
style RENAME fill:#f3e5f5,stroke:#a78bfa,color:#1A1A1A
style PICK fill:#ecfdf5,stroke:#10b981,color:#1A1A1A
Composition operators:
S.drop() >> S.merge() # chain: run in sequence
S.default() + S.rename() # combine: apply to same state
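The two operators can be modeled at the dict level in plain Python. This is an illustrative sketch only, not the actual adk_fluent implementation, and the real `+` operator's conflict handling may differ:

```python
# Illustrative sketch only -- not the adk_fluent implementation.
# A transform is modeled as a function from state dict to state dict.

class Transform:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, state):
        return self.fn(state)

    def __rshift__(self, other):
        # t1 >> t2: chain -- feed t1's output state into t2
        return Transform(lambda s: other(self(s)))

    def __add__(self, other):
        # t1 + t2: combine -- in this naive sketch both are simply
        # applied in order; the real operator may resolve key
        # conflicts differently
        return Transform(lambda s: other(self(s)))

drop_debug = Transform(lambda s: {k: v for k, v in s.items() if k != "_debug"})
add_conf = Transform(lambda s: {**s, "confidence": s.get("confidence", 0.0)})

chained = drop_debug >> add_conf
print(chained({"web": 1, "_debug": True}))  # {'web': 1, 'confidence': 0.0}
```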
Basic Usage¶
from adk_fluent import S
pipeline = (
    (web_agent | scholar_agent)
    >> S.merge("web", "scholar", into="research")
    >> S.default(confidence=0.0)
    >> S.rename(research="input")
    >> writer_agent
)
Transform Reference¶
| Factory | Purpose |
|---|---|
| `S.pick(*keys)` | Keep only specified keys |
| `S.drop(*keys)` | Remove specified keys |
| `S.rename(**kw)` | Rename keys |
| `S.default(**kw)` | Fill missing keys |
| `S.merge(*keys, into=)` | Combine keys |
| `S.transform(key, fn)` | Map a single value |
| `S.compute(**fns)` | Derive new keys |
| `S.guard(pred)` | Assert invariant |
| `S.log(*keys)` | Debug-print |
S.pick(*keys)¶
Keep only the specified keys in state, dropping everything else:
# After this, state only contains "name" and "email"
pipeline = agent >> S.pick("name", "email") >> next_agent
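The dict-level behavior can be sketched as follows (an assumption about the semantics, not the library source):

```python
# Hypothetical dict-level sketch of S.pick -- not the library source.
def pick(*keys):
    return lambda state: {k: state[k] for k in keys if k in state}

t = pick("name", "email")
print(t({"name": "Ada", "email": "ada@example.com", "_tmp": 1}))
# {'name': 'Ada', 'email': 'ada@example.com'}
```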
S.drop(*keys)¶
Remove the specified keys from state:
# Remove temporary/internal keys before the next step
pipeline = agent >> S.drop("_internal", "_debug") >> next_agent
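A plausible dict-level sketch of the semantics (not the library source):

```python
# Hypothetical dict-level sketch of S.drop -- not the library source.
def drop(*keys):
    return lambda state: {k: v for k, v in state.items() if k not in keys}

t = drop("_internal", "_debug")
print(t({"result": 42, "_internal": "x", "_debug": True}))
# {'result': 42}
```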
S.rename(**kw)¶
Rename keys in state. Each keyword argument maps an old name to a new name:
# Rename "research" to "input" for the next agent
pipeline = researcher >> S.rename(research="input") >> writer
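A plausible dict-level sketch of the semantics (not the library source):

```python
# Hypothetical dict-level sketch of S.rename -- not the library source.
def rename(**kw):  # kw maps old key name -> new key name
    return lambda state: {kw.get(k, k): v for k, v in state.items()}

t = rename(research="input")
print(t({"research": "findings", "other": 1}))
# {'input': 'findings', 'other': 1}
```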
S.default(**kw)¶
Fill in missing keys with default values. Existing keys are not overwritten:
# Ensure "confidence" exists with a default of 0.0
pipeline = agent >> S.default(confidence=0.0) >> evaluator
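The "existing keys win" behavior falls out naturally from dict unpacking order in this sketch (an assumption, not the library source):

```python
# Hypothetical dict-level sketch of S.default -- not the library source.
def default(**kw):
    # unpacking order makes existing state win over defaults
    return lambda state: {**kw, **state}

t = default(confidence=0.0)
print(t({"answer": "yes"}))    # {'confidence': 0.0, 'answer': 'yes'}
print(t({"confidence": 0.9}))  # {'confidence': 0.9}
```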
S.merge(*keys, into=)¶
Combine multiple keys into a single key:
# Merge "web" and "scholar" results into "research"
pipeline = (
    (web_agent | scholar_agent)
    >> S.merge("web", "scholar", into="research")
    >> writer
)
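One plausible reading of the semantics, grouping the named keys into a sub-dict under `into` while leaving the originals in place (as the flowchart at the top of this page suggests). The grouping strategy is a guess; the library may combine values differently:

```python
# Assumed semantics for S.merge -- grouping into a sub-dict is a guess;
# the library may combine values differently (e.g. concatenation).
def merge(*keys, into):
    return lambda state: {**state, into: {k: state[k] for k in keys if k in state}}

t = merge("web", "scholar", into="research")
print(t({"web": "W", "scholar": "S"}))
# {'web': 'W', 'scholar': 'S', 'research': {'web': 'W', 'scholar': 'S'}}
```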
S.transform(key, fn)¶
Apply a function to transform a single value in state:
# Uppercase the "title" value
pipeline = agent >> S.transform("title", str.upper) >> next_agent
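A plausible dict-level sketch of the semantics (not the library source):

```python
# Hypothetical dict-level sketch of S.transform -- not the library source.
def transform(key, fn):
    return lambda state: {**state, key: fn(state[key])}

t = transform("title", str.upper)
print(t({"title": "state transforms"}))
# {'title': 'STATE TRANSFORMS'}
```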
S.compute(**fns)¶
Derive new keys by applying functions to the state:
# Compute a new "summary_length" key from the existing state
pipeline = agent >> S.compute(
    summary_length=lambda s: len(s.get("summary", "")),
    has_citations=lambda s: "cite" in s.get("text", "").lower(),
) >> evaluator
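A plausible dict-level sketch of the semantics, where each function receives the whole state (an assumption, not the library source):

```python
# Hypothetical dict-level sketch of S.compute -- not the library source.
def compute(**fns):
    # each fn receives the whole state and produces one new value
    return lambda state: {**state, **{k: fn(state) for k, fn in fns.items()}}

t = compute(summary_length=lambda s: len(s.get("summary", "")))
print(t({"summary": "four"}))
# {'summary': 'four', 'summary_length': 4}
```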
S.guard(pred)¶
Assert an invariant on the state. Raises an error if the predicate fails:
# Ensure "data" key is present before proceeding
pipeline = agent >> S.guard(lambda s: "data" in s) >> processor
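A plausible sketch of the semantics. The actual error type the library raises is not documented here, so `ValueError` is used purely for illustration:

```python
# Hypothetical sketch of S.guard -- not the library source; the real
# error type is unknown, ValueError is used here for illustration.
def guard(pred):
    def check(state):
        if not pred(state):
            raise ValueError("state guard failed")
        return state  # passes state through unchanged
    return check

t = guard(lambda s: "data" in s)
print(t({"data": 1}))  # {'data': 1}
```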
S.log(*keys)¶
Debug-print specified keys from state. Useful during development:
# Print "web" and "scholar" values for debugging
pipeline = (
    (web_agent | scholar_agent)
    >> S.log("web", "scholar")
    >> S.merge("web", "scholar", into="research")
    >> writer
)
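A plausible sketch of the semantics: print the requested keys, then pass the state through untouched (an assumption, not the library source):

```python
# Hypothetical sketch of S.log -- not the library source.
def log(*keys):
    def show(state):
        for k in keys:
            print(f"{k} = {state.get(k)!r}")
        return state  # pass state through unchanged
    return show

t = log("web", "scholar")
out = t({"web": "W", "scholar": "S"})
```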
STransform Composition¶
All S.xxx() methods return STransform objects that support two composition operators:
>> — Chain (sequential)¶
# Clean state then merge — runs in order
cleanup = S.drop("_internal") >> S.merge("web", "scholar", into="research")
pipeline = agent >> cleanup >> writer
+ — Combine (parallel)¶
# Apply multiple transforms to the same state at once
setup = S.default(confidence=0.0) + S.rename(research="input")
pipeline = agent >> setup >> writer
Composing with agents¶
STransform objects work seamlessly in pipelines:
# STransform >> Agent works directly
pipeline = S.capture("user_message") >> classifier >> handler
Key tracking¶
STransforms track which state keys they read and write:
t = S.rename(old="new")
print(t._reads_keys) # frozenset({'old'})
print(t._writes_keys) # frozenset({'new'})
This metadata powers the contract checker’s data-flow analysis.
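One way such metadata could be carried is sketched below. This is illustrative only; apart from `_reads_keys` and `_writes_keys`, every name here is an assumption, not the adk_fluent source:

```python
from dataclasses import dataclass

# Illustrative sketch of key tracking -- not the adk_fluent source.
@dataclass
class TrackedTransform:
    fn: object
    _reads_keys: frozenset = frozenset()
    _writes_keys: frozenset = frozenset()

def rename(**kw):
    return TrackedTransform(
        fn=lambda s: {kw.get(k, k): v for k, v in s.items()},
        _reads_keys=frozenset(kw),
        _writes_keys=frozenset(kw.values()),
    )

t = rename(old="new")
print(t._reads_keys)   # frozenset({'old'})
print(t._writes_keys)  # frozenset({'new'})
```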
Complete Example¶
The following pipeline demonstrates multiple transforms woven between agents:
Complete transform flow:
┌─ web ─────┐
└─ scholar ─┘  (parallel)
      │
S.log ──► S.merge ──► S.default ──► S.rename
debug     combine     fill gaps    rewire
      │
writer @ Report
      │
S.guard(confidence > 0) ── fail? ──► raise
from pydantic import BaseModel
from adk_fluent import Agent, S

class Report(BaseModel):
    title: str
    body: str
    confidence: float

pipeline = (
    (
        Agent("web").model("gemini-2.5-flash").instruct("Search web.")
        | Agent("scholar").model("gemini-2.5-flash").instruct("Search papers.")
    )
    >> S.log("web", "scholar")                       # Debug
    >> S.merge("web", "scholar", into="research")    # Combine
    >> S.default(confidence=0.0)                     # Default
    >> S.rename(research="input")                    # Rename
    >> Agent("writer").model("gemini-2.5-flash").instruct("Write.") @ Report
    >> S.guard(lambda s: s.get("confidence", 0) > 0) # Assert
)