Daily News Collector
Project Link
Positioning
This is a news collection and daily briefing project rewritten on top of Agently v4. Checked against the current v4.0.8.3 source, it is no longer just a Search + Browse + TriggerFlow example: it now serves as a fuller reference that combines parent-flow/child-flow composition, `capture` / `write_back`, `runtime_resources`, and structured outputs in one project.
1. Layered architecture
How to read this diagram
`news_collector/` is the application assembly layer, while the actual process logic lives in `workflow/`. `workflow/` is now clearly split into the parent flow, the per-column sub flow, report-level chunks, and column-level chunks, instead of hiding the whole column pipeline inside one oversized handler.
2. Runtime flow
Design rationale
The key point is no longer “call a generate-column function inside a loop.” The current shape is:
- the parent flow owns `prepare_request -> generate_outline -> for_each(column) -> render_report`
- `column_sub_flow` owns `search -> pick -> summarize -> write_column`
- parent and child communicate explicitly through `capture` / `write_back`
- `render_report` receives the converged list of column results after `end_for_each()`
That is also why this project is now a stronger v4.0.8.3 reference:
- internal `for_each` nodes can be expanded in Mermaid
- nested `sub_flow` structure can be shown inline
- downstream parent nodes no longer get rendered inside the loop group by mistake
3. Structural layers
The project is split into four layers:
- `news_collector/` — app integration layer for config, CLI, and Agently setup
- `workflow/` — TriggerFlow definition and chunk implementations, including the parent flow, column sub flow, report-level chunks, and column-level chunks
- `tools/` — Search/Browse adapter layer
- `prompts/` — structured output contracts
4. v4.0.8.3 capabilities it actually uses
4.1 Parent/child flow composition plus concurrent fan-out
In workflow/daily_news.py, the parent flow is:
- `prepare_request`
- `generate_outline`
- `for_each(concurrency=settings.workflow.column_concurrency)`
- `to_sub_flow(column_sub_flow, capture=..., write_back=...)`
- `end_for_each()`
- `render_report`
The column_sub_flow is itself an independent chain:
`search_column_news -> pick_column_news -> summarize_column_news -> write_column`
So the project directly uses:
- `TriggerFlow.for_each(concurrency=...)`
- `TriggerFlow.to_sub_flow(...)`
- `capture` to pass the parent flow's current `value`, `runtime_data.request`, and `logger / search_tool / browse_tool` into the child flow explicitly
- `write_back={"value": "result"}` to return the child flow's final result into the parent loop item output
- `end_for_each()` to converge all per-column results for `render_report`
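The fan-out / fan-in shape described above can be illustrated with a small self-contained `asyncio` sketch. This is a toy re-implementation of the pattern, not Agently's actual API: `column_sub_flow`, the `captured` dict, and the `{"value": ...}` write-back mapping here only mimic the roles of `to_sub_flow`, `capture`, and `write_back`.

```python
import asyncio

async def column_sub_flow(item, captured):
    # Child flow: works only on what the parent explicitly captured for it.
    request, logger = captured["request"], captured["logger"]
    logger.append(f"processing {item} for {request}")
    return f"column:{item}"  # final child result, written back to the parent

async def parent_flow(columns, request):
    logger = []
    # Fan-out: one child flow per column, bounded by a concurrency limit
    # (a stand-in for settings.workflow.column_concurrency).
    sem = asyncio.Semaphore(2)

    async def run_one(item):
        async with sem:
            # "capture": the parent passes data into the child explicitly...
            captured = {"request": request, "logger": logger}
            result = await column_sub_flow(item, captured)
            # "write_back": ...and maps the child's result into the loop item output.
            return {"value": result}

    # Fan-in: gather converges all per-column results for the render step.
    results = await asyncio.gather(*(run_one(c) for c in columns))
    return [r["value"] for r in results]

print(asyncio.run(parent_flow(["tech", "sports"], "daily brief")))
# ['column:tech', 'column:sports']
```

The key property the sketch shares with the real flow: the child never reads parent state implicitly, and the parent only sees the child's declared result.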
4.2 runtime_resources injection
In news_collector/collector.py, the flow is wired through:
```python
self.flow.update_runtime_resources(
    logger=self.logger,
    search_tool=search_tool,
    browse_tool=browse_tool,
)
```

This injects dependencies into the execution runtime instead of capturing them in closures. The parent flow then forwards those resources to `column_sub_flow` explicitly through `capture["resources"]`.
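The difference between closure capture and runtime injection can be sketched with a minimal toy flow class. `ToyFlow`, `run_chunk`, and `search_chunk` are invented for illustration; only the `update_runtime_resources` name mirrors the project, and the real Agently internals may differ.

```python
class ToyFlow:
    """Minimal stand-in for a flow that carries injected runtime resources."""

    def __init__(self):
        self._resources = {}

    def update_runtime_resources(self, **resources):
        # Dependencies live in runtime state, not in handler closures,
        # so handlers stay importable and testable in isolation.
        self._resources.update(resources)

    def run_chunk(self, handler, value):
        # Every chunk handler receives the same injected resources.
        return handler(value, self._resources)

def search_chunk(value, resources):
    # The handler declares what it needs; nothing is smuggled in via closure.
    resources["logger"].append(f"searching: {value}")
    return resources["search_tool"](value)

flow = ToyFlow()
flow.update_runtime_resources(
    logger=[],
    search_tool=lambda q: [f"hit for {q}"],
)
print(flow.run_chunk(search_chunk, "ai news"))  # ['hit for ai news']
```

Because the resources are injected, a test can swap in a fake `search_tool` without monkeypatching module globals.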
4.3 Built-in Search and Browse
tools/builtin.py wraps:
- `Search`
- `Browse`
Its Browse setup directly uses:
- `enable_playwright`
- `response_mode`
- `max_content_length`
- `min_content_length`
- `playwright_headless`
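Put together, those option names suggest a configuration shape roughly like the following. The values (and the `"text"` mode string) are purely illustrative assumptions, not taken from the project's actual settings:

```python
# Illustrative Browse configuration using the option names listed above;
# all values here are example assumptions, not the project's real defaults.
browse_options = {
    "enable_playwright": True,      # render JS-heavy pages via Playwright
    "playwright_headless": True,    # run the browser without a visible window
    "response_mode": "text",        # hypothetical value controlling output shape
    "max_content_length": 20000,    # truncate overly long extracted pages
    "min_content_length": 200,      # skip pages with too little usable text
}
```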
4.4 Structured output contracts
In workflow/report_chunks.py and workflow/column_chunks.py, outline generation, story selection, summarization, and column writing all rely on YAML prompts plus structured result constraints to keep step interfaces stable.
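The idea of a structured result constraint can be sketched as a declared contract that each step's output is checked against. The contract fields (`columns`, `date`) and the `validate` helper below are invented for illustration; the project's real prompt contracts live in its YAML files.

```python
# A toy structured-output contract: each step declares the shape it returns,
# so downstream chunks can rely on stable keys. Field names are illustrative.
OUTLINE_CONTRACT = {
    "columns": list,   # list of column topics to generate
    "date": str,       # briefing date as a string
}

def validate(result: dict, contract: dict) -> dict:
    """Fail fast if a model result drifts from the declared contract."""
    for key, expected_type in contract.items():
        if key not in result:
            raise ValueError(f"missing key: {key}")
        if not isinstance(result[key], expected_type):
            raise TypeError(f"{key} should be {expected_type.__name__}")
    return result

outline = validate({"columns": ["tech", "sports"], "date": "2024-05-01"}, OUTLINE_CONTRACT)
print(outline["columns"])  # ['tech', 'sports']
```

The point of the pattern: a malformed model response fails loudly at the step boundary instead of silently corrupting a later chunk.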
4.5 ${ENV.xxx} + auto_load_env=True
In news_collector/collector.py, the setup uses:
```python
Agently.set_settings(
    self.settings.model.provider,
    model_settings,
    auto_load_env=True,
)
```

and SETTINGS.yaml uses `${ENV.xxx}` placeholders.
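The placeholder mechanism can be sketched as a small resolver that substitutes `${ENV.NAME}` with the matching environment variable. This is a minimal re-implementation of the idea behind `auto_load_env=True`; Agently's own resolution rules (nesting, defaults, error handling) may differ, and `NEWS_API_KEY` is a made-up variable name.

```python
import os
import re

def resolve_env_placeholders(value: str) -> str:
    """Replace ${ENV.NAME} placeholders with environment variable values."""
    def sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable not set: {name}")
        return os.environ[name]
    return re.sub(r"\$\{ENV\.([A-Za-z_][A-Za-z0-9_]*)\}", sub, value)

os.environ["NEWS_API_KEY"] = "secret-123"  # hypothetical variable for the demo
print(resolve_env_placeholders("api_key: ${ENV.NEWS_API_KEY}"))
# api_key: secret-123
```

Keeping secrets as placeholders means SETTINGS.yaml can be committed while the actual keys stay in the environment.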
5. Why this is now a more complete TriggerFlow reference
In the current v4.0.8.3 code, Daily News Collector covers all of the following in one place:
- report-level parent flow orchestration
- `for_each` fan-out / fan-in
- `sub_flow` extraction and reuse
- explicit `capture` / `write_back` boundaries
- `runtime_resources` propagation across parent and child flows
- Mermaid visualization for loop internals and nested child-flow structure
So it is no longer just a tool-integration example. It is now closer to a real TriggerFlow composition reference for production-style projects.
6. Current typing style
The project's chunks and helper functions now consistently use:
```python
from agently import TriggerFlowRuntimeData
```

That means:

- the project no longer depends on the older `TriggerFlowEventData` compatibility alias
- the runtime handler signature now matches the current documentation recommendation
- parent data such as `request` is passed into the child flow through explicit `capture`, then read from `runtime_data` inside the child flow rather than through implicit shared state
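The capture-then-read discipline can be sketched with a toy runtime-data object. The `RuntimeData` class and `child_handler` below are invented to mirror the names in the text; the real `TriggerFlowRuntimeData` API may look different.

```python
# Toy illustration of the capture-then-read pattern: the parent puts data
# into an explicit runtime-data mapping, and the child reads only from it.
class RuntimeData:
    def __init__(self, captured: dict):
        self._data = dict(captured)  # child sees an explicit copy, not shared state

    def get(self, key):
        return self._data[key]

def child_handler(runtime_data: RuntimeData) -> str:
    # Read what the parent explicitly captured; no module-level globals.
    request = runtime_data.get("request")
    return f"handling {request}"

# Parent side: explicit capture instead of implicit shared state.
runtime_data = RuntimeData({"request": "tech briefing"})
print(child_handler(runtime_data))  # handling tech briefing
```

Because every dependency of the child is visible at the capture site, the data flow between parent and child can be audited in one place.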