AI Self-Growth System
Aggregation as a Service: Sell Shovels on the Gold Rush Road
"In an age of information overload, filters are more valuable than content."
What you will get in this chapter
- Three forms of aggregation products and their scenarios
- Aggregation SOP (collect -> score -> summarize -> publish)
- Key metrics and acceptance criteria
One-sentence definition
Aggregation as a Service = continuous collection + strict filtering + stable output.
Users are not buying information, they are buying certainty and time.
Minimum viable aggregation system (MVS)
| Step | You need | Acceptance result |
|---|---|---|
| Sources | 3-5 vertical sources | New inputs every day |
| Scoring | One CurateScore model | Keep only the Top 10% |
| Output | One fixed column | Publish weekly and consistently |
| Distribution | Newsletter/site | Users can subscribe |
Qualified signal: publish consistently for 4 straight weeks without missing.
Three forms of aggregation products
- Weekly Digest: weekly summary, best for beginners
- Directory: tool/resource library, best for long-term SEO
- Dashboard: real-time intelligence, best for high-frequency scenarios
Aggregation SOP (standard process)
- Collect: RSS/API/crawlers into one pipeline
- Score: AI scoring + manual review
- Summarize: one-sentence summary + target audience
- Format: fixed template (column structure)
- Publish: fixed cadence and time
- Review: click data feeds back into scoring
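The six steps above can be sketched as one pass through a pipeline. This is a minimal illustration, not a specific library: the `Item` fields, the `rate` stub, and the cutoff value are all assumptions standing in for your real sources and scoring model.

```python
from dataclasses import dataclass

# Hypothetical item record flowing through the pipeline; all names are illustrative.
@dataclass
class Item:
    title: str
    url: str
    score: float = 0.0
    summary: str = ""

def rate(item: Item) -> float:
    """Score stub: a real pipeline would call an AI model plus manual review."""
    return 80.0 if "AI" in item.title else 50.0

def collect(sources: list) -> list:
    """Collect: RSS/API/crawler results funneled into one list of items."""
    return [Item(title=t, url=u) for t, u in sources]

def score(items: list, cutoff: float = 70.0) -> list:
    """Score: rate every item and keep only those at or above the cutoff."""
    for item in items:
        item.score = rate(item)
    return [i for i in items if i.score >= cutoff]

def summarize(items: list) -> list:
    """Summarize: attach a one-sentence summary and target audience (stubbed)."""
    for item in items:
        item.summary = f"One-line takeaway of '{item.title}' (for: practitioners)"
    return items

def publish(items: list) -> str:
    """Format + Publish: render the fixed column template as Markdown lines."""
    return "\n".join(f"- [{i.title}]({i.url}): {i.summary}" for i in items)
```

The Review step closes the loop outside this sketch: click data on published links feeds back into whatever model `rate` stands in for.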
CurateScore (filtering model)
Scoring dimensions
- Novelty: how new it is
- Usefulness: practical value
- Fit: fit to your domain
- Signal: source authority
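The four dimensions above combine into one weighted score. A minimal sketch, assuming each dimension is rated on a 0-1 scale so the weighted total lands in 0-100 (the weights match the formula in the next subsection; the Top-10% helper is an illustration of the MVS "keep only the Top 10%" rule):

```python
# Weights per the chapter's formula; they sum to 100.
WEIGHTS = {"usefulness": 30, "fit": 25, "signal": 25, "novelty": 20}

def curate_score(usefulness: float, fit: float, signal: float, novelty: float) -> float:
    """Weighted sum of the four dimensions; each input must be in [0, 1]."""
    dims = {"usefulness": usefulness, "fit": fit, "signal": signal, "novelty": novelty}
    if not all(0.0 <= v <= 1.0 for v in dims.values()):
        raise ValueError("each dimension must be rated between 0 and 1")
    return sum(WEIGHTS[k] * v for k, v in dims.items())

def top_decile(scored: list) -> list:
    """'Keep only the Top 10%': return the best tenth of (item, score) pairs."""
    ranked = sorted(scored, key=lambda pair: pair[1], reverse=True)
    keep = max(1, len(ranked) // 10)  # always keep at least one item
    return ranked[:keep]
```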
Scoring formula (0-100)
CurateScore = 30*Usefulness + 25*Fit + 25*Signal + 20*Novelty
Core metrics (must track)
Definition (default):
- Time window: unless stated otherwise, use the last 7 days rolling.
- Data source: use one trusted source (GA4/GSC/platform console/logs) and keep it consistent.
- Scope: only the current product/channel, exclude self-tests and bots.
| Metric | Meaning | Pass line |
|---|---|---|
| Open Rate | Share of delivered issues opened | >= 30% |
| Click Rate | Share of delivered issues with at least one click | >= 5% |
| Retention | Share of subscribers still active after 4 weeks | >= 60% |
| Unsub Rate | Unsubscribes per delivered issue | <= 2% |
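Given raw counts from your chosen data source, the four metrics and their pass lines can be checked mechanically. A hedged sketch; the count names and the specific rate definitions (all relative to delivered issues) are illustrative assumptions, not from any analytics API:

```python
# Pass lines from the table above, expressed as fractions.
PASS_LINES = {"open_rate": 0.30, "click_rate": 0.05, "retention": 0.60, "unsub_rate": 0.02}

def weekly_metrics(delivered: int, opened: int, clicked: int,
                   unsubscribed: int, week1_subs: int, week4_subs: int) -> dict:
    """Compute the four core metrics from raw counts for one 7-day window."""
    return {
        "open_rate": opened / delivered,
        "click_rate": clicked / delivered,
        "retention": week4_subs / week1_subs,  # 4-week retention
        "unsub_rate": unsubscribed / delivered,
    }

def passes(metrics: dict):
    """Unsub Rate must stay at or under its line; the others must meet theirs."""
    checks = {k: (metrics[k] <= line if k == "unsub_rate" else metrics[k] >= line)
              for k, line in PASS_LINES.items()}
    return all(checks.values()), checks
```

Keeping the computation in one place enforces the consistency rule above: one data source, one window, one definition per metric.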
Acceptance checklist
- At least 3 sources, covering the core topics
- CurateScore runs end-to-end and filters consistently
- Fixed columns and a fixed publishing time are set
Common mistakes
- Too broad -> no vertical focus
- No filtering -> low signal-to-noise
- No cadence -> users do not build habit
Community case addendum (from developer communities)
The following are public community shares. Metrics are self-reported or taken from public pages and are not independently verified:
- HN Show HN: Cursor Directory aggregates Cursor rules into a directory. The author says the first version shipped in 3 hours; MCP support, a trending board, and rule generation were added later. Directory aggregation plus a fixed entry point is a typical "sell shovels" model. Link: https://news.ycombinator.com/item?id=43412295
Summary
Key takeaways
1. The value of aggregation comes from filtering, not quantity.
2. Fixed columns and cadence build user habit.
3. Use CurateScore to keep quality and consistency.
In the next chapter, we will cover the Trend Prediction Engine: catching future hotspots early.