---
name: spike-summary
description: Documents the results of a time-boxed technical or design exploration (spike). Use after completing a spike to capture learnings, findings, and recommendations for the team.
license: Apache-2.0
metadata:
  category: coordination
  frameworks: [triple-diamond, lean-startup, design-thinking]
  author: product-on-purpose
  version: "1.0.0"
---
# Spike Summary
A spike summary documents the results of a time-boxed exploration — a focused investigation to reduce uncertainty before committing to implementation. Spikes answer specific questions like "Can we integrate with this API?" or "Is this technology viable for our use case?" The summary captures findings so the team can make informed decisions without the spike participants needing to repeat explanations.
## When to Use
- After completing a time-boxed technical exploration
- When evaluating technology choices or vendor options
- After proof-of-concept work that needs to inform team decisions
- When investigating feasibility of a proposed solution
- Before committing engineering resources to a new approach
## Instructions
When asked to document a spike, follow these steps:
1. **State the Question Clearly.** Articulate the specific question the spike was designed to answer. Good spike questions are focused and answerable within the available time-box. If the question evolved during the spike, document both the original and final versions.

2. **Define the Time-Box.** Document the time allocated (e.g., 3 days) and the actual time spent. If the spike exceeded its time-box, explain why and note any remaining work.

3. **Describe the Approach.** Explain what was tried, in what order, and why. This helps future readers understand the methodology and whether alternative approaches were considered.

4. **Present Findings with Evidence.** Document what was learned, supported by concrete evidence: code samples, performance benchmarks, screenshots, or API responses. Distinguish verified findings from hypotheses that need more testing.

5. **Make a Clear Recommendation.** Answer the original question directly: proceed, do not proceed, or proceed with conditions. Avoid hedging; the team needs actionable guidance.

6. **Document Artifacts.** Link to any code, prototypes, diagrams, or documentation created during the spike. These artifacts often have ongoing value beyond the summary.

7. **Capture Open Questions.** Note what the spike didn't answer and what additional investigation might be needed.
## Output Format
Use the template in references/TEMPLATE.md to structure the output.
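The canonical structure is defined in references/TEMPLATE.md. As an illustrative sketch only (these section names and the sample values are assumptions, not the canonical template), a spike summary typically looks like:

```markdown
# Spike: Can we integrate with the vendor's payments API?

## Question
Original question (and the final version, if it evolved during the spike).

## Time-Box
Allocated: 3 days. Actual: 4 days (explain any overrun here).

## Approach
What was tried, in what order, and why.

## Findings
- Verified findings, each with evidence (benchmarks, code links, API responses).
- Hypotheses that still need testing, clearly marked as unverified.

## Recommendation
Proceed / Do not proceed / Proceed with conditions: one direct sentence, then the rationale.

## Artifacts
Links to prototype code, diagrams, and documentation produced during the spike.

## Open Questions
What the spike did not answer and what follow-up investigation is needed.
```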
## Quality Checklist
Before finalizing, verify:
- [ ] Original question is clearly stated
- [ ] Time-box is documented (allocated vs. actual)
- [ ] Findings are supported by evidence, not just opinions
- [ ] Recommendation directly answers the question
- [ ] Artifacts (code, diagrams) are linked or attached
- [ ] Open questions identify remaining unknowns
## Examples
See references/EXAMPLE.md for a completed example.
## Related skills
- **API Documentation Generator**: Automatically generates OpenAPI/Swagger API documentation.
- **Technical Writer**: Writes clear technical documentation following top style guides.
- **Markdown to PDF Converter**: Converts markdown files to clean, formatted PDFs using reportlab. Useful for generating polished documents from markdown notes and drafts.