spec:test-plan

Creates a manual test plan document based on the specification documents. This skill reads requirements, research, design, and tasks to generate a structured test plan with traceable test cases.


Generate Test Plan


When to use

Use this skill when the user needs to:

  • Create a manual test plan after implementation is complete

  • Generate test scenarios from existing specification documents

  • Establish a structured testing phase for a feature

Specification Files Structure

All specification documents are located in .specs/<spec-name>/ directory:

| File | Description |
| --- | --- |
| .specs/<spec-name>/requirements.md | Requirements and acceptance criteria |
| .specs/<spec-name>/research.md | Research findings and chosen solutions |
| .specs/<spec-name>/design.md | Technical design and architecture |
| .specs/<spec-name>/tasks.md | Implementation tasks with checkboxes |

Read all available files to understand the full context before generating the test plan.

Instructions

Step 1: Locate and Read Specification Documents

  • If <args> contains a spec name, look in .specs/<spec-name>/

  • If no spec name provided, list available specs in .specs/ and use the AskUserQuestion tool to let the user choose

  • Read and parse all specification documents:

    • requirements.md - understand what needs to be tested

    • research.md - understand chosen solutions and edge cases

    • design.md - understand architecture and integration points

    • tasks.md - understand what was implemented
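The discovery logic in Step 1 can be sketched in shell. The spec name user-auth and the files created up front are hypothetical, for illustration only:

```shell
#!/bin/sh
# Sketch of Step 1: discover specs and check which documents exist.
# The "user-auth" spec and its files are created here only for illustration.
set -eu
mkdir -p .specs/user-auth
touch .specs/user-auth/requirements.md .specs/user-auth/tasks.md

# No spec name given: list the available specs.
for dir in .specs/*/; do
  echo "available spec: $(basename "$dir")"
done

# Spec chosen: report which specification documents are present.
for doc in requirements.md research.md design.md tasks.md; do
  if [ -f ".specs/user-auth/$doc" ]; then
    echo "found: $doc"
  else
    echo "missing: $doc"
  fi
done
```

Missing files are reported rather than treated as errors, since research.md or design.md may legitimately not exist for a small spec.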

Step 2: Analyze for Test Coverage

Before writing the test plan:

  • Extract all testable requirements from requirements.md (SHALL/WHEN-THEN statements)

  • Identify user flows and feature areas from the design

  • Note edge cases, error states, and boundary conditions from research

  • Review tasks to understand which components were built and how they connect

  • Group related test cases by feature area or user flow
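Extracting testable requirements can be approximated with a pattern scan over requirements.md. The requirements fragment below is invented for illustration:

```shell
#!/bin/sh
# Sketch of pulling SHALL/WHEN-THEN statements out of requirements.md.
# The file contents are invented, not from a real spec.
set -eu
mkdir -p .specs/user-auth
cat > .specs/user-auth/requirements.md <<'EOF'
1.1 The system SHALL lock the account after five failed login attempts.
1.2 WHEN the session expires THEN the user is redirected to the login page.
Background prose that is not directly testable.
EOF

# EARS-style testable statements: SHALL and WHEN/THEN.
grep -nE 'SHALL|WHEN.*THEN' .specs/user-auth/requirements.md
```

This is only a first pass; a human (or the assistant) still has to judge which matched lines are genuinely testable.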

Step 3: Create the Test Plan Document

Create the document at .specs/<spec-name>/test-plan.md with this structure:

Manual Test Plan: [Feature Name]

Overview

[What is being tested, scope, and goals of this test plan]

Prerequisites

  • [Environment setup needed]
  • [Test data or accounts required]
  • [Access or permissions needed]
  • [Dependencies that must be running]

Test Scenarios

  • 1. [Scenario Group Name]

    • 1.1 [Test Case Name]

      • Preconditions: [Required state before testing]
      • Steps:
        1. [Action to perform]
        2. [Next action]
        3. [Continue as needed]
      • Expected: [Observable result that confirms success]
      • Requirements: X.X
    • 1.2 [Test Case Name]

      • Preconditions: [Required state]
      • Steps:
        1. [Action]
      • Expected: [Result]
      • Requirements: X.X
  • 2. [Another Scenario Group]

    • 2.1 [Test Case]
      • Preconditions: [State]
      • Steps:
        1. [Action]
      • Expected: [Result]
      • Requirements: X.X

Summary

  • Total: N tests
  • Passed: 0
  • Failed: 0
  • Skipped: 0
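Writing the skeleton above to disk can be sketched with a heredoc. The spec name and the plan contents are placeholders, not real project data:

```shell
#!/bin/sh
# Sketch of Step 3: create test-plan.md with the structure described above.
# "user-auth" and the plan body are placeholders for illustration.
set -eu
spec=user-auth
mkdir -p ".specs/$spec"
cat > ".specs/$spec/test-plan.md" <<'EOF'
# Manual Test Plan: User Auth

## Overview
Covers login, logout, and session expiry flows.

## Test Scenarios

### 1. Login
- [ ] 1.1 Successful login with valid credentials

## Summary
- Total: 1 tests
- Passed: 0
EOF
echo "created .specs/$spec/test-plan.md"
```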

Test Plan Guidelines

  • Trace every requirement - Each testable requirement from requirements.md must be covered by at least one test case, linked via its Requirements: X.X line

  • Group by feature area - Organize scenario groups by user flow or feature area, not by requirement number

  • Include edge cases - Add test cases for error states, boundary conditions, empty states, and invalid inputs

  • Be specific in steps - Each step should be a concrete action the tester can perform without ambiguity

  • Be specific in expected results - Describe exactly what the tester should observe, not vague outcomes

  • Include preconditions - State any setup needed before running the test case

  • Order logically - Start with happy paths, then edge cases, then error scenarios within each group
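The "trace every requirement" rule can be spot-checked mechanically. This sketch assumes requirement IDs like 1.1 in requirements.md and Requirements: X.X lines in the plan; both files below are invented for illustration:

```shell
#!/bin/sh
# Sketch of a traceability check: flag requirement IDs that never
# appear on a "Requirements:" line in the test plan.
# File contents are invented, not real spec data.
set -eu
mkdir -p .specs/user-auth
printf '1.1 The system SHALL lock the account.\n1.2 WHEN idle THEN log out.\n' \
  > .specs/user-auth/requirements.md
printf 'Requirements: 1.1\n' > .specs/user-auth/test-plan.md

grep -oE '^[0-9]+\.[0-9]+' .specs/user-auth/requirements.md | while read -r id; do
  grep -q "Requirements:.*$id" .specs/user-auth/test-plan.md \
    || echo "uncovered: $id"
done > uncovered.txt
cat uncovered.txt
```

Here requirement 1.2 has no matching test case, so it is reported as uncovered.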

Checkbox States

  • [ ] - Pending (not tested)

  • [-] - In progress (currently being tested)

  • [x] - Passed

  • [!] - Failed

Step 4: Confirm with User

After creating the document, show the user:

  • The location of the created file

  • A summary of the test plan structure

  • Total number of test scenarios and test cases

  • Coverage: which requirements are covered

  • Use the AskUserQuestion tool to ask if they want to make changes or start testing, with options like "Looks good, start testing", "I want to make changes", "Review test plan first"

Arguments

  • <args> - The spec name (e.g., "user-auth", "payment-flow"). If not provided, list available specs and ask the user to choose.
