# Generate E2E Tests
Generate E2E test files from E2E_TESTS.md specifications. This creates new test files for all specified suites, overwriting any existing generated tests.
Additional instructions from the user: "$ARGUMENTS". Ignore if empty.
## Warning

Before proceeding, warn the user:

> Warning: This skill will delete all existing E2E test files and regenerate them from scratch based on the `E2E_TESTS.md` specifications. Any manual edits or customizations in existing test files will be lost. If you only need to bring existing tests in sync with updated specs, without discarding them, use `/fp-update` instead. That skill analyzes the diff between specs and tests and applies incremental changes.

Ask the user to confirm they want to proceed with full regeneration. Do not continue unless they explicitly confirm.
This command has four phases. Complete all four in order.
## Phase 1: Discover

- Find all `E2E_TESTS.md` files by searching recursively from the project root.
- If E2E tests already exist, run them first to establish a baseline of current pass/fail status. Note any pre-existing failures.
- Read the root-level spec first to understand project-wide testing constraints.
- Read each package-level `E2E_TESTS.md` and extract all suites, features, preconditions, postconditions, and metadata.
- Identify the project's:
  - Programming language and test framework (from package files, existing tests, config)
  - Existing test utilities (look for `e2e-utils` files)
  - Test runner configuration (look for E2E-specific config files)
  - Existing patterns and conventions
## Phase 2: Plan
For each suite in each spec:
- Determine the target test file path following the project's naming conventions.
- Map each feature to a test case, grouping by category.
- Map `Preconditions` sections (at whatever heading level they appear) to per-test or per-suite setup hooks (the framework's equivalent of running code before each test or before all tests in a suite).
- Map `Postconditions` sections (at whatever heading level they appear) to assertions and per-test teardown hooks (the framework's equivalent of running cleanup code after each test).
- Identify shared utilities needed (cleanup, mock helpers, git init).
- Plan the mock strategy for each external dependency.
Present the plan to the user:
- List of files to be created
- Number of test cases per file
- Any shared utilities that need to be created or updated
Ask the user if they want to proceed or adjust the plan.
## Phase 3: Generate

### Delete existing autogenerated E2E tests
Before generating anything, find and delete all existing autogenerated E2E test files. Autogenerated files are identified by the presence of an autogenerated header comment. Remove these files so that tests are created from a clean slate.
### Create new test files
For each planned test file:
- Add the autogenerated header with references to the source spec files.
- Import required dependencies and shared utilities.
- Create the outer suite/group block for the suite (e.g., `describe()` in vitest/jest, `suite` in other frameworks).
- Implement the per-test setup hook (e.g., `beforeEach` in vitest/jest) from preconditions:
  - Create temp directory
  - Save and set environment variables
  - Initialize git repository if needed
  - Set up mock tools
- Implement the per-test teardown hook (e.g., `afterEach` in vitest/jest) for cleanup:
  - Restore CWD and environment variables
  - Terminate background processes
  - Safe cleanup of temp directory
- For each feature, create a test case:
  - Add the category as a comment
  - Implement the Setup-Execute-Verify pattern
  - Handle `skip` metadata with `skipIf` and a skip documentation block
- Add the autogenerated footer.
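The file structure described above can be sketched as a rendering helper that emits the skeleton as a string. This assumes vitest, and uses `it.skip` with a documented reason as a simple stand-in for the skip handling; `renderTestFile` and the `Feature` shape are hypothetical:

```typescript
interface Feature {
  name: string;
  category: string;
  skip?: string; // reason from the spec's skip metadata, if any
}

export function renderTestFile(suite: string, specPath: string, features: Feature[]): string {
  const cases = features
    .map((f) => {
      const lines = [`  // Category: ${f.category}`];
      if (f.skip) {
        lines.push(`  // Skipped: ${f.skip}`);
        lines.push(`  it.skip(${JSON.stringify(f.name)}, () => {});`);
      } else {
        lines.push(`  it(${JSON.stringify(f.name)}, async () => {`);
        lines.push(`    // Setup -> Execute -> Verify`);
        lines.push(`  });`);
      }
      return lines.join("\n");
    })
    .join("\n\n");

  return [
    `// AUTOGENERATED from ${specPath} - do not edit by hand`,
    `import { afterEach, beforeEach, describe, it } from "vitest";`,
    ``,
    `describe(${JSON.stringify(suite)}, () => {`,
    `  beforeEach(() => { /* temp dir, env vars, git init, mock tools */ });`,
    `  afterEach(() => { /* restore cwd/env, kill processes, safe cleanup */ });`,
    ``,
    cases,
    `});`,
    `// END AUTOGENERATED`,
  ].join("\n");
}
```

In a real run the hook bodies come from the mapped preconditions and postconditions rather than placeholder comments.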
### Generation Rules

- Follow existing patterns: If the project already has E2E tests, match their style exactly (imports, assertion style, helper usage).
- Create shared utilities: If `e2e-utils` doesn't exist, create it with `safeCleanup`, mock helpers, and common constants.
- Use real implementations: Real file systems, real git repos. Mock only external CLI tools and services.
- Sequential execution: Configure the test runner for sequential execution (single fork) if not already configured.
- Framework adaptation: Use the project's actual test framework, not pseudocode. Adapt all patterns to the real language and tooling.
## Phase 4: Verify
- Run the E2E tests for each package that was generated.
- If tests fail:
  - Read the failure output
  - Fix the test implementation (not the spec)
  - Re-run until passing
- Run the project's formatter/linter on generated files.
- Present a summary:
  - Files created
  - Test cases generated per file
  - Category breakdown
  - Any issues encountered and how they were resolved