Navigate Ways of Working
AI Use Case Catalogue
Catalogue of 40 AI-assisted activities across the operating model. Each use case has a title, purpose, owner, context, required artefacts, produced artefacts, and a placeholder for the prompt.
This document catalogues every AI-assisted activity defined in the Product & Engineering Operating Model. Each use case is intended to be developed into a repeatable, prompt-driven workflow.
Refinement
See the Refinement section of the Engineering Process for the full process. Roles referenced below (PM, Lead Dev, QA) are defined in the Team Structure.
AI-001: Problem Statement Drafting
Purpose: Generate a first-draft problem statement and success criteria from raw inputs (customer feedback, stakeholder requests, strategic objectives) to give the PM a starting point for refinement rather than a blank page.
Owner: PM
Context: Preparation before the weekly refinement session. PM refines the AI output before presenting to the squad.
Artefacts Required:
- Customer feedback, support tickets, or stakeholder request (as input)
- Product roadmap / strategic context
- Existing backlog items (to avoid duplication)
Artefacts Produced:
- Draft problem statement
- Draft success criteria
- Draft constraints
Prompt: To be developed
AI-002: Acceptance Criteria Generation
Purpose: Generate draft acceptance criteria from a problem statement and known constraints, covering happy path, edge cases, and error scenarios.
Owner: PM (with QA review)
Context: Preparation before refinement. QA reviews and challenges the AI output during the refinement session.
Artefacts Required:
- Problem statement
- Known constraints
- Relevant existing feature behaviour / API contracts
Artefacts Produced:
- Draft acceptance criteria (happy path, edge cases, error handling)
Prompt: To be developed
AI-003: Preliminary Technical Assessment
Purpose: Generate an initial assessment of technical approach, likely services/components affected, rough complexity, and potential risks from a problem statement.
Owner: Lead Dev
Context: Preparation before refinement. Gives the Lead Dev a faster starting point for the feasibility conversation in the session.
Artefacts Required:
- Problem statement and acceptance criteria
- Codebase documentation / architecture overview
- Relevant ADRs
Artefacts Produced:
- Draft technical assessment: likely approach, components affected, rough complexity, identified risks
Prompt: To be developed
AI-004: Test Scenario Generation
Purpose: Generate draft test scenarios (functional, edge case, negative path) from a problem statement and acceptance criteria as a starting point for QA to react to.
Owner: QA
Context: Preparation before refinement. QA reviews, curates, and adds scenarios the AI missed during the session.
Artefacts Required:
- Problem statement
- Draft acceptance criteria
- Existing test coverage for related features
Artefacts Produced:
- Draft test scenarios (categorised by type: functional, edge case, negative, integration)
Prompt: To be developed
Sprint Planning
See the Sprint Planning section of the Engineering Process for the full process.
AI-005: Capacity Modelling
Purpose: Suggest a realistic number of items for the sprint based on the squad’s recent throughput history and known capacity constraints (holidays, on-call, overhead).
Owner: Lead Dev
Context: Preparation before sprint planning. Gives the Lead Dev a data-informed baseline for the commitment conversation.
Artefacts Required:
- Throughput data from last 4–6 sprints
- Known capacity constraints for upcoming sprint (holidays, on-call assignments)
- Candidate item sizes (small/medium/large from refinement)
Artefacts Produced:
- Recommended item count with confidence range
- Comparison to recent sprint actuals
Prompt: To be developed
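The baseline calculation behind AI-005 could be sketched as follows. This is a minimal illustration, not the defined workflow: the function name, the throughput counts, and the `capacity_fraction` input are all hypothetical.

```python
from statistics import mean, stdev

def recommend_item_count(throughput_history, capacity_fraction=1.0):
    """Suggest a sprint item count from recent throughput.

    throughput_history: completed-item counts from the last 4-6 sprints.
    capacity_fraction: share of normal capacity available next sprint
    (e.g. 0.8 if holidays and on-call remove 20%).
    Returns (low, recommended, high) as a rough confidence range,
    using one standard deviation of recent throughput as the spread.
    """
    avg = mean(throughput_history)
    spread = stdev(throughput_history) if len(throughput_history) > 1 else 0.0
    recommended = avg * capacity_fraction
    return (
        round(recommended - spread),
        round(recommended),
        round(recommended + spread),
    )
```

The Lead Dev would still sanity-check the range against item sizes from refinement; the sketch only turns throughput history into a starting number.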
AI-006: Cross-Squad Dependency Detection
Purpose: Analyse committed items across squads and flag potential code-level or service-level conflicts.
Owner: Staff Engineer
Context: Run at sprint planning or shortly after, to catch conflicts before work begins. Staff Engineer reviews and raises with relevant Lead Devs.
Artefacts Required:
- Committed sprint items across all squads (with technical approach notes where available)
- Codebase structure / service map
- Recent commits and active branches
Artefacts Produced:
- List of potential conflicts (service, file, or API-level)
- Severity assessment for each conflict
Prompt: To be developed
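The core of AI-006 is an overlap check across squads' planned touchpoints. A minimal sketch, assuming each squad's committed items can be reduced to a set of file or service paths (the structure and names below are illustrative, not part of the operating model):

```python
from itertools import combinations

def detect_conflicts(squad_touchpoints):
    """Flag paths that two or more squads plan to change this sprint.

    squad_touchpoints: {squad_name: set of file/service paths the
    squad's committed items are expected to touch}, derived from
    technical approach notes or active branches.
    Returns (path, [squads]) pairs, ordered so that paths touched by
    more squads come first -- a crude severity proxy.
    """
    conflicts = {}
    for a, b in combinations(sorted(squad_touchpoints), 2):
        for path in squad_touchpoints[a] & squad_touchpoints[b]:
            conflicts.setdefault(path, set()).update({a, b})
    return sorted(
        ((path, sorted(squads)) for path, squads in conflicts.items()),
        key=lambda item: -len(item[1]),
    )
```

In practice the Staff Engineer reviews the list and decides which overlaps are genuine conflicts versus benign shared touchpoints.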
AI-007: Sprint Goal Drafting
Purpose: Articulate a clear, concise sprint goal from the set of candidate items and strategic context.
Owner: PM
Context: Preparation before sprint planning. PM refines before presenting to the squad.
Artefacts Required:
- Candidate sprint items (prioritised)
- Current product roadmap / quarterly objectives
- Previous sprint goal and outcome
Artefacts Produced:
- Draft sprint goal (one sentence)
Prompt: To be developed
Technical Design
See the Technical Design section of the Engineering Process for the full process.
AI-008: Implementation Plan Pressure Testing
Purpose: Review a developer’s proposed implementation approach and identify failure modes, edge cases, scalability concerns, and alternative approaches they may not have considered.
Owner: Dev (prior to Lead Dev conversation)
Context: After the dev has drafted their approach and before the technical design review with Lead Dev. AI raises the floor so the Lead Dev conversation focuses on judgment calls, codebase history, and operational context.
Artefacts Required:
- Dev’s proposed approach (bullet points, short doc, or Slack message)
- Problem statement and acceptance criteria
- Relevant codebase context / architecture overview
- Relevant ADRs
Artefacts Produced:
- List of potential failure modes
- Edge cases not addressed
- Alternative approaches with trade-offs
- Questions to resolve with Lead Dev
Prompt: To be developed
Development
See the Development section of the Engineering Process for the full process.
AI-009: Code Generation
Purpose: Accelerate development by generating code from specifications, following established codebase patterns and conventions.
Owner: Dev
Context: During active development. Dev reviews, tests, and adapts AI output. AI-generated code goes through the same CI gates as human-written code.
Artefacts Required:
- Agreed implementation approach (from technical design review)
- Relevant existing code (patterns, conventions, shared utilities)
- Acceptance criteria
Artefacts Produced:
- Draft code implementing the specified behaviour
Prompt: To be developed
AI-010: Test Writing
Purpose: Generate automated tests (unit, integration, edge case) from acceptance criteria and implementation code.
Owner: Dev and QA
Context: During development. Dev generates tests for their own code. QA generates additional edge case and negative path tests in parallel. All tests reviewed by a human before committing.
Artefacts Required:
- Implementation code
- Acceptance criteria
- Test scenarios from refinement (AI-004)
- Existing test patterns and frameworks in use
Artefacts Produced:
- Draft test code (unit, integration, edge case, negative path)
Prompt: To be developed
AI-011: CI Code Review
Purpose: Automated code review in the CI pipeline, checking for style issues, common bugs, security anti-patterns, and deviation from codebase conventions.
Owner: Platform squad (builds and maintains); runs automatically
Context: Runs on every commit as part of CI. Supplements linting, SAST, and test suites. Findings are surfaced to the committing dev.
Artefacts Required:
- Committed code diff
- Codebase conventions documentation / AI playbook
- Security patterns documentation
Artefacts Produced:
- Review comments: style issues, potential bugs, security concerns, convention deviations
Prompt: To be developed
AI-012: Documentation Generation
Purpose: Generate or update technical documentation (code comments, README sections, inline documentation) from code changes.
Owner: Dev
Context: During or immediately after development. Dev reviews and refines output.
Artefacts Required:
- New or changed code
- Existing documentation for the affected area
- Documentation standards from “how we work”
Artefacts Produced:
- Draft documentation updates
Prompt: To be developed
Incident Management
See the Incident Management section of the Engineering Process for the full process.
AI-013: Incident Diagnostic Assistance
Purpose: Accelerate incident triage by analysing logs, identifying related recent deployments, and suggesting similar past incidents.
Owner: On-call engineer
Context: During an active incident. AI is a diagnostic accelerator, not a decision-maker. On-call engineer uses AI output to narrow the search space.
Artefacts Required:
- Error logs and monitoring data
- Recent deployment history
- Previous incident review records
- Service dependency map
Artefacts Produced:
- Likely contributing factors
- Related recent deployments
- Similar past incidents with resolution details
Prompt: To be developed
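One mechanical part of AI-013, correlating the incident with recent deployments, could look like the sketch below. The data shape is assumed for illustration; the real workflow would pull from the deployment history artefact.

```python
from datetime import datetime, timedelta

def related_deployments(incident_start, deployments, window_hours=24):
    """Return deployments in the lookback window before the incident,
    most recent first -- the usual first suspects during triage.

    deployments: list of (timestamp, service) tuples.
    """
    window_start = incident_start - timedelta(hours=window_hours)
    candidates = [
        (ts, svc) for ts, svc in deployments
        if window_start <= ts <= incident_start
    ]
    return sorted(candidates, key=lambda d: d[0], reverse=True)
```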
AI-014: Incident Timeline Drafting
Purpose: Generate a first-draft incident timeline from alerting data, deployment logs, and Slack/comms threads to save Lead Dev preparation time for the incident review.
Owner: Lead Dev (for incident review preparation)
Context: After an incident is resolved, before the incident review. Lead Dev reviews and corrects the timeline before the review session.
Artefacts Required:
- Alerting/monitoring data with timestamps
- Deployment logs around the incident window
- Slack/comms thread from incident response
- Resolution details
Artefacts Produced:
- Draft incident timeline
- Draft impact summary
- Draft list of contributing factors
Prompt: To be developed
AI-015: Incident Pattern Detection
Purpose: Identify recurring incident themes across squads and over time, surfacing systemic issues that aren’t obvious sprint-to-sprint.
Owner: Head of Eng
Context: Periodic review (monthly or quarterly). Informs where reliability investment is needed.
Artefacts Required:
- All incident review records
- Service ownership map
- Deployment history
Artefacts Produced:
- Pattern analysis: recurring themes, correlated services, frequency trends
- Recommended areas for reliability investment
Prompt: To be developed
Releases
See the Releases section of the Engineering Process for the full process.
AI-016: Rollout Monitoring & Auto-Revert
Purpose: Monitor production metrics in real time during a percentage rollout and automatically revert the feature flag if predefined thresholds are breached.
Owner: Platform squad (builds and maintains); runs automatically
Context: Active during any percentage rollout. Reacts faster than a human on-call engineer can.
Artefacts Required:
- Predefined rollback criteria (error rate, latency thresholds)
- Production monitoring data (real-time)
- Feature flag configuration
- Control group baseline metrics
Artefacts Produced:
- Continuous monitoring assessment
- Auto-revert action if thresholds breached
- Alert to on-call engineer and PM
Prompt: To be developed (likely a configuration rather than a conversational prompt)
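The revert decision in AI-016 reduces to comparing rollout-cohort metrics against the control group. A hedged sketch of that check, with illustrative metric names and threshold semantics (ratios of rollout to baseline) that the actual configuration would define:

```python
def should_revert(rollout_metrics, baseline_metrics, thresholds):
    """Decide whether to flip the feature flag off.

    rollout_metrics / baseline_metrics: e.g. {"error_rate": 0.02}
    for the rollout cohort and the control group respectively.
    thresholds: maximum allowed rollout-to-baseline ratio per metric,
    e.g. {"error_rate": 1.5} means revert if errors are 50% worse.
    Returns (revert?, list of breached metric names).
    """
    breached = [
        metric
        for metric, max_ratio in thresholds.items()
        if baseline_metrics.get(metric, 0) > 0
        and rollout_metrics.get(metric, 0) / baseline_metrics[metric] > max_ratio
    ]
    return (len(breached) > 0, breached)
```

Ratio-based thresholds are one design choice among several; absolute thresholds or statistical tests against the baseline would slot into the same shape.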
AI-017: Release Note Generation
Purpose: Generate release notes from committed code, ticket descriptions, and sprint goals to save PM time on release communication.
Owner: PM
Context: At release time. PM reviews and edits before distribution.
Artefacts Required:
- Committed items in the release (ticket titles, descriptions)
- Code commit messages
- Sprint goal
Artefacts Produced:
- Draft release notes (user-facing summary)
Prompt: To be developed
AI-018: Stale Feature Flag Detection
Purpose: Identify feature flags older than defined thresholds and associated dead code paths that should be cleaned up.
Owner: Platform squad
Context: Monthly flag hygiene report. Flags older than 60 days without a documented reason are escalated to the Lead Dev.
Artefacts Required:
- Feature flag inventory with creation dates
- Flag usage in codebase
- Flag status (active/inactive)
Artefacts Produced:
- Report: flags older than 30/60 days, associated code paths, recommended cleanup actions
Prompt: To be developed
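The age-bucketing rule in AI-018 is simple enough to sketch directly. The function and record shape below are hypothetical; the 30/60-day thresholds come from the entry above.

```python
from datetime import date

def stale_flags(flags, today, warn_days=30, escalate_days=60):
    """Bucket feature flags by age for the monthly hygiene report.

    flags: list of (name, created_on: date, has_documented_reason).
    Flags past escalate_days with no documented reason go in
    'escalate' (the 60-day rule); other flags past warn_days go
    in 'warn'.
    """
    report = {"warn": [], "escalate": []}
    for name, created_on, documented in flags:
        age = (today - created_on).days
        if age > escalate_days and not documented:
            report["escalate"].append(name)
        elif age > warn_days:
            report["warn"].append(name)
    return report
```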
AI-019: Database Migration Analysis
Purpose: Analyse a migration script against the current schema and flag potential issues (table locking, data loss, backward compatibility violations) before human review.
Owner: Dev (before submitting migration for review)
Context: Before the Lead Dev / Staff Engineer reviews the migration. AI catches mechanical issues so the human review focuses on design judgment.
Artefacts Required:
- Migration script
- Current database schema
- Table row counts / size estimates
- Application code that references affected tables
Artefacts Produced:
- Risk assessment: locking potential, estimated execution time, backward compatibility analysis
- Recommended approach if issues found (e.g., online migration tooling)
Prompt: To be developed
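The mechanical layer of AI-019 could start with pattern-based heuristics like the sketch below. These are crude, illustrative checks only; a real assessment would also consult the live schema and table sizes, as the artefacts list implies.

```python
import re

def assess_migration(sql):
    """First-pass risk flags for a migration script.

    Pattern-based heuristics only. Flags statements commonly
    associated with locking or data loss; does not inspect the
    actual schema.
    """
    checks = [
        (r"\bdrop\s+(table|column)\b", "possible data loss"),
        (r"\balter\s+table\b", "may lock the table while it runs"),
        (r"\bcreate\s+index\b(?!.*\bconcurrently\b)",
         "blocking index build (consider CONCURRENTLY on Postgres)"),
    ]
    findings = []
    for stmt in sql.lower().split(";"):
        for pattern, risk in checks:
            if re.search(pattern, stmt):
                findings.append(risk)
    return findings
```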
QA
See the QA section of the Engineering Process for the full process.
AI-020: Production Anomaly Detection
Purpose: Monitor production quality metrics and detect anomalies that may indicate regressions or emerging issues, including comparison of pre/post-release behaviour.
Owner: QA (monitors); Platform squad (builds and maintains)
Context: Continuous, post-release. QA investigates flagged anomalies.
Artefacts Required:
- Production monitoring data (error rates, latency, user-facing failures)
- Baseline metrics from pre-release period
- Recent release/deployment history
Artefacts Produced:
- Anomaly alerts with context (what changed, when, likely related deployment)
- Pre/post-release comparison report
Prompt: To be developed (likely a configuration rather than a conversational prompt)
Cross-Squad Dependencies
See the Cross-Squad Dependencies section of the Engineering Process for the full process.
AI-021: Dependency Trend Analysis
Purpose: Track cross-squad dependency requests over time and surface structural problems (e.g., one squad disproportionately dependent on platform squad).
Owner: Head of Eng
Context: Quarterly review. Informs squad structure and platform squad resourcing decisions.
Artefacts Required:
- Platform squad intake log (requests, requesting squad, timestamps, resolution times)
- Squad-to-squad request history
Artefacts Produced:
- Trend analysis: volume, wait times, requesting squad breakdown
- Structural recommendations
Prompt: To be developed
AI-022: API Documentation & Migration Guide Generation
Purpose: Auto-generate or update API documentation when contracts change, and produce migration guides for consuming squads.
Owner: Platform squad
Context: When a shared API contract changes. Reduces coordination cost between platform squad and product squads.
Artefacts Required:
- API code / OpenAPI spec (current and changed)
- List of consuming squads/services
- Breaking vs non-breaking change classification
Artefacts Produced:
- Updated API documentation
- Migration guide for consuming squads (if breaking change)
Prompt: To be developed
Metrics
See the Metrics section of the Engineering Process for the full process.
AI-023: Weekly Squad Summary Generation
Purpose: Generate a narrative summary of squad delivery metrics from automated data sources, saving Head of Eng preparation time.
Owner: Head of Eng
Context: Weekly. Provides a shared baseline for leadership discussions and squad health monitoring.
Artefacts Required:
- Project tracker data (throughput, cycle time, planned vs unplanned)
- CI/CD data (deployment frequency, change failure rate)
- Monitoring data (MTTR)
- Previous week’s summary (for trend comparison)
Artefacts Produced:
- Per-squad narrative summary with key metrics, trends, and flagged anomalies
Prompt: To be developed
AI-024: Metrics Anomaly Detection
Purpose: Automatically flag when delivery or quality metrics deviate significantly from trend, prompting proactive investigation.
Owner: Head of Eng (reviews); Platform squad (builds and maintains)
Context: Continuous. Alerts Head of Eng when investigation is warranted.
Artefacts Required:
- Historical metrics data (4–6 sprints minimum)
- Current sprint metrics
- Known contextual factors (holidays, incidents, large migrations)
Artefacts Produced:
- Anomaly alerts with context and possible explanations
Prompt: To be developed (likely a configuration rather than a conversational prompt)
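A minimal version of the deviation check in AI-024 is a z-score against recent sprints. The sketch below assumes a simple numeric history; the real configuration would also weigh the known contextual factors listed above.

```python
from statistics import mean, stdev

def flag_anomaly(history, current, z_threshold=2.0):
    """Flag a current metric value that deviates from recent trend.

    history: metric values from the last 4-6 sprints.
    Returns (is_anomaly, z_score).
    """
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return (current != mu, 0.0)
    z = (current - mu) / sigma
    return (abs(z) > z_threshold, round(z, 2))
```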
AI-025: Metrics Correlation Analysis
Purpose: Identify correlations across metrics over time that humans don’t easily spot (e.g., “squads consistently using engineering allocation have fewer SEV1 incidents”).
Owner: Head of Eng
Context: Quarterly review. Informs strategic decisions about process and investment.
Artefacts Required:
- Historical metrics data across all squads (6+ months)
- Incident records
- Engineering allocation usage history
Artefacts Produced:
- Correlation report with supporting data
- Suggested areas for investigation or investment
Prompt: To be developed
Retrospectives
See the Retrospectives section of the Engineering Process for the full process.
AI-026: Retro Theme Pattern Detection
Purpose: Identify recurring themes across retrospectives over time, flagging issues that persist across multiple cycles.
Owner: Lead Dev (per squad); Head of Eng (cross-squad)
Context: At the start of each retro, to inform the discussion. Also used at quarterly cross-squad retros.
Artefacts Required:
- Retro action items and themes from previous 4+ retros
- Action completion status
Artefacts Produced:
- Pattern report: recurring themes, unresolved actions, frequency trends
Prompt: To be developed
Onboarding
See the Onboarding section of the Engineering Process for the full process.
AI-027: Codebase Exploration Assistance
Purpose: Enable new joiners to explore and understand the codebase through natural language queries, supplementing the Lead Dev walkthrough.
Owner: New joiner (self-service)
Context: First weeks of onboarding. New joiner asks questions about services, architecture, data flow, and conventions. AI provides answers grounded in the actual codebase.
Artefacts Required:
- Codebase (RAG-indexed or available to AI tool)
- Architecture overview
- ADRs
- AI playbook
Artefacts Produced:
- Answers to codebase questions with references to relevant code and documentation
Prompt: To be developed (RAG/tooling configuration rather than a single prompt)
AI-028: Personalised Codebase Orientation
Purpose: Generate a tailored orientation document for a new joiner based on the product area and services they’ll be working in.
Owner: Lead Dev (initiates); new joiner (consumes)
Context: First week of onboarding. Lead Dev specifies which areas the new joiner will focus on. AI generates a targeted guide.
Artefacts Required:
- Codebase structure
- Architecture overview
- ADRs relevant to the focus area
- Relevant runbooks
Artefacts Produced:
- Personalised orientation document: key services, data flows, important patterns, relevant ADRs, who owns what
Prompt: To be developed
Documentation
See the Documentation section of the Engineering Process for the full process.
AI-029: Runbook Drafting
Purpose: Generate first-draft runbooks for services from codebase, infrastructure configuration, and monitoring setup.
Owner: Dev or QA (for new services); Lead Dev (reviews)
Context: When a new service goes to production or when an incident reveals a runbook gap. Human reviews and refines.
Artefacts Required:
- Service code and configuration
- Infrastructure/deployment configuration
- Monitoring/alerting setup
- Known failure modes (from incident history if available)
Artefacts Produced:
- Draft runbook: service overview, common failure scenarios, diagnostic steps, remediation actions, escalation path
Prompt: To be developed
AI-030: API Documentation Sync
Purpose: Detect when API code has changed and documentation is out of sync, and generate updated documentation or flag the drift.
Owner: Platform squad (builds and maintains); runs automatically
Context: Continuous, triggered by code changes to API endpoints. Prevents documentation from going stale.
Artefacts Required:
- API code / OpenAPI spec
- Existing API documentation
- Change diff
Artefacts Produced:
- Updated API documentation, or
- Drift report flagging undocumented changes
Prompt: To be developed (likely CI integration rather than a conversational prompt)
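The drift-detection half of AI-030 can be reduced to a set comparison between the spec and the documentation. A sketch, assuming both sides can be flattened to endpoint paths:

```python
def detect_drift(spec_paths, documented_paths):
    """Compare endpoints in the OpenAPI spec to documented endpoints.

    Returns a drift report: endpoints present in the spec but not
    documented, and documented endpoints that no longer exist in
    the spec.
    """
    spec, docs = set(spec_paths), set(documented_paths)
    return {
        "undocumented": sorted(spec - docs),
        "stale_docs": sorted(docs - spec),
    }
```

Generating the updated documentation itself is the harder, AI-driven half; this check only decides when that generation should trigger.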
AI-031: Undocumented Service Detection
Purpose: Compare the codebase against existing documentation to identify services, endpoints, or components that lack documentation.
Owner: Staff Engineer
Context: Periodic (quarterly or as part of documentation health check). Feeds into documentation backlog.
Artefacts Required:
- Codebase service/component inventory (derived from code)
- Existing documentation inventory
- Architecture overview
Artefacts Produced:
- Gap report: undocumented services, endpoints, and components
Prompt: To be developed
Security
See the Security section of the Engineering Process and the DevOps Principles for related guidance.
AI-032: Security-Focused Code Review
Purpose: Targeted code review for security anti-patterns, running as a layer on top of traditional SAST tools and catching context-dependent issues that rule-based scanners miss.
Owner: Platform squad (builds and maintains); runs automatically in CI
Context: Runs on every commit as part of the CI pipeline. Supplements SAST; it does not replace it.
Artefacts Required:
- Code diff
- Security patterns documentation
- Known vulnerability patterns for the tech stack
Artefacts Produced:
- Security review findings with severity and remediation guidance
Prompt: To be developed
AI-033: Dependency Advisory Monitoring
Purpose: Monitor dependency vulnerability advisories and cross-reference against the codebase, flagging exposure before automated scanners catch up.
Owner: Platform squad
Context: Continuous. Provides earlier warning than standard dependency scanning tools, which update on their own schedules.
Artefacts Required:
- Dependency manifest (package.json, requirements.txt, etc.)
- Vulnerability advisory feeds
- Codebase usage of affected dependencies
Artefacts Produced:
- Early exposure alerts with affected code paths and severity assessment
Prompt: To be developed (likely a tooling/automation configuration)
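The cross-referencing step in AI-033 could be sketched as below. The advisory record shape and severity scale are assumptions for illustration; real advisory feeds (e.g. OSV or GitHub advisories) use richer version-range semantics than an exact-match set.

```python
def exposure_report(manifest, advisories):
    """Cross-reference installed dependencies against advisories.

    manifest: {package: installed_version} from package.json,
    requirements.txt, etc.
    advisories: list of {"package", "affected_versions": set,
    "severity"} records.
    Returns alerts for packages whose installed version is affected,
    highest severity first (assuming a critical>high>medium>low scale).
    """
    rank = {"critical": 0, "high": 1, "medium": 2, "low": 3}
    alerts = [
        {"package": adv["package"], "severity": adv["severity"]}
        for adv in advisories
        if manifest.get(adv["package"]) in adv["affected_versions"]
    ]
    return sorted(alerts, key=lambda a: rank[a["severity"]])
```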
AI-034: Security Incident Investigation
Purpose: Analyse access logs and system behaviour during a security incident to identify scope of exposure faster than manual log review.
Owner: On-call engineer / Lead Dev
Context: During or immediately after a security incident. AI accelerates investigation; humans make decisions.
Artefacts Required:
- Access logs (authentication, API, database)
- System/application logs
- Timeline of known incident events
- Normal baseline behaviour patterns
Artefacts Produced:
- Scope assessment: what was accessed, by whom, when
- Anomalous access patterns identified
- Recommended containment actions
Prompt: To be developed
AI-035: Security Documentation Generation
Purpose: Generate data flow diagrams and documentation showing where PII and sensitive data lives and moves through the system.
Owner: Staff Engineer
Context: Periodic (annually or when architecture changes significantly). Aids compliance and internal understanding.
Artefacts Required:
- Codebase (data models, API contracts, service interactions)
- Infrastructure configuration
- Existing architecture overview
Artefacts Produced:
- Data flow diagrams: PII locations, movement between services, storage mechanisms, access controls
- Sensitive data inventory
Prompt: To be developed
Work Intake
AI-036: Request Triage & Categorisation
Purpose: Categorise incoming work requests, link them to existing backlog items, and draft initial assessments (rough sizing, likely squad ownership, roadmap overlap).
Owner: PM
Context: When new requests arrive from stakeholders, customers, or support. Reduces PM triage overhead.
Artefacts Required:
- Incoming request
- Current backlog
- Product roadmap
- Squad ownership map
Artefacts Produced:
- Categorisation and likely squad assignment
- Links to similar/duplicate existing items
- Draft assessment: rough size, roadmap alignment
Prompt: To be developed
AI-037: Customer Feedback Synthesis
Purpose: Process support tickets and feature requests to surface patterns and trends, so PMs work from data rather than anecdote.
Owner: PM (PM 4 / analytics PM where applicable)
Context: Periodic (weekly or sprint-aligned). Feeds into roadmap decisions and refinement priorities.
Artefacts Required:
- Support tickets (recent period)
- Feature requests (recent period)
- Existing backlog and roadmap
Artefacts Produced:
- Theme analysis: top issues by frequency and severity
- Trend comparison to previous period
- Links to related backlog items
Prompt: To be developed
Unplanned Work
AI-038: Unplanned Work Buffer Monitoring
Purpose: Flag when a sprint’s unplanned work is approaching the capacity buffer threshold, prompting a proactive conversation.
Owner: Lead Dev / PM
Context: During a sprint. Triggers a discussion about sprint scope adjustment before the buffer is exhausted.
Artefacts Required:
- Sprint committed items and status
- Unplanned items added to sprint (tagged)
- Buffer threshold (10–15% of capacity)
Artefacts Produced:
- Alert when buffer is approaching/exceeded
- Current unplanned work summary and capacity impact
Prompt: To be developed (likely a dashboard/automation configuration)
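The threshold check in AI-038 is straightforward to sketch. The function below is illustrative: it treats committed plus unplanned items as total capacity and adds an "approaching" state at 80% of the buffer, a design choice not specified in the entry above.

```python
def buffer_status(committed, unplanned, buffer_fraction=0.15):
    """Check unplanned work against the sprint capacity buffer.

    committed / unplanned: item counts (or points) for the sprint.
    buffer_fraction: share of capacity reserved for unplanned work
    (10-15% per the process).
    Returns (status, used_fraction) where status is 'ok',
    'approaching' (over 80% of the buffer) or 'exceeded'.
    """
    capacity = committed + unplanned
    used = unplanned / capacity if capacity else 0.0
    if used > buffer_fraction:
        status = "exceeded"
    elif used > 0.8 * buffer_fraction:
        status = "approaching"
    else:
        status = "ok"
    return (status, round(used, 3))
```

The 'approaching' state is what triggers the proactive conversation; 'exceeded' means the scope discussion is already overdue.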
AI-039: Unplanned Work Pattern Analysis
Purpose: Correlate unplanned work spikes with contributing factors (specific services, release timing, external events) to identify where systemic investment is needed.
Owner: Head of Eng
Context: Quarterly review. Informs reliability investment and process improvement.
Artefacts Required:
- Unplanned work history (tagged items across sprints)
- Deployment/release history
- Incident records
- Service ownership map
Artefacts Produced:
- Correlation analysis: unplanned work by source, timing, and service
- Recommended investment areas
Prompt: To be developed
AI-040: Discovered Work Scope Assessment
Purpose: When a dev flags “this is worse than expected,” AI performs preliminary codebase analysis to quantify the additional scope so the PM can make an informed trade-off decision quickly.
Owner: Dev (initiates); PM (consumes for decision-making)
Context: Mid-sprint, when discovered work threatens sprint commitments. Speed matters — PM needs information fast.
Artefacts Required:
- Dev’s description of what was discovered
- Affected code area
- Original item estimate and approach
- Sprint remaining capacity
Artefacts Produced:
- Scope assessment: estimated additional effort, affected components, options (fix now vs. defer vs. workaround)
- Trade-off summary for PM
Prompt: To be developed