
Change Proposals

Review, accept, and apply AI-discovered requirement changes from document extraction, journey analysis, and stakeholder discovery


A Change Proposal is a persistent, reviewable suggestion to create, update, or connect an entity — surfaced by AI analysis of documents, journeys, meetings, or code, or created manually by a team member. Change Proposals are Catalio’s mechanism for keeping humans in control of what enters the requirements catalog while enabling AI to do the discovery heavy lifting.

The Problem Change Proposals Solve

AI discovery is powerful but imperfect. When Catalio analyzes an uploaded document, processes a meeting transcript, or observes a Journey, it generates many potential requirements signals — some accurate, some redundant, some incorrect.

Rather than automatically creating requirements from AI analysis (which would pollute the catalog with noise), Catalio creates Change Proposals. These proposals surface AI discoveries for human review. Team members accept what is accurate, modify what is close, and dismiss what is wrong.

This pattern is sometimes called “human-in-the-loop AI” — the AI does the initial labor-intensive extraction, humans provide judgment and domain knowledge.

Proposal Action Types

A Change Proposal can propose one of three action types:

  • create: Propose creating a new entity (e.g., a new Requirement)
  • update: Propose changes to an existing entity (before/after field snapshots)
  • connect: Propose linking an existing entity to the Initiative scope
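As an illustrative sketch only (which fields each action type requires is an assumption here, not Catalio's documented contract), the three action types differ in the data they carry:

```python
def required_fields(action: str) -> set:
    """Illustrative field requirements per action type (assumed, not authoritative)."""
    base = {"title", "rationale", "target_type"}
    if action == "create":
        # A create proposal carries the full payload for the new entity.
        return base | {"proposed_changes"}
    if action == "update":
        # An update proposal points at an existing entity and snapshots before/after.
        return base | {"target_id", "proposed_changes", "original_values"}
    if action == "connect":
        # A connect proposal links an existing entity into the Initiative scope.
        return base | {"target_id", "initiative_id"}
    raise ValueError(f"unknown action type: {action}")
```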

Proposal Sources

Change Proposals have a source field indicating where the proposal came from:

  • ai_discovery: AI-discovered from context analysis or conversation
  • document_extraction: Extracted from an uploaded document or artifact
  • meeting_extraction: Extracted from a meeting transcript (conservative confidence)
  • manual: Manually created by a team member
  • initiative_analysis: Generated from cross-Initiative analysis
  • artifact_extraction: Extracted from an uploaded artifact via NLP/entity recognition
  • code_analysis: Generated from code repository analysis

Proposal Lifecycle

Change Proposals use an existence-based lifecycle — there is no explicit state field. Instead:

  • Exists = proposed (on the table for review)
  • Deleted = rejected (team decided not to act on it)
  • Applied = accepted and applied (applied_at is set)
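The existence-based lifecycle can be sketched in a few lines of Python. This is a minimal illustration, not Catalio's implementation; only the applied_at field comes from this page, the rest is assumed:

```python
from datetime import datetime, timezone
from typing import Optional

def proposal_status(exists: bool, applied_at: Optional[datetime]) -> str:
    """Derive a proposal's lifecycle state: there is no explicit state field."""
    if not exists:
        return "rejected"   # deleted = the team decided not to act on it
    if applied_at is not None:
        return "applied"    # accepted and applied (applied_at is set)
    return "proposed"       # still on the table for review
```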

At the stage gate from Approval to Build, surviving proposals are batch-applied to the requirements catalog. This creates a clean demarcation: discovery and proposal-gathering happens in Planning/Approval; locked specifications exist in Build.

Key Fields

  • title: Short description of the proposed change
  • rationale: Why the change is being proposed
  • target_type: Entity type being proposed (requirement, capability, etc.)
  • target_id: The existing entity being targeted (for update and connect proposals)
  • proposed_changes: Map of field to new value (for update proposals)
  • original_values: Map of field to current value (the "before" snapshot)
  • source: Where the proposal came from
  • initiative_id: The Initiative the proposal belongs to (optional)
  • application_id: The Application the proposal relates to
  • feature_id: The Feature the proposal relates to (optional)
  • applied_at: When the proposal was batch-applied
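The fields above map naturally onto a record type. A minimal sketch, assuming types and defaults (only the field names come from this page):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ChangeProposal:
    """Illustrative record; field names follow the Key Fields list, types are assumed."""
    title: str
    rationale: str
    target_type: str                       # "requirement", "capability", ...
    source: str                            # "ai_discovery", "manual", ...
    application_id: str
    target_id: Optional[str] = None        # set for update/connect proposals
    proposed_changes: dict = field(default_factory=dict)  # field -> new value
    original_values: dict = field(default_factory=dict)   # field -> current value
    initiative_id: Optional[str] = None    # standalone proposals omit this
    feature_id: Optional[str] = None
    applied_at: Optional[datetime] = None  # set when batch-applied
```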

Reviewing Proposals

The Change Proposals review queue is one of the most important ongoing activities during the Planning and Approval stages of an Initiative. A healthy review cadence:

  1. Daily: Check for new proposals from overnight AI analysis
  2. Weekly: Systematic batch review of accumulated proposals
  3. Stage gate: Complete review before advancing from Approval to Build

For each proposal, a reviewer decides:

  • Apply — Accept the change and mark it applied
  • Dismiss — Reject the proposal (it is soft-deleted)
  • Modify and apply — Edit the proposal content, then apply
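The three review decisions can be sketched as one function. This is illustrative only: proposals are plain dicts here, and a dismissed proposal is simply dropped, whereas Catalio soft-deletes it:

```python
from datetime import datetime, timezone
from typing import Optional

def review(proposal: dict, decision: str, edits: Optional[dict] = None) -> Optional[dict]:
    """Resolve one proposal: 'apply', 'dismiss', or 'modify' (edit, then apply)."""
    if decision == "dismiss":
        return None                        # soft-deleted in the real system
    if decision == "modify" and edits:
        proposal = {**proposal, **edits}   # the reviewer's corrections win
    return {**proposal, "applied_at": datetime.now(timezone.utc)}
```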

Gap Classification and Quality Assessment

Proposals can be annotated with a gap classification (what type of gap the proposal addresses) and a quality assessment (how well-formed the proposed change is). These annotations help triage large volumes of proposals efficiently.

Standalone Proposals

While most proposals are Initiative-scoped, Catalio supports standalone proposals that are not linked to any Initiative. These are useful for:

  • Ad-hoc requirement suggestions outside formal engagements
  • Changes discovered during routine source monitoring
  • Quick proposals generated during live AI sessions

Best Practices

Review proposals within 48 hours of generation.

Proposals lose context rapidly. The team member who triggered the discovery session has the clearest understanding of what the AI extracted — review promptly while that context is fresh.

Dismiss aggressively, apply deliberately.

It is better to have a clean catalog containing 80% of the right requirements than one inflated to 120% of its proper size where 40% of the entries are noise. Dismiss duplicates, low-confidence extractions, and out-of-scope proposals freely.

Use rationale for traceability.

When manually creating proposals, fill in the rationale field with why this change is being proposed. Future reviewers (and auditors) benefit from the context.

Batch apply at stage gates, not ad hoc.

The Planning phase is for discovery and proposal generation. The Approval phase is for review. Apply proposals at the Approval → Build gate, not piecemeal during discovery — this keeps the requirements catalog stable during active specification work.

Use source filtering when reviewing.

When reviewing a large proposal queue, filter by source so you review all document_extraction proposals together, then all meeting_extraction proposals together, and so on. This creates consistent review context.
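Bucketing a queue by source is straightforward. A minimal sketch, assuming proposals are dicts with a source key:

```python
from collections import defaultdict

def group_by_source(proposals):
    """Bucket a proposal queue by source so each batch shares review context."""
    queues = defaultdict(list)
    for p in proposals:
        queues[p["source"]].append(p)
    return dict(queues)
```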

Relationships at a Glance

  • Initiative: Most proposals belong to an Initiative
  • Application: Proposals are scoped to an Application
  • Requirements: Applying a create proposal produces a new Requirement
  • Features: Proposals can be linked to a specific Feature
  • Artifacts: Artifact processing generates artifact_extraction proposals
  • Sources: Source monitoring generates ai_discovery proposals

Next Steps


Pro Tip: At the end of each discovery sprint, sort proposals by source and do a focused 30-minute review session per source type. This is more efficient than reviewing proposals in chronological order and helps you calibrate which sources produce the highest-quality signals.

Support

  • Documentation: Continue reading about Initiatives and Requirements
  • Email: support@catalio.ai
  • Community: Share proposal review workflows with other Catalio users