Defense-Requirements-Tracer
Verified by Dryade
Requires enterprise tier subscription
Description
MBSE requirements traceability analysis for defense programs with gap detection, duplicate identification, and verification evidence mapping
Details
defense-requirements-tracer
Tier: Enterprise | Type: Agent | Category: Defense | Version: 1.0.0
MBSE requirements traceability analysis plugin for defense programs. Analyzes requirements databases to identify traceability gaps, orphan requirements, missing test coverage, duplicates, and inconsistencies. Generates traceability matrices for program reviews.
1. Overview
Plugin Name: defense-requirements-tracer | Slug: defense-requirements-tracer | Required Tier: Enterprise | Plugin Type: Agent | Category: Defense | Author: Dryade | License: DSUL
What It Does
Assists systems engineers with requirements traceability analysis by ingesting requirements databases (CSV/JSON exports from IBM DOORS, Polarion, or Jira), analyzing parent-child relationships across operational/system/subsystem levels, detecting gaps and inconsistencies, and generating traceability matrices for program milestone reviews.
Key Capabilities
- Multi-level requirements traceability analysis (operational to subsystem)
- Gap detection: orphan requirements, missing test coverage, dangling references
- Duplicate and contradictory requirement identification
- Verification evidence mapping to requirements and test cases
- Traceability matrix generation for program reviews (RCP, RCD, RQ, RA)
2. User Stories
Primary User Stories
US-1: Prepare Program Review Traceability
As a systems engineer, I want to generate a traceability matrix for a program review so that I can demonstrate complete requirements coverage.
Acceptance Criteria:
- [ ] Matrix shows all requirements with linked test cases and evidence
- [ ] Verification rate percentage calculated
- [ ] Gaps highlighted for remediation
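The verification rate in US-1 is simply the share of requirements with at least one linked test case. A minimal sketch (the function name and data shapes are illustrative, not the plugin's API):

```python
def verification_rate(requirements, test_cases):
    """Percentage of requirements with at least one linked test case.

    `requirements` is a list of requirement IDs; `test_cases` is a list of
    dicts with a `requirement_ids` field. Both shapes are assumptions made
    for this sketch, loosely mirroring the demo data description.
    """
    if not requirements:
        return 0.0  # empty-database edge case: zero counts, no division
    covered = {rid for tc in test_cases for rid in tc["requirement_ids"]}
    verified = [rid for rid in requirements if rid in covered]
    return 100.0 * len(verified) / len(requirements)

print(verification_rate(
    ["SYS-1", "SYS-2", "SYS-3", "SYS-4"],
    [{"id": "TC-1", "requirement_ids": ["SYS-1", "SYS-3"]}],
))  # 50.0
```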
US-2: Detect Requirements Quality Issues
As a requirements manager, I want to find duplicate and contradictory requirements so that I can clean the requirements database before a review.
Acceptance Criteria:
- [ ] Potential duplicates identified with similarity scores
- [ ] Contradictory pairs flagged
- [ ] Remediation actions suggested
Edge Cases
- Empty requirements database: Returns empty analysis with zero counts
- Missing parent references: Flagged as dangling parent references in gap detection
3. Architecture
Component Diagram
+------------------+     +------------------------+     +------------------+
|   Orchestrator   | --> |  Requirements Tracer   | --> |  Data Provider   |
|   Agent Tools    |     |       plugin.py        |     |  (mock / real)   |
+------------------+     +------------------------+     +------------------+
                                     |
                              +------v------+
                              |  Demo Data  |
                              | data/*.csv  |
                              | data/*.json |
                              +-------------+
Components
| Component | File | Responsibility |
|-----------|------|----------------|
| Plugin | plugin.py | Agent tools, CSV/JSON parsing, mock/real toggle |
| Data | data/ | Demo requirements DB, test cases, evidence, templates |
| Tests | tests/test_plugin.py | Manifest, tools, data validation |
Dependencies
- Internal: core.plugins.PluginProtocol, core.plugin_config_store.PluginConfigStore
- External: None (uses stdlib csv, json)
4. API Spec / Agent Capabilities
Agent Tools
| Tool Name | Input | Output | Description |
|-----------|-------|--------|-------------|
| analyze_traceability | requirements_source: str | JSON analysis | Analyze requirement hierarchy and coverage |
| detect_gaps | requirements_source: str | JSON gap list | Find orphans, missing tests, dangling refs |
| find_duplicates | requirements_source: str | JSON duplicate list | Identify potential duplicate requirements |
| map_verification_evidence | requirements_source: str | JSON mapping | Map evidence to requirements via test cases |
| generate_traceability_matrix | requirements_source: str, format: str | JSON matrix | Generate complete traceability matrix |
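The gap categories behind `detect_gaps` (orphans, missing test coverage, dangling references) reduce to plain set logic. A sketch, assuming requirement dicts keyed by the CSV columns listed in the FAQ (`id`, `level`, `parent_id`) and test cases with a `requirement_ids` field; none of this reflects the plugin's internal code:

```python
def detect_gaps(requirements, test_cases):
    """Classify traceability gaps: orphans (non-operational requirements
    with no parent), dangling parent references, and requirements with
    no test coverage. Data shapes are assumptions for illustration."""
    ids = {r["id"] for r in requirements}
    covered = {rid for tc in test_cases for rid in tc.get("requirement_ids", [])}
    gaps = []
    for r in requirements:
        parent = r.get("parent_id") or None  # treat empty CSV cell as no parent
        if parent is None and r["level"] != "operational":
            gaps.append({"id": r["id"], "gap": "orphan"})
        elif parent is not None and parent not in ids:
            gaps.append({"id": r["id"], "gap": "dangling_parent", "parent_id": parent})
        if r["id"] not in covered:
            gaps.append({"id": r["id"], "gap": "missing_test_coverage"})
    return gaps

reqs = [
    {"id": "OPS-1", "level": "operational", "parent_id": ""},
    {"id": "SYS-1", "level": "system", "parent_id": "OPS-1"},
    {"id": "SYS-2", "level": "system", "parent_id": "OPS-99"},  # dangling parent
    {"id": "SUB-1", "level": "subsystem", "parent_id": ""},     # orphan
]
tcs = [{"id": "TC-1", "requirement_ids": ["SYS-1", "OPS-1"]}]
for g in detect_gaps(reqs, tcs):
    print(g)
```

On this fixture the sketch flags SYS-2 (dangling parent, no test), SUB-1 (orphan, no test), and nothing else.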
5. Data Flow
Processing Pipeline
1. User requests traceability analysis via orchestrator
2. Plugin loads requirements CSV, test cases JSON, evidence JSON from data/
3. Plugin analyzes parent-child links, test coverage, evidence mapping
4. Gap detection runs across all requirements
5. Audit trail appended, JSON response returned to orchestrator
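Step 2 needs nothing beyond the stdlib `csv` and `json` modules noted under Dependencies. A self-contained sketch that builds a tiny fixture mirroring the demo file names and the FAQ's CSV columns, then loads it (the loader function and return shape are assumptions, not the plugin's code):

```python
import csv
import json
import tempfile
from pathlib import Path

def load_demo_data(data_dir: Path):
    """Load requirements CSV plus test-case and evidence JSON from data/.
    File names follow the demo data listing; return shape is illustrative."""
    with open(data_dir / "requirements-database.csv", newline="") as f:
        requirements = list(csv.DictReader(f))
    test_cases = json.loads((data_dir / "test-cases.json").read_text())
    evidence = json.loads((data_dir / "verification-evidence.json").read_text())
    return requirements, test_cases, evidence

# Tiny fixture; the row contents are invented for illustration only.
tmp = Path(tempfile.mkdtemp())
(tmp / "requirements-database.csv").write_text(
    "id,level,text,parent_id,status\n"
    "OPS-1,operational,Track targets,,approved\n"
    "SYS-1,system,Radar range at least 50 km,OPS-1,approved\n"
)
(tmp / "test-cases.json").write_text(
    json.dumps([{"id": "TC-1", "requirement_ids": ["SYS-1"]}]))
(tmp / "verification-evidence.json").write_text(
    json.dumps([{"id": "EV-1", "test_case_id": "TC-1"}]))

reqs, tcs, ev = load_demo_data(tmp)
print(len(reqs), reqs[0]["id"], tcs[0]["id"], ev[0]["id"])  # 2 OPS-1 TC-1 EV-1
```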
Demo Data Description
The data/ directory contains:
- requirements-database.csv: 20 requirements across 3 levels (operational, system, subsystem)
- test-cases.json: 10 test cases linked to requirements
- verification-evidence.json: 8 evidence documents linked to test cases
- traceability-matrix-template.json: Template with column definitions and standards
- program-review-template.json: Review report structure with milestone types
Total: 5 demo files covering the requirements traceability lifecycle.
6. Security Considerations
Data Handling
- PII: No - synthetic defense systems engineering data only
- Encryption: N/A - all data local
- Data Retention: No persistent storage beyond plugin config
External API Keys
None required. Designed for air-gapped deployment.
Isolation
- Plugin runs in sandboxed context via core plugin loader
- No external network calls
- All demo data uses fictional defense program requirements
7. Test Plan
Test Classes
| Class | Tests | Coverage Target |
|-------|-------|----------------|
| TestPluginAttributes | Manifest consistency | 100% manifest fields |
| TestPluginConfig | Config, mock/real toggle | Config validation |
| TestAgentTools | All 5 tools output validation | All tools |
| TestDemoData | Data presence, CSV/JSON loadability, classification | All data files |
Running Tests
cd dryade-plugins
python -m pytest enterprise/defense-requirements-tracer/tests/ -x -v --tb=short
8. Deployment Notes
Requirements
No additional Python packages required.
Configuration
{
"data_source": "mock"
}
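One way the `data_source` flag could route between providers; a sketch only, since the provider class names and their methods are assumptions, not the plugin's actual code:

```python
class MockDataProvider:
    """Stand-in for the bundled demo data in data/ (illustrative)."""
    def load_requirements(self):
        return [{"id": "SYS-1", "level": "system", "text": "...",
                 "parent_id": "", "status": "draft"}]

class RealDataProvider:
    """Would read a live DOORS/Polarion/Jira export (not implemented here)."""
    def load_requirements(self):
        raise NotImplementedError("point this at a real CSV/JSON export")

def make_provider(config):
    # Default to "mock", matching the shipped configuration above.
    if config.get("data_source", "mock") == "mock":
        return MockDataProvider()
    return RealDataProvider()

provider = make_provider({"data_source": "mock"})
print(type(provider).__name__)  # MockDataProvider
```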
Compatibility
- Min Dryade Version: 1.0.0
- Python: >=3.11
- Notes: Imports requirements from CSV/JSON exports. No cloud dependencies.
9. User Guide
Getting Started
- Ensure your Dryade instance has an Enterprise tier license
- Install the plugin via the marketplace or dryade-pm push
- Ask the orchestrator to analyze requirements traceability or detect gaps
- Review the generated matrix and gap analysis
Common Workflows
Workflow 1: Program Review Preparation
- Export requirements from DOORS/Polarion/Jira as CSV
- Ask the orchestrator to generate a traceability matrix
- Review verification rates and address gaps before the review
Workflow 2: Requirements Quality Audit
- Ask the orchestrator to detect gaps and duplicates
- Review orphan requirements and missing test coverage
- Prioritize remediation by severity
FAQ
Q: What requirements formats are supported? A: CSV with columns: id, level, text, parent_id, status. JSON format also supported.
Q: How are duplicates detected? A: Word-level similarity analysis. Pairs with >60% word overlap are flagged for review.
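The FAQ's >60% word-overlap rule can be sketched as Jaccard-style similarity over lowercased word sets. The plugin's exact tokenization is not specified, so treat this as an approximation:

```python
def word_overlap(a: str, b: str) -> float:
    """Fraction of shared words between two requirement texts (0.0 to 1.0).
    Lowercased whitespace tokenization is an assumption of this sketch."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def find_duplicates(requirements, threshold=0.6):
    """Flag requirement pairs whose word overlap exceeds the threshold."""
    pairs = []
    for i, r1 in enumerate(requirements):
        for r2 in requirements[i + 1:]:
            score = word_overlap(r1["text"], r2["text"])
            if score > threshold:
                pairs.append((r1["id"], r2["id"], round(score, 2)))
    return pairs

reqs = [
    {"id": "SYS-1", "text": "The radar shall detect targets at 50 km"},
    {"id": "SYS-7", "text": "The radar shall detect targets at 40 km"},
    {"id": "SUB-2", "text": "The display shall refresh at 30 Hz"},
]
print(find_duplicates(reqs))  # [('SYS-1', 'SYS-7', 0.78)]
```

SYS-1 and SYS-7 share 7 of 9 distinct words (0.78), so they are flagged; SUB-2 overlaps each by only 0.25 and is not.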
10. Screenshots
Screenshots will be added after UI integration.
11. Changelog
1.0.0 (2026-03-05)
- Initial release
- Multi-level requirements traceability analysis
- Gap detection (orphans, missing tests, dangling references)
- Duplicate and contradiction detection
- Verification evidence mapping
- Traceability matrix generation
Future Roadmap
- [ ] Integration with Capella MBSE for model-based traceability
- [ ] Natural language analysis for ambiguous requirement detection
- [ ] Support for IBM DOORS ReqIF export format