Policy Development Workflow: Testing and Deploying Network Configuration Compliance Rules
The Policy Development Workflow provides a systematic, iterative approach to creating, testing, and deploying policy definitions in rConfig V8. This workflow eliminates the trial-and-error of deploying untested policies directly to production, enabling administrators to validate policy rules against actual device configurations before enforcement. The integrated testing interface allows rapid iteration—edit, test, refine—without closing windows or navigating between multiple screens, dramatically accelerating policy development cycles.
This guide walks through the complete policy development lifecycle from initial policy creation through testing, refinement, and production deployment, ensuring policies work correctly before impacting compliance reports.
Understanding the Policy Development Workflow
Why a Development Workflow Matters
Section titled “Why a Development Workflow Matters”Avoid production surprises: Testing policies against real device configurations before deployment prevents widespread false failures or missed violations when policies go live.
Rapid iteration: The integrated test interface enables quick edit-test-refine cycles, allowing policy refinement in minutes rather than hours spent troubleshooting production issues.
Confidence in deployment: Validated policies that pass testing against representative configurations provide confidence they’ll work correctly across entire device populations.
Reduced remediation burden: Policies tested against both compliant and non-compliant configurations ensure accurate detection, preventing administrators from chasing false positives or missing real issues.
Knowledge building: Testing policies teaches administrators how different policy methods work, building expertise for creating increasingly sophisticated compliance rules.
Workflow Benefits
Non-destructive testing: Test policies without creating assignments or affecting compliance reports. Policies remain in draft state until ready for production.
Multi-tab iteration: Open policy definition in separate tab for editing while keeping test interface open, enabling seamless edit-test cycles without losing context.
Real configuration testing: Test against actual device configurations captured from your network, not synthetic examples, ensuring policies match real-world device output.
Immediate feedback: Test results appear instantly, showing exactly which rules passed or failed and why, enabling rapid problem identification and correction.
Comprehensive validation: Test against multiple device configurations representing different scenarios—compliant devices, non-compliant devices, edge cases—before deployment.
Policy Development Workflow Overview
The workflow follows five phases:
Phase 1: Create Policy Definition
- Define policy structure and rules
- Use policy method reference for guidance
- Save initial draft policy
Phase 2: Select Test Configuration
- Choose representative device
- Select command output to test against
- Ensure configuration represents real-world scenario
Phase 3: Test Policy
- Execute test against configuration
- Review results (pass/fail for each rule)
- Analyze failures or unexpected passes
Phase 4: Refine Policy
- Open policy definition in new tab
- Adjust rules based on test results
- Save changes and return to test interface
- Re-test without closing modal
Phase 5: Deploy to Production
- Create policy assignment
- Enable assignment for compliance enforcement
- Monitor initial results
- Adjust if needed
This iterative process ensures policies work correctly before production deployment.
Phase 1: Create Policy Definition
Step 1: Create Initial Policy
Navigate to Compliance → Policy Definitions and click New Policy Definition.
Initial configuration:
- Name: Descriptive name indicating policy purpose (e.g., Security-Baseline-Test-Draft)
- Description: Document what the policy validates and testing scope
- Click Create
The editor opens with default template.

Step 2: Write Initial Policy Rules
Create policy rules based on compliance requirements. Start simple—test basic rules before adding complexity.
Example initial policy (Cisco IOS security baseline):
// Description: Verify enable secret configured
#[must_match_single_string]
enable secret 5

// Description: Ensure SSH version 2 enabled
#[must_match_single_string]
ip ssh version 2

// Description: Verify no Telnet transport
#[must_not_match_single_string]
transport input telnet
Initial policy guidelines:
- Start with 3-5 rules maximum
- Use simple methods (must_match_single_string, must_not_match_single_string)
- Include clear comments for each rule
- Test basic rules before adding complex patterns
Step 3: Save Draft Policy
Click Save to store the policy definition.
Draft policy status:
- Policy exists in system but isn’t assigned to devices
- No compliance impact—policy not being enforced
- Available for testing against configurations
- Can be edited freely without affecting production
Phase 2: Select Test Configuration
Step 1: Choose Test Device
Select a device whose configuration you want to test the policy against.
Step 2: Select Command Output
Choose which command output to test the policy against.
Command output selection matches assignment: Select the same command you’ll use in the production policy assignment. If the assignment will evaluate show running-config, test against show running-config.
Common command choices:
- show running-config - Most configuration elements
- show startup-config - Startup configuration validation
- show version - Version and hardware information
- Device-specific commands - Vendor-specific outputs
Verification: Confirm selected device has configuration file for chosen command. If not, run device backup before testing.

Step 3: Select Policy Definition
Click the Test Policy Definition button above the config view and choose the policy definition to test.
Select your draft policy: Choose the policy definition created in Phase 1 (e.g., Security-Baseline-Test-Draft).
Policy list shows all definitions: Both production and draft policies appear. Ensure you select the correct draft policy for testing.

Phase 3: Test Policy
Step 1: Execute Test
Click the Test Policy Definition button to execute the test. Note that this process does not connect to the device; it evaluates the selected policy against the stored configuration file for the chosen command (a conceptual sketch of this evaluation follows the list below).
The system:
- Loads selected device’s configuration file for specified command
- Loads policy definition rules
- Evaluates each policy rule against configuration
- Generates pass/fail results for each rule
- Displays results in test interface
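To make the evaluation model concrete, the following minimal Python sketch mirrors the kind of check the engine performs. It is illustrative only: the helper names are not rConfig internals, and the whole-line matching granularity is an assumption inferred from the Step 3 example below (ip ssh version 2 failing to match ip ssh version 2.0).

# Illustrative sketch only -- not rConfig's actual engine. Whole-line
# matching is an assumption inferred from the Step 3 example, where
# "ip ssh version 2" does not match a device line "ip ssh version 2.0".
def must_match_single_string(config_text: str, policy_string: str) -> bool:
    """Pass if some configuration line equals the policy string."""
    return any(line.strip() == policy_string for line in config_text.splitlines())

def must_not_match_single_string(config_text: str, policy_string: str) -> bool:
    """Pass only if no configuration line equals the policy string."""
    return not must_match_single_string(config_text, policy_string)

METHODS = {
    "must_match_single_string": must_match_single_string,
    "must_not_match_single_string": must_not_match_single_string,
}

def evaluate(config_text: str, rules: list[dict]) -> dict:
    """Evaluate each rule and build a result shaped like the Step 2 output."""
    results, failures = {}, 0
    for i, rule in enumerate(rules):
        passed = METHODS[rule["policyMethod"]](config_text, rule["policyString"])
        failures += 0 if passed else 1
        results[str(i)] = {**rule, "result": "pass" if passed else "fail", "resultRaw": passed}
    results["eval_result"] = "PASS" if failures == 0 else "FAIL"
    results["eval_result_raw"] = failures == 0
    results["eval_result_reason"] = f"{failures} policy methods failed"
    return results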

Step 2: Review Test Results
Results display shows outcome for each policy rule.
Result format:
{ "0": { "policyMethod": "must_match_single_string", "comment": "Description: Verify enable secret configured", "policyString": "enable secret 5", "result": "pass", "resultRaw": true, "configId": 123 }, "1": { "policyMethod": "must_match_single_string", "comment": "Description: Ensure SSH version 2 enabled", "policyString": "ip ssh version 2", "result": "fail", "resultRaw": false, "configId": 123 }, "eval_result": "FAIL", "eval_result_raw": false, "eval_result_reason": "1 policy methods failed"}

Interpreting results:
Individual rules:
- "result": "pass" / "resultRaw": true - Rule passed
- "result": "fail" / "resultRaw": false - Rule failed
Overall evaluation:
- "eval_result": "PASS" - All rules passed
- "eval_result": "FAIL" - One or more rules failed
- "eval_result_reason" - Summary of results
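If you copy the result JSON out of the test interface, a short script can list just the failing rules. A minimal sketch in Python, assuming the result structure shown above (numeric string keys for rules plus the eval_result summary fields):

import json

def failed_rules(result_json: str) -> list[str]:
    """Return a one-line summary for each failing rule in a test result."""
    results = json.loads(result_json)
    return [
        f'{rule["comment"]} -> "{rule["policyString"]}"'
        for key, rule in results.items()
        if key.isdigit() and rule.get("result") == "fail"
    ]

# Using a trimmed version of the sample output above:
sample = '''{
  "0": {"policyMethod": "must_match_single_string",
        "comment": "Description: Verify enable secret configured",
        "policyString": "enable secret 5", "result": "pass", "resultRaw": true},
  "1": {"policyMethod": "must_match_single_string",
        "comment": "Description: Ensure SSH version 2 enabled",
        "policyString": "ip ssh version 2", "result": "fail", "resultRaw": false},
  "eval_result": "FAIL", "eval_result_raw": false,
  "eval_result_reason": "1 policy methods failed"
}'''
for line in failed_rules(sample):
    print(line)  # Description: Ensure SSH version 2 enabled -> "ip ssh version 2"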
Step 3: Analyze Results
If all rules pass: Policy correctly validates compliant configuration. Test against non-compliant device to verify policy catches violations.
If rules fail unexpectedly: Determine why rules failed:
- Does configuration actually lack required element? (Legitimate failure)
- Does policy string not match configuration format? (Policy needs adjustment)
- Case sensitivity issue? (Check exact capitalization)
- Extra/missing whitespace? (Verify spacing matches configuration)
If rules pass unexpectedly: Determine why rules passed when they shouldn’t:
- Policy method too lenient? (Wildcards matching too broadly)
- Wrong configuration being evaluated? (Wrong command selected)
- Policy string matches unintended configuration? (Needs more specificity)
Example analysis:
Rule failed: "policyString": "ip ssh version 2"
Configuration has: ip ssh version 2.0
Issue: Policy expects the exact string ip ssh version 2, but the device outputs ip ssh version 2.0
Solution: Adjust policy to use wildcard or regex to match both formats
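This particular fix is easy to sanity-check offline. A small Python sketch, using fnmatch as a stand-in for rConfig’s wildcard semantics (an assumption; confirm the real behavior in the test interface):

from fnmatch import fnmatch  # shell-style wildcard matching, stand-in only

config_line = "ip ssh version 2.0"

# Exact whole-line comparison (must_match_single_string style): fails on 2.0
print(config_line == "ip ssh version 2")                  # False -> rule fails

# Wildcard pattern (wildcard_match_single_string style): accepts 2 and 2.0
print(fnmatch(config_line, "ip ssh version 2*"))          # True -> rule passes
print(fnmatch("ip ssh version 2", "ip ssh version 2*"))   # True -> still passes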
Phase 4: Refine Policy (Iterative Testing)
Step 1: Open Policy Definition for Editing
From the test results screen, click the Open Policy Definition button (or link to the policy definition).
Critical workflow detail: Policy definition opens in NEW browser tab, leaving test interface open in original tab.

Multi-tab workflow benefits:
- Keep test results visible for reference
- Edit policy in separate tab
- Switch between tabs quickly
- No need to re-select device/command/policy for each test iteration
Step 2: Edit Policy Based on Test Results
In the policy definition editor tab, adjust rules based on test results.
Common refinements:
Adjust exact strings to match device output:
Before:
// Description: Ensure SSH version 2 enabled
#[must_match_single_string]
ip ssh version 2

After (if device outputs ip ssh version 2.0):
// Description: Ensure SSH version 2 enabled
#[wildcard_match_single_string]
ip ssh version 2*
Add missing rules discovered during testing:
If testing revealed additional required configurations not in initial policy, add them:
// Description: Verify AAA new-model configured
#[must_match_single_string]
aaa new-model
Adjust methods for better matching:
Change from exact match to regex if configuration format varies:
// Description: Match any valid SNMP server configuration
#[must_match_regex]
"/snmp-server host [0-9]+\.[0-9]+\.[0-9]+\.[0-9]+ .+/m"
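Before pasting a pattern like this into a policy, you can sanity-check it offline. A Python sketch as an approximation (Python’s re flavor is close to, but not identical to, PHP’s PCRE, so regex101.com with the PHP flavor remains the authoritative check):

import re

# PHP-style delimiters and flag ("/.../m") stripped; re.MULTILINE stands in
# for the m flag so the pattern can anchor per-line if needed.
pattern = re.compile(r"snmp-server host [0-9]+\.[0-9]+\.[0-9]+\.[0-9]+ .+", re.MULTILINE)

config = """hostname Router-Core-01
snmp-server host 10.1.1.50 version 2c MyCommunity
ip ssh version 2"""

print(bool(pattern.search(config)))                 # True -> a host line matches
print(bool(pattern.search("no snmp configured")))   # False -> rule would fail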
Fix case sensitivity or whitespace issues:
Ensure policy strings match exact device output including capitalization and spacing.
Step 3: Save Updated Policy
Click Save in the policy definition editor to store changes.
Changes save immediately: Updated policy is now ready for re-testing.
Step 4: Switch Back to Test Tab
Return to the test interface tab (don’t close it).
Test interface remains configured: Device, command, and policy selection persist. No need to re-select.
Step 5: Re-test Policy
Click the Test Policy Definition button again to test the updated policy.
Rapid iteration: This edit-save-switch-test cycle takes seconds, enabling rapid refinement until policy works correctly.
No need to close modal: Keep test interface open, make edits in separate tab, return and re-test. Repeat until satisfied.

Step 6: Iterate Until Policy Works Correctly
Continue the edit-test-refine cycle until:
All intended rules pass on compliant devices: Policy correctly validates devices that meet requirements.
All intended rules fail on non-compliant devices: Policy correctly catches violations.
No false positives: Policy doesn’t fail devices that should pass.
No false negatives: Policy doesn’t pass devices that should fail.
Edge cases handled: Policy works correctly on unusual configurations or device variations.
Testing Against Multiple Scenarios
Comprehensive testing validates policy across various scenarios:
Test Scenario 1: Known compliant device
- Select device with correct configurations
- All rules should pass
- Validates policy recognizes compliance
Test Scenario 2: Known non-compliant device
- Select device missing required configurations
- Appropriate rules should fail
- Validates policy catches violations
Test Scenario 3: Partially compliant device
- Select device with some but not all required configs
- Some rules pass, some fail
- Validates policy correctly identifies partial compliance
Test Scenario 4: Edge cases
- Select device with unusual configuration formats
- Test wildcard and regex patterns
- Validates policy handles variations
Example multi-scenario testing:
Initial test against Router-Core-01 (compliant) → All rules pass ✓
Second test against Router-Branch-05 (missing SSH config) → SSH rule fails ✓
Third test against Router-Edge-02 (has telnet enabled) → Telnet prohibition rule fails ✓
Fourth test against Router-Lab-10 (different IOS version) → All rules pass ✓
Result: Policy validated across multiple device types and configurations. Ready for production deployment.
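A scenario matrix like this can also be captured as a small self-checking script, so re-validation after policy edits is repeatable. A sketch with invented config snippets, reusing the whole-line matching assumption from Phase 3 (real validation still happens in the test interface):

def line_present(config: str, needle: str) -> bool:
    """Whole-line exact match, mirroring the Phase 3 sketch's assumption."""
    return any(line.strip() == needle for line in config.splitlines())

def check(config: str) -> str:
    """PASS if SSH v2 is configured and no telnet transport line exists."""
    ssh_ok = line_present(config, "ip ssh version 2")
    no_telnet = not line_present(config, "transport input telnet")
    return "PASS" if (ssh_ok and no_telnet) else "FAIL"

# Device names from the example above; config snippets are invented.
scenarios = [
    ("Router-Core-01 (compliant)",      "ip ssh version 2\ntransport input ssh", "PASS"),
    ("Router-Branch-05 (missing SSH)",  "transport input ssh",                   "FAIL"),
    ("Router-Edge-02 (telnet enabled)", "ip ssh version 2\ntransport input telnet", "FAIL"),
]

for name, config, expected in scenarios:
    actual = check(config)
    flag = "ok" if actual == expected else "INVESTIGATE"
    print(f"{name}: expected {expected}, got {actual} [{flag}]")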
Phase 5: Deploy to Production
Step 1: Finalize Policy Definition
Once testing is complete and the policy works correctly:
- Review policy one final time for clarity and completeness
- Update policy name to remove draft/test prefix
- Add comprehensive description documenting:
- What policy validates
- Testing performed
- Devices/scenarios tested against
- Any known limitations or exceptions
- Save final policy version
Example final policy name change:
- Draft: Security-Baseline-Test-Draft
- Production: Security-Baseline-Cisco-IOS
Step 2: Create Policy Assignment
Navigate to Compliance → Policy Assignments and create assignment linking validated policy to device scope.
Assignment configuration:
- Name: Descriptive assignment name (e.g., Production-Routers-Security-Baseline)
- Scope: Select appropriate scope (device, category, or tag)
- Command: Select same command used during testing
- Policy Definition: Select finalized policy
- Status: Set to Disabled for initial rollout testing
Why start disabled: Even though policy is tested, starting disabled allows one more validation step with actual assignment before affecting compliance reports.
See Policy Assignments for detailed assignment configuration guidance.
Step 3: Test Assignment with Small Scope
Before broad deployment, validate assignment works correctly:
Initial test approach:
- Create assignment scoped to single device (the device used for testing)
- Keep assignment disabled
- Run manual compliance check
- Review results
- Verify results match testing expectations
If results match testing: Policy and assignment configured correctly. Proceed to broader deployment.
If results differ from testing: Investigate discrepancy:
- Assignment command matches testing command?
- Device has recent backup?
- Policy definition correct version?
- Configuration changed since testing?
Step 4: Enable Assignment for Broader Scope
Once confident in policy and assignment:
- Update assignment scope to broader population (category or tag) if needed
- Enable assignment (change status from Disabled to Enabled)
- Create Scheduled Task for automated compliance checking (optional)
- Save assignment
Production deployment complete: Policy now enforces compliance across device population.
Step 5: Monitor Initial Results
After deployment, review first compliance results:
Navigate to Compliance → Policy Compliance and review:
- Overall compliance percentage
- Device pass/fail distribution
- Specific rule failures
- Unexpected results
Common post-deployment findings:
Higher than expected failure rate: Some devices legitimately non-compliant. Create remediation plan.
Unexpected failures on specific device models: Policy may need adjustment for device variations. Consider wildcards or model-specific policies.
All devices failing: Policy definition or assignment configuration issue. Review policy against actual device configurations.
All devices passing: Good result if expected. If not, verify policy is actually validating (test against known non-compliant device).
Step 6: Adjust if Needed
Based on initial results, make adjustments:
Minor policy adjustments: If small percentage of devices fail due to legitimate variations, update policy definition to accommodate (use wildcards, adjust regex).
Scope adjustments: If policy doesn’t apply to certain devices in scope, adjust assignment scope or create exceptions.
Remediation: If devices failing represent legitimate non-compliance, initiate remediation workflow to bring devices into compliance.
See Policy Compliance Results for detailed guidance on reviewing results and remediation.
Best Practices for Policy Development
Start Simple, Add Complexity
Initial policy: Begin with basic rules using simple methods (must_match_single_string, must_not_match_single_string).
Test and validate: Ensure basic rules work correctly before adding complex patterns.
Incremental enhancement: Add advanced methods (regex, wildcards, code blocks) only when simpler methods insufficient.
Rationale: Simple policies are easier to troubleshoot and maintain. Complexity should be justified by requirements, not pursued for its own sake.
Test Against Multiple Scenarios
Don’t test only compliant devices: Validate that the policy catches violations, not just that it recognizes compliance.
Include edge cases: Test against unusual configurations, different device models, various firmware versions.
Document test scenarios: Record which devices/configurations tested against for future reference.
Regression testing: When modifying existing policies, re-test against original scenarios to ensure changes don’t break working rules.
Use Meaningful Device Selection
Representative devices: Test devices should represent the broader population the policy will apply to.
Avoid unique snowflakes: Don’t test only against highly customized devices with non-standard configurations.
Multiple vendors if applicable: If policy applies cross-vendor, test against each vendor’s devices.
Recent configurations: Ensure test devices have current backups reflecting actual production state.
Document Testing Results
Maintain testing log: Record testing iterations, what changed, why adjustments were made.
Document edge cases: Note unusual configurations tested and how policy handles them.
Record test devices: Document which devices used for testing for future reference.
Example testing documentation:
Policy: Security-Baseline-Cisco-IOS
Testing Date: 2025-10-14

Test 1: Router-Core-01 (IOS 15.7)
- Result: All rules passed
- Notes: Baseline compliant device

Test 2: Router-Branch-03 (IOS 15.2)
- Result: SSH rule failed
- Issue: IOS 15.2 outputs "ip ssh version 2.0" not "ip ssh version 2"
- Fix: Changed to wildcard_match_single_string with "ip ssh version 2*"

Test 3: Router-Branch-03 (after fix)
- Result: All rules passed
- Notes: Wildcard accommodation works for version variations

Test 4: Router-Edge-05 (IOS 16.12)
- Result: All rules passed
- Notes: Policy works across IOS versions 15.2-16.12

Conclusion: Policy validated across IOS versions. Ready for production.
Leverage Multi-Tab Workflow
Keep test interface open: Don’t close test modal between iterations.
Edit in separate tab: Open policy definition in new tab for editing.
Rapid switching: Alt+Tab (or Cmd+Tab) between test interface and editor for quick iteration.
Context preservation: Test parameters (device, command, policy) persist across iterations, eliminating repetitive selection.
Time savings: Multi-tab workflow reduces policy refinement time by 70-80% compared to close-navigate-reopen-select-test cycles.
Test Regex and Wildcards Thoroughly
External testing first: Test regex patterns at regex101.com with PHP flavor before using in policies.
Wildcard validation: Verify wildcards (*, ?) match intended patterns without over-matching (see the sketch at the end of this subsection).
Edge case testing: Regex and wildcards can have unexpected behavior. Test extensively against various configurations.
Performance consideration: Complex regex patterns can slow policy evaluation. Balance pattern sophistication with performance.
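A quick way to validate a wildcard in both directions at once is an assertion list of lines the pattern should and should not match. A sketch, again using Python’s fnmatch as an assumed stand-in for rConfig’s wildcard behavior:

from fnmatch import fnmatch

pattern = "ip ssh version 2*"

should_match = ["ip ssh version 2", "ip ssh version 2.0"]
should_not_match = ["ip ssh version 1.99", "no ip ssh version 2"]

for line in should_match:
    assert fnmatch(line, pattern), f"under-matching: {line!r}"
for line in should_not_match:
    assert not fnmatch(line, pattern), f"over-matching: {line!r}"
print("wildcard pattern behaves as intended")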
Troubleshooting Policy Testing
Test Returns Unexpected Results
Symptom: Test results don’t match expectations (unexpected passes or failures).
Diagnosis:
- Review selected device and command: Verify you’re testing against intended configuration
- Check policy syntax: Ensure policy definition has no syntax errors
- Examine configuration content: View actual device configuration to see what’s being evaluated
- Compare policy strings to config: Character-by-character comparison identifies mismatches (see the sketch below)
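A throwaway helper makes the character-by-character comparison painless, exposing invisible differences such as case, tabs versus spaces, or trailing whitespace. A minimal sketch:

def first_mismatch(policy: str, config_line: str) -> None:
    """Print the first differing character between policy string and config line."""
    for i, (p, c) in enumerate(zip(policy, config_line)):
        if p != c:
            print(f"differ at index {i}: policy {p!r} vs config {c!r}")
            return
    if len(policy) != len(config_line):
        print(f"lengths differ: {len(policy)} vs {len(config_line)}")
    else:
        print("strings identical")

first_mismatch("ip ssh version 2", "ip ssh version 2.0")      # lengths differ: 16 vs 18
first_mismatch("Transport input ssh", "transport input ssh")  # differ at index 0: 'T' vs 't'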
Common causes and resolutions:
Case sensitivity: Policy string case doesn’t match device output
- Resolution: Adjust policy string capitalization to match exactly
Whitespace differences: Extra spaces or tabs in policy string
- Resolution: Verify spacing matches device output precisely
Configuration format variation: Device outputs slightly different format than policy expects
- Resolution: Use wildcards or regex to accommodate variations
Wrong command selected: Testing against command that doesn’t contain relevant configuration
- Resolution: Select appropriate command containing configurations policy validates
Cannot Find Configuration to Test Against
Symptom: Device or command dropdown is empty, or selected device has no configuration file.
Cause: Device hasn’t been backed up, or command hasn’t executed.
Resolution:
- Verify device is configured in rConfig and backing up successfully
- Check command exists in device’s command group
- Run manual device backup to generate configuration files
- Refresh test interface after successful backup
- Retry test
Test Interface Not Loading Results
Symptom: Test executes but results don’t display, or interface hangs.
Possible causes:
- Large configuration file causing timeout
- Policy definition syntax error preventing evaluation
- Browser cache issues
Resolution:
- Check browser console for JavaScript errors
- Try different browser
- Clear browser cache and reload
- Review policy definition syntax for errors
- Test smaller policy (fewer rules) to isolate issue
- Check application logs for server-side errors
Policy Tests Successfully But Assignment Fails
Symptom: Policy works correctly in test interface but produces different results when assigned.
Possible causes:
- Assignment using different command than testing
- Device configurations changed between test and assignment execution
- Policy definition was modified after testing
- Assignment scope includes devices not tested
Resolution:
- Verify assignment command matches command used in testing
- Ensure device has recent backup (configuration hasn’t changed)
- Confirm policy definition hasn’t been modified
- Test policy against other devices in assignment scope
- Run manual compliance check on small scope before broad deployment
Results Differ Between Test Runs
Symptom: Same policy tested against same device produces different results.
Possible causes:
- Configuration file was updated between tests (device backed up)
- Policy definition was modified
- Cache issues
Resolution:
- Check device backup history—was device backed up between tests?
- Review policy definition change history
- Force fresh backup of device
- Clear browser cache
- Re-run test with fresh data
Related Documentation
- Compliance Overview - Understanding the compliance system
- Policy Definitions - Complete guide to creating policies with all 14 methods
- Policy Assignments - Deploying policies to device populations
- Policy Compliance Results - Reviewing and acting on compliance reports
Quick Reference
Policy Development Workflow Steps
- Create draft policy with initial rules
- Access test interface (Compliance → Test Policy Definition)
- Select device to test against
- Select command output to evaluate
- Select policy definition to test
- Execute test and review results
- Open policy in new tab for editing
- Refine policy based on test results
- Save changes and return to test tab
- Re-test without closing interface
- Iterate until policy works correctly
- Test multiple scenarios (compliant, non-compliant, edge cases)
- Finalize policy and update name/description
- Create assignment (start disabled)
- Enable assignment and monitor results
Multi-Tab Workflow
Tab 1: Test Policy Definition Interface
├─ Device selected
├─ Command selected
├─ Policy selected
└─ Test results visible
Tab 2: Policy Definition Editor
├─ Edit policy rules
├─ Save changes
└─ Keep open for iterations
Workflow: Edit in Tab 2 → Save → Switch to Tab 1 → Test → Review → Repeat
Summary
The Policy Development Workflow provides a systematic, iterative approach to creating reliable policy definitions that work correctly before production deployment. The integrated test interface with multi-tab editing enables rapid refinement cycles, dramatically reducing policy development time while increasing confidence in deployment outcomes.
Key Takeaways
Test before deploy: Never deploy policies directly to production without testing against actual device configurations. Testing identifies issues early when they’re easy to fix.
Multi-tab iteration: Keep test interface open in one tab while editing policy in another. This workflow enables rapid edit-test-refine cycles measured in seconds rather than minutes.
Multiple scenario validation: Test policies against compliant devices (should pass), non-compliant devices (should fail), and edge cases (should handle correctly) before considering policy complete.
Start simple, add complexity: Begin with basic rules using simple methods. Validate these work correctly before adding regex patterns, wildcards, or code block matching.
Document testing: Record which devices tested, what results were, what changes were made, and why. This documentation aids future policy maintenance and troubleshooting.
Production rollout: Deploy tested policies first with disabled assignments on small scopes. Validate one more time before enabling broad enforcement.
With a systematic policy development workflow following a test-driven methodology, rConfig administrators can create sophisticated compliance policies with confidence that they’ll work correctly across entire device populations, maintaining configuration standards without trial and error in production.