# Configuration Testing Framework

## Overview
The OLS uCentral Client includes a comprehensive configuration testing framework that provides two-layer validation of JSON configurations:
- Schema Validation - Structural validation against the uCentral JSON schema
- Parser Testing - Implementation validation of the C parser with property tracking
This framework enables automated testing, continuous integration, and tracking of configuration feature implementation status.
## Documentation Index

This testing framework includes multiple documentation files, each serving a specific purpose:

### Primary Documentation
- `tests/config-parser/TEST_CONFIG_README.md` - Complete testing framework guide
  - Overview of the two-layer validation approach
  - Quick start and running tests
  - Property tracking system
  - Configuration-specific validators
  - Test output interpretation
  - CI/CD integration
  - Start here for understanding the testing framework
- `tests/schema/SCHEMA_VALIDATOR_README.md` - Schema validator detailed documentation
  - Standalone validator usage
  - Command-line interface
  - Programmatic API
  - Porting guide for other repositories
  - Common validation errors
  - Start here for schema validation specifics
- `tests/MAINTENANCE.md` - Maintenance procedures guide
  - Schema update procedures
  - Property database update procedures
  - Version synchronization
  - Testing after updates
  - Troubleshooting common issues
  - Start here when updating the schema or property database
- `TEST_CONFIG_PARSER_DESIGN.md` - Test framework architecture
  - Multi-layer validation design
  - Property metadata system (398 schema properties)
  - Property inspection engine
  - Test execution flow diagrams
  - Data structures and algorithms
  - Output format implementations
  - Start here for understanding the test framework internals
### Supporting Documentation

- `README.md` - Project overview and build instructions
  - Build system architecture
  - Platform abstraction layer
  - Testing framework integration
  - Deployment instructions
## Quick Reference

### Running Tests
**RECOMMENDED:** Use the test runner script (handles Docker automatically):

```sh
# Test all configurations in STUB mode (default - fast, proto.c only)
./run-config-tests.sh                                  # Human-readable output
./run-config-tests.sh --format html                    # HTML report
./run-config-tests.sh --format json                    # JSON report

# Test all configurations in PLATFORM mode (integration testing)
./run-config-tests.sh --mode platform                  # Human-readable output
./run-config-tests.sh --mode platform --format html    # HTML report

# Test single configuration
./run-config-tests.sh cfg0.json                        # Stub mode
./run-config-tests.sh --mode platform cfg0.json        # Platform mode
```
**Alternative:** Run tests directly in Docker (manual Docker management):

```sh
# Build the Docker environment first (if not already built)
make build-host-env

# Run all tests in STUB mode (default - fast)
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make test-config-full"

# Run all tests in PLATFORM mode (integration)
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make test-config-full USE_PLATFORM=brcm-sonic"

# Run individual test suites (stub mode)
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make validate-schema"
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make test-config"

# Generate test reports (stub mode)
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make test-config-html"

# Generate test reports (platform mode)
docker exec ucentral_client_build_env bash -c \
    "cd /root/ols-nos/tests/config-parser && make test-config-html USE_PLATFORM=brcm-sonic"

# Copy report files out of the container to view
docker cp ucentral_client_build_env:/root/ols-nos/tests/config-parser/test-report.html output/
```
**Alternative:** Run tests locally (may have OS-specific dependencies):

```sh
# Navigate to the test directory
cd tests/config-parser

# Run all tests in STUB mode (default)
make test-config-full

# Run all tests in PLATFORM mode
make test-config-full USE_PLATFORM=brcm-sonic

# Run individual test suites
make validate-schema    # Schema validation only
make test-config        # Parser tests only

# Generate test reports (stub mode)
make test-config-html   # HTML report (browser-viewable)
make test-config-json   # JSON report (machine-readable)
make test-config-junit  # JUnit XML (CI/CD integration)

# Generate test reports (platform mode)
make test-config-html USE_PLATFORM=brcm-sonic
make test-config-json USE_PLATFORM=brcm-sonic
```
Note: Running tests in Docker is the preferred method as it provides a consistent, reproducible environment regardless of your host OS (macOS, Linux, Windows).
### Key Files

**Test Implementation:**
- `tests/config-parser/test-config-parser.c` (3304 lines) - Parser test framework with property tracking
- `tests/config-parser/test-stubs.c` (219 lines) - Platform function stubs for stub-mode testing
- `tests/config-parser/platform-mocks/brcm-sonic.c` - gNMI/gNOI mocks for platform mode
- `tests/config-parser/platform-mocks/example-platform.c` - Example platform mocks
- `tests/schema/validate-schema.py` (649 lines) - Standalone schema validator with undefined-property detection
- `tests/config-parser/config-parser.h` - Test header exposing `cfg_parse()`
**Property Database Files:**
- `tests/config-parser/property-database-base.c` (416 lines) - proto.c parsing (398 properties: 102 implemented, 296 not yet)
- `tests/config-parser/property-database-platform-brcm-sonic.c` (419 lines) - Platform hardware application tracking
- `tests/config-parser/property-database-platform-example.c` - Example platform database
**Configuration Files:**
- `config-samples/ucentral.schema.pretty.json` - uCentral JSON schema (human-readable, single schema file)
- `config-samples/*.json` - Test configuration files (25 configs)
- `config-samples/*invalid*.json` - Negative test cases
**Build System:**
- `tests/config-parser/Makefile` - Test targets and build rules
- `run-config-tests.sh` - Test runner script (recommended)
**Production Code (Minimal Changes):**
- `src/ucentral-client/proto.c` - Added TEST_STATIC macro (2 lines changed)
- `src/ucentral-client/include/router-utils.h` - Added extern declarations (minor change)
## Features

### Schema Validation
- Validates JSON structure against official uCentral schema
- Checks property types, required fields, constraints
- Standalone tool, no dependencies on C code
- Exit codes for CI/CD integration
### Parser Testing
- Tests actual C parser implementation
- Multiple output formats (human-readable, HTML, JSON, JUnit XML)
- Interactive HTML reports with detailed analysis
- Machine-readable JSON for automation
- JUnit XML for CI/CD integration
- Validates configuration processing and struct population
- Configuration-specific validators for business logic
- Memory leak detection
- Hardware constraint validation
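The JUnit XML output follows the widely used `testsuite`/`testcase` shape. As an illustration (hypothetical test names and attributes; the framework's actual report may differ in detail), a run covering one passing config and one failing negative test might serialize as:

```xml
<testsuite name="config-parser" tests="2" failures="1">
  <!-- a config that parsed successfully -->
  <testcase classname="stub-mode" name="cfg0.json"/>
  <!-- a negative test whose expected failure was not reported -->
  <testcase classname="stub-mode" name="cfg_invalid_example.json">
    <failure message="expected parse failure was not reported"/>
  </testcase>
</testsuite>
```

Most CI systems (GitLab, Jenkins) consume this format directly for per-test pass/fail display.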
### Property Tracking System
- Database of all schema properties and their implementation status (398 canonical properties)
- Tracks which properties are parsed by which functions
- Identifies unimplemented features
- Status classification: CONFIGURED, IGNORED, SYSTEM, INVALID, Unknown
- Property usage reports across all test configurations
- Properties with source line numbers are implemented; `line_number=0` means not yet implemented
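As a sketch of that convention (toy records, not the real 398-entry database), the implemented/pending split can be derived from the line-number field alone:

```python
# Toy property records mirroring the database convention:
# a non-zero source line means the property is parsed somewhere in proto.c.
properties = [
    {"path": "interfaces.ethernet.enabled", "line_number": 1119},
    {"path": "interfaces.ethernet.speed",   "line_number": 1187},
    {"path": "interfaces.ethernet.lldp",    "line_number": 0},
]

implemented = [p["path"] for p in properties if p["line_number"] > 0]
pending     = [p["path"] for p in properties if p["line_number"] == 0]

print(f"implemented: {len(implemented)}, not yet: {len(pending)}")
```

The real database applies the same rule across all 398 canonical properties to produce the 102-implemented / 296-pending statistics quoted above.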
## Two-Layer Validation Strategy

### Why Both Layers?
Each layer catches different types of errors:
- Schema catches: Type mismatches, missing required fields, constraint violations
- Parser catches: Implementation bugs, hardware limits, cross-field dependencies
- Property tracking catches: Missing implementations, platform-specific features
See TEST_CONFIG_README.md section "Two-Layer Validation Strategy" for detailed explanation.
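The division of labor can be sketched with a toy example (hypothetical checks, not the framework's actual code): a schema-style layer that validates types, and a parser-style layer that enforces a cross-field rule no JSON schema can express.

```python
def schema_check(cfg):
    """Layer 1 (structural): types and required fields."""
    errors = []
    if not isinstance(cfg.get("unit", {}).get("hostname"), str):
        errors.append("unit.hostname must be a string")
    return errors

def parser_check(cfg):
    """Layer 2 (semantic): cross-field dependencies."""
    errors = []
    vlans = {v["id"] for v in cfg.get("vlans", [])}
    for port in cfg.get("ports", []):
        if port.get("vlan") not in vlans:
            errors.append(f"port {port['name']} references undefined VLAN {port.get('vlan')}")
    return errors

cfg = {
    "unit": {"hostname": "switch-1"},
    "vlans": [{"id": 10}],
    "ports": [{"name": "Ethernet0", "vlan": 20}],  # VLAN 20 never defined
}
print(schema_check(cfg))  # structurally fine: no errors
print(parser_check(cfg))  # the cross-field bug only layer 2 can see
```

A config like this sails through structural validation yet is rejected by the parser layer, which is exactly why both layers are run.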
## Test Coverage
Current test suite includes:
- 25 configuration files covering various features
- Positive tests (configs that should parse successfully)
- Negative tests (configs that should fail)
- Feature-specific validators for critical configurations
- Two testing modes: stub mode (fast, proto.c only) and platform mode (integration, proto.c + platform code)
- Platform stub with 54-port simulation (matches ECS4150 hardware)
### Tested Features
- Port configuration (enable/disable, speed, duplex)
- VLAN configuration and membership
- Spanning Tree Protocol (STP, RSTP, PVST, RPVST)
- IGMP Snooping
- Power over Ethernet (PoE)
- IEEE 802.1X Authentication
- DHCP Relay
- Static routing
- System configuration (timezone, hostname, etc.)
### Platform-Specific Features (Schema-Valid, Platform Implementation Required)
- LLDP (Link Layer Discovery Protocol)
- LACP (Link Aggregation Control Protocol)
- ACLs (Access Control Lists)
- DHCP Snooping
- Loop Detection
- Port Mirroring
- Voice VLAN
These features pass schema validation but show as "Unknown" in property reports, indicating they require platform-specific implementation.
## Changes from Base Repository

The testing framework was added with minimal impact to production code:

### New Files Added
- `tests/config-parser/test-config-parser.c` (3304 lines) - Complete test framework
- `tests/config-parser/test-stubs.c` (219 lines) - Platform stubs for stub mode
- `tests/config-parser/platform-mocks/brcm-sonic.c` - Platform mocks for brcm-sonic
- `tests/config-parser/platform-mocks/example-platform.c` - Platform mocks for the example platform
- `tests/config-parser/property-database-base.c` (416 lines) - Base property database (398 properties: 102 implemented, 296 not yet)
- `tests/config-parser/property-database-platform-brcm-sonic.c` (419 lines) - Platform property database
- `tests/config-parser/property-database-platform-example.c` - Example platform property database
- `tests/schema/validate-schema.py` (649 lines) - Schema validator
- `tests/config-parser/config-parser.h` - Test header
- `tests/config-parser/TEST_CONFIG_README.md` - Framework documentation
- `tests/config-parser/platform-mocks/README.md` - Platform mocks documentation
- `tests/schema/SCHEMA_VALIDATOR_README.md` - Validator documentation
- `tests/MAINTENANCE.md` - Maintenance procedures
- `tests/ADDING_NEW_PLATFORM.md` - Platform addition guide
- `tests/config-parser/Makefile` - Test build system with stub and platform modes
- `tests/tools/extract-schema-properties.py` - Extract properties from the schema
- `tests/tools/generate-database-from-schema.py` - Generate the base property database
- `tests/tools/generate-platform-database-from-schema.py` - Generate platform property databases
- `tests/README.md` - Testing framework overview
- `TESTING_FRAMEWORK.md` - This file (documentation index, in repository root)
- `TEST_CONFIG_PARSER_DESIGN.md` - Test framework architecture and design (in repository root)
- `QUICK_START_TESTING.md` - Quick start guide (in repository root)
- `TEST_RUNNER_README.md` - Test runner script documentation (in repository root)
- `run-config-tests.sh` - Test runner script (in repository root)
### Modified Files

- `src/ucentral-client/proto.c` - Added the TEST_STATIC macro pattern (2 lines):

  ```c
  /* Changed from: */
  static struct plat_cfg *cfg_parse(...);

  /* Changed to: */
  #ifdef UCENTRAL_TESTING
  #define TEST_STATIC
  #else
  #define TEST_STATIC static
  #endif
  TEST_STATIC struct plat_cfg *cfg_parse(...);
  ```

  This allows test code to call `cfg_parse()` while keeping it static in production builds.
- `src/ucentral-client/include/router-utils.h` - Added extern declarations
  - Exposed necessary functions for test stubs
- `src/ucentral-client/Makefile` - No changes (production build only)
  - Test targets live in `tests/config-parser/Makefile`
  - Clean separation between production and test code
### Configuration Files

- Added `config-samples/cfg_invalid_*.json` - Negative test cases
- Added `config-samples/ECS4150_*.json` - Feature-specific test configs
- No changes to existing valid configurations
### Zero Impact on Production
- Production builds: no functional changes; `cfg_parse()` remains static
- Test builds: `cfg_parse()` becomes visible with the `-DUCENTRAL_TESTING` flag
- No ABI changes, no performance impact
- No runtime dependencies added
## Integration with Development Workflow

### During Development
```sh
# 1. Make code changes to proto.c
vi src/ucentral-client/proto.c

# 2. Run tests using the test runner script
./run-config-tests.sh

# 3. Review the property tracking report
#    Check for unimplemented features or errors

# 4. If adding a new parser function, update the property database
vi tests/config-parser/test-config-parser.c
#    Add property entries for the new function

# 5. Create a test configuration
vi config-samples/test-new-feature.json

# 6. Retest
./run-config-tests.sh
```
### Before Committing

```sh
# Ensure all tests pass
./run-config-tests.sh

# Generate the full HTML report for review
./run-config-tests.sh --format html
open output/test-report.html

# Check property database accuracy:
# review the "Property Usage Report" section in the HTML report
# and look for unexpected "Unknown" properties
```
### In CI/CD Pipeline

```yaml
test-configurations:
  stage: test
  script:
    - ./run-config-tests.sh --format json
  artifacts:
    paths:
      - output/test-report.json
      - output/test-report.html
    when: always
```
## Property Database Management

The property database is a critical component tracking which JSON properties are parsed by which functions.

### Database Structure
```c
static struct property_metadata properties[] = {
    {
        .path = "interfaces.ethernet.enabled",
        .status = PROP_CONFIGURED,
        .source_file = "proto.c",
        .source_function = "cfg_ethernet_parse",
        .source_line = 1119,
        .notes = "Enable/disable ethernet interface"
    },
    // ... entries for all 398 schema properties (with line numbers for implemented properties) ...
};
```
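The reporting side consumes this table: any property a test configuration touches that has no database entry is classified "Unknown". A minimal Python sketch of that classification rule (toy data; the real inspection engine lives in `test-config-parser.c`):

```python
# Toy mirror of the property database: path -> status.
DATABASE = {
    "interfaces.ethernet.enabled": "CONFIGURED",
    "interfaces.ethernet.speed": "IGNORED",
}

def classify(path):
    """Properties absent from the database report as Unknown -
    the expected result for platform-specific features."""
    return DATABASE.get(path, "Unknown")

print(classify("interfaces.ethernet.enabled"))  # CONFIGURED
print(classify("interfaces.ethernet.lldp"))     # Unknown
```

This is why "Unknown" is not automatically an error: it is the correct status for features implemented only in platform repositories.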
### Key Rules
- Only track properties for functions that exist in this repository's proto.c
- Remove entries when parser functions are removed
- Add entries immediately when adding new parser functions
- Use accurate function names - different platforms may use different names
- Properties not in database show as "Unknown" - this is correct for platform-specific features
See MAINTENANCE.md for complete property database update procedures.
## Schema Management

The schema file defines which configurations are structurally valid.

### Schema Location

- `config-samples/ucentral.schema.pretty.json` - Human-readable version (the single schema file in the repository)

### Schema Source

The schema is maintained in the external ols-ucentral-schema repository.

### Schema Updates
When ols-ucentral-schema releases a new version:
1. Copy the new schema to `config-samples/`
2. Run schema validation on all test configs
3. Fix any configs that fail new requirements
4. Document breaking changes
5. Update the property database if new properties are implemented
See MAINTENANCE.md section "Schema Update Procedures" for complete process.
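When the schema changes, the property databases are regenerated via the schema-based workflow using the tools under `tests/tools/` (the exact arguments below are illustrative; the Makefile's `regenerate-property-db` and `regenerate-platform-property-db` targets and MAINTENANCE.md are authoritative):

```sh
# Step 1: extract the property list from the schema
python3 tests/tools/extract-schema-properties.py \
    config-samples/ucentral.schema.pretty.json

# Step 2: regenerate the base property database
python3 tests/tools/generate-database-from-schema.py

# Step 3: regenerate the per-platform databases
python3 tests/tools/generate-platform-database-from-schema.py
```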
## Platform-Specific Repositories
This is the base repository providing the core framework. Platform-specific repositories (like Edgecore EC platform) can:
- Fork the test framework - Copy test files to their repository
- Extend property database - Add entries for platform-specific parser functions
- Add platform configs - Create configs testing platform features
- Maintain separate tracking - Properties "Unknown" in base become "CONFIGURED" in platform
### Example: LLDP Property Status

In the base repository (this repo):

```text
Property: interfaces.ethernet.lldp
Status: Unknown (not in property database)
Note: May require platform-specific implementation
```

In the Edgecore EC platform repository:

```text
Property: interfaces.ethernet.lldp
Parser: cfg_ethernet_lldp_parse()
Status: CONFIGURED
Note: Per-interface LLDP transmit/receive configuration
```

Each platform tracks only the properties it actually implements.
## Troubleshooting

### Common Issues
**Tests fail in Docker but pass locally:**
- Check that the schema file exists in the container
- Verify paths are correct in the container environment
- Rebuild the container: `make build-host-env`

**Property shows as "Unknown" when it should be CONFIGURED:**
- Verify the parser function exists: `grep "function_name" proto.c`
- Check that the property path matches the JSON exactly
- Ensure the property entry is in the `properties[]` array

**Schema validation fails for a valid config:**
- The schema may be outdated - check the version
- The config may use vendor extensions not in the base schema
- Validate against a specific schema: `./validate-schema.py config.json --schema /path/to/schema.json`

See the MAINTENANCE.md "Troubleshooting" section for the complete troubleshooting guide.
## Documentation Maintenance
When updating the testing framework:
- Update relevant documentation:
  - New features → TEST_CONFIG_README.md
  - Schema changes → MAINTENANCE.md + SCHEMA_VALIDATOR_README.md
  - Property database changes → MAINTENANCE.md + TEST_CONFIG_README.md
  - Build changes → README.md
- Keep version information current:
  - Update compatibility matrices
  - Document breaking changes
  - Maintain changelogs
- Update examples:
  - Refresh command output examples
  - Update property counts
  - Keep test results current
## Contributing
When contributing to the testing framework:
- Maintain property database accuracy - Update when changing parser functions
- Add test configurations - Create configs demonstrating new features
- Update documentation - Keep docs synchronized with code changes
- Follow conventions - Use established patterns for validators and property entries
- Test thoroughly - Run full test suite before committing
## License
BSD-3-Clause (same as parent project)
## See Also
- `tests/config-parser/TEST_CONFIG_README.md` - Complete testing framework guide
- `TEST_CONFIG_PARSER_DESIGN.md` - Test framework architecture and design
- `tests/schema/SCHEMA_VALIDATOR_README.md` - Schema validator documentation
- `tests/MAINTENANCE.md` - Update procedures and troubleshooting
- `TEST_RUNNER_README.md` - Test runner script documentation
- `QUICK_START_TESTING.md` - Quick start guide
- `README.md` - Project overview and build instructions
- ols-ucentral-schema repository - Official schema source