# NIP Test Suite

Comprehensive test suite for the NIP bootstrap system.

## Quick Start

Run all tests:

```bash
cd nip/tests
./run_all_tests.sh
```
## Test Structure

### Unit Tests

Individual component tests written in Nim:

- `test_recipes.nim` - Recipe parsing and validation
- `test_bootstrap_integration.nim` - Component integration tests

Run unit tests:

```bash
nim c -r test_recipes.nim
nim c -r test_bootstrap_integration.nim
```
### Integration Tests

Tests that verify components work together:

- `test_bootstrap_integration.nim` - Full integration test suite
  - RecipeManager initialization
  - Recipe loading and parsing
  - DownloadManager functionality
  - Checksum verification
  - InstallationManager operations
  - Archive extraction
  - Script execution
  - Error handling
  - Caching

Run integration tests:

```bash
nim c -r test_bootstrap_integration.nim
```
### End-to-End Tests

Complete workflow tests using the CLI:

- `test_e2e_bootstrap.sh` - E2E bootstrap flow tests
  - CLI command testing
  - Bootstrap list/info/recipes
  - Recipe validation
  - Tool detection
  - Container runtime detection
  - Documentation verification

Run E2E tests:

```bash
./test_e2e_bootstrap.sh
```
### Bootstrap Flow Tests

Real-world bootstrap scenarios:

- `test_bootstrap_flow.sh` - Complete bootstrap workflows
  - First-time installation
  - Tool verification
  - Update scenarios
  - Error recovery

Run flow tests:

```bash
./test_bootstrap_flow.sh
```
## Test Runner

The master test runner orchestrates all tests:

```bash
./run_all_tests.sh
```

This will:

1. Check prerequisites (Nim, dependencies)
2. Build NIP if needed
3. Run all unit tests
4. Run integration tests
5. Run E2E tests
6. Validate recipes
7. Validate scripts
8. Check documentation
9. Generate a comprehensive report
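The orchestration above can be sketched as a small driver loop. The snippet below is a simplified, hypothetical stand-in for what `run_all_tests.sh` does (the real script adds prerequisite checks and report generation); the `run_test` helper and the `true` stand-in commands are illustrative:

```shell
#!/usr/bin/env bash
# Minimal sketch of a test-runner loop: each test's output is captured
# to a per-run results directory, and a pass/fail tally is printed.
set -u

RESULTS_DIR="/tmp/nip-test-results-$$"
mkdir -p "$RESULTS_DIR"

passed=0
failed=0

run_test() {
    local name="$1"; shift
    # Run the remaining arguments as a command, logging all output.
    if "$@" > "$RESULTS_DIR/$name.log" 2>&1; then
        echo "PASS: $name"
        passed=$((passed + 1))
    else
        echo "FAIL: $name (see $RESULTS_DIR/$name.log)"
        failed=$((failed + 1))
    fi
}

run_test "unit-recipes" true   # stand-in for: nim c -r test_recipes.nim
run_test "e2e-bootstrap" true  # stand-in for: ./test_e2e_bootstrap.sh

echo "Passed: $passed, Failed: $failed"
[ "$failed" -eq 0 ]
```

The helper's exit status makes the runner itself usable as a CI step: a nonzero tally fails the pipeline.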
## Test Results

Test results are saved to `/tmp/nip-test-results-<pid>/`:

```
/tmp/nip-test-results-12345/
├── build.log                    # NIP build log
├── unit-test_recipes.nim.log    # Unit test logs
├── Integration Tests.log        # Integration test log
├── End-to-End Tests.log         # E2E test log
└── Bootstrap Flow Test.log      # Flow test log
```
## Running Specific Tests

### Recipe Parser Tests

```bash
nim c -r test_recipes.nim
```

### Integration Tests Only

```bash
nim c -r test_bootstrap_integration.nim
```

### E2E Tests Only

```bash
./test_e2e_bootstrap.sh
```

### Recipe Validation Only

```bash
# Validate all recipes
for recipe in ../recipes/*/minimal-*.kdl; do
    echo "Validating: $recipe"
    # Add validation command here
done
```
## Test Coverage

### Phase 2: Recipe System

- ✅ Recipe parsing (KDL format)
- ✅ Recipe validation
- ✅ Platform selection
- ✅ Download management
- ✅ Checksum verification (Blake2b-512)
- ✅ Archive extraction
- ✅ Script execution
- ✅ Installation verification
- ✅ Error handling
- ✅ Caching
- ✅ CLI integration
- ✅ Progress reporting
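The Blake2b-512 checksum step can be exercised by hand with GNU coreutils' `b2sum`, which emits BLAKE2b digests at the full 512-bit length by default. The file paths and the `verify_checksum` helper below are illustrative, not part of NIP itself:

```shell
# Record a digest for an artifact, then verify it the way a checksum
# check would (illustrative file names; requires GNU coreutils b2sum).
printf 'release payload' > /tmp/nip-demo-artifact
expected=$(b2sum /tmp/nip-demo-artifact | awk '{print $1}')

verify_checksum() {
    # Compare a file's BLAKE2b-512 digest against an expected value.
    local file="$1" want="$2"
    local got
    got=$(b2sum "$file" | awk '{print $1}')
    [ "$got" = "$want" ]
}

verify_checksum /tmp/nip-demo-artifact "$expected" && echo "checksum OK"
```

Tampering with the artifact after recording the digest makes `verify_checksum` return nonzero, which is the failure path the checksum-mismatch tests exercise.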
### Components Tested

- ✅ RecipeManager
- ✅ DownloadManager
- ✅ InstallationManager
- ✅ RecipeParser
- ✅ CLI commands
- ✅ Installation scripts
- ✅ Verification scripts

### Scenarios Tested

- ✅ Fresh installation
- ✅ Tool detection
- ✅ Missing tools handling
- ✅ Container runtime detection
- ✅ Recipe updates
- ✅ Error recovery
- ✅ Checksum mismatches
- ✅ Invalid recipes
- ✅ Missing files
## Prerequisites

### Required

- Nim compiler (1.6.0+)
- Bash shell
- Standard Unix tools (tar, gzip, etc.)

### Optional

- Podman or Docker (for container tests)
- Network access (for recipe updates)
## Continuous Integration

The test suite is designed for CI/CD integration:

```yaml
# Example GitHub Actions
- name: Run NIP Tests
  run: |
    cd nip/tests
    ./run_all_tests.sh
```

```yaml
# Example GitLab CI
test:
  script:
    - cd nip/tests
    - ./run_all_tests.sh
  artifacts:
    paths:
      - /tmp/nip-test-results-*/
    when: always
```
## Writing New Tests

### Unit Test Template

```nim
## Test description
import std/[unittest, os]
import ../src/nimpak/build/your_module

suite "Your Module Tests":
  setup:
    # Setup code
    discard

  teardown:
    # Cleanup code
    discard

  test "should do something":
    # Test code
    check someCondition == true
```
### Integration Test Template

```nim
proc testYourFeature(): bool =
  let start = startTest("Your feature")

  try:
    # Test code
    let result = yourFunction()

    if result == expected:
      endTest("Your feature", start, true)
      return true
    else:
      endTest("Your feature", start, false, "Unexpected result")
      return false
  except Exception as e:
    endTest("Your feature", start, false, e.msg)
    return false
```
### E2E Test Template

```bash
test_your_feature() {
    log_info "Test: Your feature"

    output=$($NIP_BIN your command 2>&1 || true)

    if echo "$output" | grep -q "expected output"; then
        log_success "Your feature works"
    else
        log_error "Your feature failed"
    fi
}
```
## Debugging Tests

### Enable Verbose Output

```bash
# Nim tests
nim c -r --verbosity:2 test_recipes.nim

# Shell tests
bash -x test_e2e_bootstrap.sh
```

### Run a Single Test

```bash
# Nim: std/unittest filters tests by name (or glob) passed as a
# command-line argument to the compiled test program
nim c -r test_recipes.nim "specific test name"

# Shell: edit the script to comment out the other tests
```

### Check Test Logs

```bash
# View the latest test results
ls -lt /tmp/nip-test-results-*/

# View a specific log
cat /tmp/nip-test-results-12345/Integration\ Tests.log
```
## Test Maintenance

### Adding New Tests

1. Create the test file in `nip/tests/`
2. Follow the naming convention: `test_*.nim` or `test_*.sh`
3. Add it to `run_all_tests.sh` if needed
4. Update this README
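As a sketch of step 3, a conventional runner registers each script through a small helper; `run_shell_test` below is hypothetical, and the actual `run_all_tests.sh` may use a different pattern:

```shell
# Hypothetical registration pattern for run_all_tests.sh: each test
# script is invoked through a helper that reports and propagates
# its pass/fail status.
run_shell_test() {
    local script="$1"
    chmod +x "$script" 2>/dev/null || true
    if bash "$script"; then
        echo "PASS: $script"
    else
        echo "FAIL: $script"
        return 1
    fi
}

# Register the new test alongside the existing ones, e.g.:
# run_shell_test ./test_e2e_bootstrap.sh
# run_shell_test ./test_your_feature.sh
```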
### Updating Tests

When adding new features:

1. Add unit tests for new components
2. Add integration tests for component interactions
3. Add E2E tests for user-facing features
4. Update the test documentation

### Test Data

Test data and fixtures:

- Recipes: `../recipes/*/minimal-*.kdl`
- Scripts: `../recipes/*/scripts/*.sh`
- Schemas: `../recipes/schema/recipe.json`
## Troubleshooting

### Tests Fail to Build

```bash
# Check the Nim installation
nim --version

# Clean and rebuild
rm -rf nimcache/
nim c test_recipes.nim
```

### Tests Fail to Run

```bash
# Check permissions
chmod +x test_*.sh

# Check the NIP binary
ls -l ../nip

# Rebuild NIP
cd ..
nim c -d:release nip.nim
```

### Network-Dependent Tests Fail

Some tests require network access:

- Recipe updates
- Git repository cloning

Run offline-safe tests only:

```bash
# Skip network tests
OFFLINE=1 ./run_all_tests.sh
```
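Inside a test script, the `OFFLINE` flag can be honored with a small guard. This is a sketch of the pattern (with a hypothetical `network_test` function), not necessarily how the existing scripts implement it:

```shell
# Skip network-dependent tests when OFFLINE=1 is set in the environment.
network_test() {
    if [ "${OFFLINE:-0}" = "1" ]; then
        echo "SKIP: network test (OFFLINE=1)"
        return 0
    fi
    echo "running network test"
    # ... actual network-dependent checks would go here ...
}
```

Returning 0 on skip keeps the overall suite green while still recording that the test did not actually run.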
## Performance

Approximate test suite execution times:

- Unit tests: ~5 seconds
- Integration tests: ~10 seconds
- E2E tests: ~15 seconds
- Full suite: ~30-45 seconds
## Contributing

When contributing tests:

1. Follow existing test patterns
2. Add descriptive test names
3. Include clear error messages
4. Clean up test artifacts
5. Update the documentation
## Future Enhancements

Planned test improvements:

- [ ] Code coverage reporting
- [ ] Performance benchmarks
- [ ] Stress testing
- [ ] Multi-platform CI
- [ ] Container-based tests
- [ ] Mock external dependencies
- [ ] Parallel test execution
## Summary

The NIP test suite provides comprehensive coverage of the bootstrap system:

- **Unit tests** verify individual components
- **Integration tests** verify that components work together
- **E2E tests** verify user workflows
- **Master runner** orchestrates everything

Run `./run_all_tests.sh` to execute the complete test suite.