# Development Partnership Guidelines

## Core Development Principles

### AI Context Efficiency

CRITICAL: All development practices and choices must favor the most context-efficient interaction with another AI. Any AI should be able to understand this application with minimal prompting.
### Codebase Integrity Rules

- Justify every new file and folder as needed for the final production application.
- Never invent things that are not part of the actual project.
- Never skip or ignore existing system architecture.
- Be precise and respectful of the current codebase.
- Delete old code when replacing it.
- Use meaningful names: `userID`, not `id`.
## Docker-First Implementation Strategy

### 1. Package.json Updates Only

File: `frontend/package.json`

- Add `"{package}": "{version}"` to `dependencies`.
- No `npm install` needed; dependencies are installed during the container rebuild.
- Testing: run `make rebuild`, then verify the container starts.
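For example, adding a date library might look like the fragment below (the package name and version are illustrative placeholders, not actual project dependencies):

```json
{
  "dependencies": {
    "date-fns": "^3.6.0"
  }
}
```

After editing, `make rebuild` installs the dependency inside the container image.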
### 2. Container-Validated Development Workflow (Production-only)

```shell
# After each change:
make rebuild   # Rebuilds containers with new dependencies
make logs      # Monitor for build/runtime errors
```
### 3. Docker-Tested Component Development (Production-only)

- All testing happens in containers: use `make shell-frontend` for debugging.
- No dev servers; production builds are served by nginx.
- Changes require a rebuild to appear in the production containers.
## Quality Standards

### Automated Checks Are Mandatory

ALL hook issues are BLOCKING: every check must pass.

- No errors, no formatting issues, no linting problems. Zero tolerance.
- These are not suggestions. Fix ALL issues before continuing.
### Code Completion Criteria

Our code is complete when:

- All linters pass with zero issues
- All tests pass
- The feature works end-to-end
- Old code is deleted
## AI Collaboration Strategy

### Use Multiple Agents

Leverage subagents aggressively for better results:

- Spawn agents to explore different parts of the codebase in parallel.
- Use one agent to write tests while another implements features.
- Delegate research tasks: "I'll have an agent investigate the database schema while I analyze the API structure."
- For complex refactors: one agent identifies changes, another implements them.
### Reality Checkpoints

Stop and validate at these moments:

- After implementing a complete feature
- Before starting a new major component
- When something feels wrong
- Before declaring "done"
## Performance & Security Standards

### Measure First

- No premature optimization.
- Benchmark before claiming something is faster.
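The measure-first rule can be sketched with a minimal timing harness (assuming a Node.js runtime, which the repo's npm/Jest tooling implies; the `benchmark` helper and the compared snippets are illustrative):

```typescript
import { performance } from "node:perf_hooks";

// Average wall-clock time per call, in milliseconds. A rough sketch:
// a real comparison should also handle warm-up and run-to-run variance.
function benchmark(fn: () => void, iterations = 1_000): number {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  return (performance.now() - start) / iterations;
}

// Measure both candidates before claiming either is faster.
const joinMs = benchmark(() => Array.from({ length: 100 }, (_, i) => i).join(","));
const loopMs = benchmark(() => {
  let s = "";
  for (let i = 0; i < 100; i++) s += (s === "" ? "" : ",") + i;
});
console.log({ joinMs, loopMs });
```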
### Security Always

- Validate all inputs.
- Use a cryptographically secure source of randomness (in Node, the `crypto` module, never `Math.random`).
- Use prepared statements for SQL; never concatenate user input into query text.
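A sketch of the prepared-statement rule, using the `{ text, values }` query-config shape that node-postgres accepts (the table, columns, and helper name are illustrative, and whether the backend actually uses node-postgres is an assumption):

```typescript
// User input travels in `values` and is never spliced into the SQL
// text, so it cannot change the shape of the statement.
function findVehicleQuery(userId: string, vin: string) {
  return {
    text: "SELECT * FROM vehicles WHERE user_id = $1 AND vin = $2",
    values: [userId, vin],
  };
}

// BAD - never build SQL by concatenation:
// const q = `SELECT * FROM vehicles WHERE vin = '${vin}'`;

const query = findVehicleQuery("user-123", "1HGCM82633A004352");
```

Passed to a driver call such as `client.query(query)`, the statement and its parameters are sent separately, which is what defeats injection.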
## AI Loading Context Strategies

### For AI Assistants: Instant Codebase Understanding

To efficiently understand and maintain this codebase, follow this exact sequence:

#### 1. Load Core Context (Required - 2 minutes)

Read these files in order:

1. `.ai/context.json` - loading strategies and feature metadata
2. `docs/README.md` - documentation navigation hub
#### 2. For Specific Tasks

##### Working on Application Features

- Load the entire feature directory: `backend/src/features/[feature-name]/`.
- Start with its `README.md` for the complete API and business rules.
- Everything needed is in this single directory.
- Remember: features are modules within a single application service, not independent microservices.
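A hypothetical sketch of how a feature module consumes a platform service over HTTP (the endpoint path and function names are invented for illustration; real endpoints are documented in the feature README and `docs/PLATFORM-SERVICES.md`):

```typescript
// Pure helper so the URL shape is testable without a network call.
export function modelsPath(makeId: string): string {
  return `/api/makes/${encodeURIComponent(makeId)}/models`;
}

// Features call platform services via HTTP APIs; they never import
// platform-service code directly.
export async function fetchModels(baseUrl: string, makeId: string): Promise<unknown> {
  const res = await fetch(baseUrl + modelsPath(makeId));
  if (!res.ok) throw new Error(`platform service responded ${res.status}`);
  return res.json();
}
```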
##### Working on Platform Services

- Load `docs/PLATFORM-SERVICES.md` for the complete service architecture:
  - Hierarchical vehicle API patterns
  - Service-to-service communication
  - Platform service deployment and operations
##### Cross-Service Work

- Load the platform service docs plus the consuming feature's documentation.
##### Database Work

- Application DB: load `docs/DATABASE-SCHEMA.md` for the app schema.
- Platform services: load `docs/PLATFORM-SERVICES.md` for service schemas.
##### Testing Work

- Load `docs/TESTING.md` for the Docker-based testing workflow.
- Only use Docker containers for testing; never install local tools that do not already exist.
- The frontend now uses Jest (like the backend); `make test` runs backend + frontend tests.
- Jest config file: `frontend/jest.config.ts` (TypeScript configuration).
- Only the vehicles feature has implemented tests; other features have scaffolded test directories.
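The shape of a Jest spec for a feature helper is sketched below (`normalizeVin` is a hypothetical function, not an existing project export; the `typeof describe` guard simply lets the snippet also run as a plain script outside Jest):

```typescript
declare const describe: any, it: any, expect: any; // provided by Jest at runtime

// Hypothetical helper under test.
export function normalizeVin(vin: string): string {
  return vin.trim().toUpperCase();
}

// Jest spec shape; the guard keeps the file runnable outside Jest too.
if (typeof describe === "function") {
  describe("normalizeVin", () => {
    it("trims whitespace and uppercases", () => {
      expect(normalizeVin(" 1hgcm82633a004352 ")).toBe("1HGCM82633A004352");
    });
  });
}
```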
## Architecture Context for AI

### Hybrid Platform Architecture

MotoVaultPro uses a hybrid architecture: MVP Platform Services are true microservices, while the application is a modular monolith composed of feature capsules. Application features in `backend/src/features/[name]/` are self-contained modules within a single service that consumes platform services via HTTP APIs.
### Key Principles for AI Understanding

- Production-Only: all services use production builds and configuration.
- Docker-First: all development happens in containers; no local installs.
- Platform Service Independence: platform services are independent microservices.
- Feature Capsule Organization: application features are self-contained modules within a monolith.
- Hybrid Deployment: platform services deploy independently; application features deploy together.
- Service Boundaries: clear separation between platform microservices and the application monolith.
- User-Scoped Data: all application data is isolated by `user_id`.
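The user-scoped-data principle can be sketched as a small query builder that makes the scope impossible to omit (the table name and helper are illustrative; `table` must always be a code-level constant, never user input, since it is interpolated into the SQL text):

```typescript
// Every application query filters by user_id; centralizing the WHERE
// clause makes the scope hard to forget. The user id itself is
// parameterized, consistent with the SQL rules above.
function userScopedSelect(table: string, userId: string) {
  return {
    text: `SELECT * FROM ${table} WHERE user_id = $1`,
    values: [userId],
  };
}

const q = userScopedSelect("maintenance_logs", "user-123");
// q.text:   SELECT * FROM maintenance_logs WHERE user_id = $1
// q.values: ["user-123"]
```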
Common AI Tasks
# Run all migrations (inside containers)
make migrate
# Run all tests (backend + frontend) inside containers
make test
# Run specific application feature tests (backend)
make shell-backend
npm test -- features/vehicles
# Run frontend tests only (inside disposable node container)
make test-frontend
# View logs (all services)
make logs
# Container shell access
make shell-backend # Application service
## Never Use Emojis

Maintain professional documentation standards without emoji usage.

## Mobile + Desktop Requirement

ALL features MUST be implemented and tested on BOTH mobile and desktop. This is a hard requirement that cannot be skipped. Every component, page, and feature needs responsive design and mobile-first considerations.