Merge pull request 'fix: Implement tiered backup retention classification (refs #6)' (#21) from issue-6-tiered-backup-retention into main
All checks were successful

Reviewed-on: #21
This commit was merged in pull request #21.
2026-01-11 04:07:26 +00:00
12 changed files with 681 additions and 114 deletions


@@ -0,0 +1,30 @@
# backup/
Complete backup and restore system with tiered retention.
## Files
| File | What | When to read |
| ---- | ---- | ------------ |
| `README.md` | Feature documentation | Understanding backup architecture |
## Subdirectories
| Directory | What | When to read |
| --------- | ---- | ------------ |
| `api/` | HTTP endpoints, validation | API changes |
| `domain/` | Business logic, services | Core backup/retention logic |
| `data/` | Repository, database queries | Database operations |
| `jobs/` | Scheduled job handlers | Cron job modifications |
| `migrations/` | Database schema | Schema changes |
| `tests/` | Unit and integration tests | Adding or modifying tests |
## Key Files
| File | What | When to read |
| ---- | ---- | ------------ |
| `domain/backup.types.ts` | Types, constants, TIERED_RETENTION | Type definitions |
| `domain/backup.service.ts` | Core backup operations | Creating/managing backups |
| `domain/backup-classification.service.ts` | Tiered retention classification | Category/expiration logic |
| `domain/backup-retention.service.ts` | Retention enforcement | Deletion logic |
| `data/backup.repository.ts` | Database queries | Data access patterns |


@@ -19,11 +19,12 @@ backup/
    backup.controller.ts             # Request handlers
    backup.validation.ts             # Zod schemas
  domain/                            # Business logic
    backup.types.ts                  # TypeScript types and constants
    backup.service.ts                # Core backup operations
    backup-archive.service.ts        # Archive creation
    backup-restore.service.ts        # Restore operations
    backup-retention.service.ts      # Tiered retention enforcement
    backup-classification.service.ts # Backup category classification
  data/                              # Data access
    backup.repository.ts             # Database queries
  jobs/                              # Scheduled jobs
@@ -31,6 +32,10 @@ backup/
    backup-cleanup.job.ts            # Retention cleanup
  migrations/                        # Database schema
    001_create_backup_tables.sql
    002_add_retention_categories.sql # Tiered retention columns
  tests/                             # Test files
    unit/
      backup-classification.service.test.ts # Classification tests
```
## API Endpoints
@@ -122,11 +127,45 @@ Scheduled backups use Redis distributed locking to prevent duplicate backups whe
- Lock TTL: 5 minutes (auto-release if container crashes)
- Only one container creates the backup; others skip
**Retention cleanup (tiered):**
- Runs immediately after each successful scheduled backup
- Uses tiered classification: each backup can belong to multiple categories
- A backup is only deleted when it exceeds ALL applicable category quotas
- Also runs globally at 4 AM daily as a safety net
## Tiered Retention System
Backups are classified by their creation timestamp into categories:
| Category | Qualification | Retention Count |
|----------|--------------|-----------------|
| hourly | All backups | 8 |
| daily | First backup at midnight UTC | 7 |
| weekly | First backup on Sunday at midnight UTC | 4 |
| monthly | First backup on 1st of month at midnight UTC | 12 |
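The qualification rules above can be sketched as a pure function (a standalone illustration of the rules in this table; the real implementation is `classifyBackup` in `backup-classification.service.ts`):

```typescript
type Category = 'hourly' | 'daily' | 'weekly' | 'monthly';

// Every backup is 'hourly'; midnight UTC adds 'daily'; Sunday midnight
// adds 'weekly'; midnight on the 1st of the month adds 'monthly'.
function classify(ts: Date): Category[] {
  const categories: Category[] = ['hourly'];
  if (ts.getUTCHours() === 0) {
    categories.push('daily');
    if (ts.getUTCDay() === 0) categories.push('weekly'); // 0 = Sunday
    if (ts.getUTCDate() === 1) categories.push('monthly');
  }
  return categories;
}

// Sunday 2026-02-01 at midnight UTC qualifies for every category.
const all = classify(new Date('2026-02-01T00:00:00.000Z'));
// → ['hourly', 'daily', 'weekly', 'monthly']
```

Note that only the hour is checked, so a backup at 00:30 UTC still counts as the day's first backup.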
**Multi-category classification:**
- A backup can belong to multiple categories simultaneously
- Example: Backup at midnight on Sunday, January 1st qualifies as: hourly + daily + weekly + monthly
**Retention logic:**
```
For each category (hourly, daily, weekly, monthly):
1. Get all backups with this category
2. Keep top N (sorted by started_at DESC)
3. Add to protected set
A backup is deleted ONLY if it's NOT in the protected set
(i.e., exceeds quota for ALL its categories)
```
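The protected-set pass above can be demonstrated on sample data (a hypothetical standalone sketch using the `TIERED_RETENTION` counts from this PR; the types and function names here are illustrative, not the service's actual API):

```typescript
type Category = 'hourly' | 'daily' | 'weekly' | 'monthly';

const RETENTION: Record<Category, number> = { hourly: 8, daily: 7, weekly: 4, monthly: 12 };

interface Backup { id: string; startedAt: Date; categories: Category[] }

// A backup survives if it is in the top-N (newest first) for ANY of its categories.
function protectedIds(backups: Backup[]): Set<string> {
  const sorted = [...backups].sort((a, b) => b.startedAt.getTime() - a.startedAt.getTime());
  const keep = new Set<string>();
  for (const category of Object.keys(RETENTION) as Category[]) {
    sorted
      .filter(b => b.categories.includes(category))
      .slice(0, RETENTION[category])
      .forEach(b => keep.add(b.id));
  }
  return keep;
}

// Ten hourly-only backups, one per hour: only the newest 8 are protected.
const backups: Backup[] = Array.from({ length: 10 }, (_, i) => ({
  id: `b${i}`,
  startedAt: new Date(Date.UTC(2026, 0, 7, i)), // hours 0..9 on 2026-01-07
  categories: ['hourly'] as Category[],
}));
const kept = protectedIds(backups);
// b0 and b1 (the two oldest) exceed the hourly quota and would be deleted.
```

A backup outside the hourly window but inside, say, the weekly window stays protected, which is exactly why deletion requires exceeding ALL applicable quotas.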
**Expiration calculation:**
- Each backup's `expires_at` is calculated based on its longest retention period
- Monthly backup: 12 months from creation
- Weekly-only backup: 4 weeks from creation
- Daily-only backup: 7 days from creation
- Hourly-only backup: 8 hours from creation
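As a worked example of the longest-period rule (a standalone sketch; the authoritative logic is `calculateExpiration` in `backup-classification.service.ts`):

```typescript
// A backup taken Sunday 2026-02-01 00:00 UTC qualifies for all four
// categories, so the longest retention (monthly = 12 months) wins.
const createdAt = new Date(Date.UTC(2026, 1, 1)); // 2026-02-01T00:00:00Z
const expires = new Date(createdAt);
expires.setUTCMonth(expires.getUTCMonth() + 12); // monthly beats weekly/daily/hourly
// expires → 2027-02-01T00:00:00Z
```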
See `backend/src/core/scheduler/README.md` for the distributed locking pattern.
### Admin Routes


@@ -12,6 +12,7 @@ import {
  BackupType,
  BackupStatus,
  BackupMetadata,
  BackupCategory,
  ListBackupsParams,
  CRON_EXPRESSIONS,
} from '../domain/backup.types';
@@ -54,6 +55,8 @@ export class BackupRepository {
      completedAt: row.completed_at ? new Date(row.completed_at) : null,
      createdBy: row.created_by,
      metadata: row.metadata as BackupMetadata,
      categories: (row.categories || ['hourly']) as BackupCategory[],
      expiresAt: row.expires_at ? new Date(row.expires_at) : null,
    };
  }
@@ -261,11 +264,13 @@ export class BackupRepository {
    fileSizeBytes: number;
    createdBy?: string | null;
    metadata?: BackupMetadata;
    categories?: BackupCategory[];
    expiresAt?: Date | null;
  }): Promise<BackupHistory> {
    const result = await this.pool.query(
      `INSERT INTO backup_history
        (schedule_id, backup_type, filename, file_path, file_size_bytes, status, created_by, metadata, categories, expires_at)
       VALUES ($1, $2, $3, $4, $5, 'in_progress', $6, $7, $8, $9)
       RETURNING *`,
      [
        data.scheduleId || null,
@@ -275,6 +280,8 @@ export class BackupRepository {
        data.fileSizeBytes,
        data.createdBy || null,
        JSON.stringify(data.metadata || {}),
        data.categories || ['hourly'],
        data.expiresAt || null,
      ]
    );
    return this.mapHistoryRow(result.rows[0]);
@@ -351,6 +358,38 @@ export class BackupRepository {
    return result.rows.map(this.mapHistoryRow);
  }
// ============================================
// Tiered Retention Operations
// ============================================
/**
* Gets all completed backups that have a specific category.
* Sorted by started_at DESC (newest first).
*/
async getBackupsByCategory(category: BackupCategory): Promise<BackupHistory[]> {
const result = await this.pool.query(
`SELECT * FROM backup_history
WHERE status = 'completed'
AND $1 = ANY(categories)
ORDER BY started_at DESC`,
[category]
);
return result.rows.map(row => this.mapHistoryRow(row));
}
/**
* Gets all completed backups for tiered retention processing.
* Returns backups sorted by started_at DESC.
*/
async getAllCompletedBackups(): Promise<BackupHistory[]> {
const result = await this.pool.query(
`SELECT * FROM backup_history
WHERE status = 'completed'
ORDER BY started_at DESC`
);
return result.rows.map(row => this.mapHistoryRow(row));
}
  // ============================================
  // Settings Operations
  // ============================================


@@ -0,0 +1,106 @@
/**
* @ai-summary Service for classifying backups into tiered retention categories
* @ai-context Pure functions for timestamp-based classification, no database dependencies
*/
import { BackupCategory, TIERED_RETENTION } from './backup.types';
/**
* Classifies a backup by its timestamp into retention categories.
* A backup can belong to multiple categories simultaneously.
*
* Categories:
* - hourly: All backups
* - daily: First backup at midnight UTC (hour = 0)
* - weekly: First backup on Sunday at midnight UTC
* - monthly: First backup on 1st of month at midnight UTC
*/
export function classifyBackup(timestamp: Date): BackupCategory[] {
const categories: BackupCategory[] = ['hourly'];
const utcHour = timestamp.getUTCHours();
const utcDay = timestamp.getUTCDate();
const utcDayOfWeek = timestamp.getUTCDay(); // 0 = Sunday
// Midnight UTC qualifies for daily
if (utcHour === 0) {
categories.push('daily');
// Sunday at midnight qualifies for weekly
if (utcDayOfWeek === 0) {
categories.push('weekly');
}
// 1st of month at midnight qualifies for monthly
if (utcDay === 1) {
categories.push('monthly');
}
}
return categories;
}
/**
* Calculates the expiration date based on the backup's categories.
* Uses the longest retention period among all applicable categories.
*
* Retention periods are count-based in the actual cleanup, but for display
* we estimate based on typical backup frequency:
* - hourly: 8 hours (8 backups * 1 hour)
* - daily: 7 days (7 backups * 1 day)
* - weekly: 4 weeks (4 backups * 1 week)
* - monthly: 12 months (12 backups * 1 month)
*/
export function calculateExpiration(
categories: BackupCategory[],
timestamp: Date
): Date {
const expirationDate = new Date(timestamp);
if (categories.includes('monthly')) {
expirationDate.setUTCMonth(expirationDate.getUTCMonth() + TIERED_RETENTION.monthly);
} else if (categories.includes('weekly')) {
expirationDate.setUTCDate(expirationDate.getUTCDate() + TIERED_RETENTION.weekly * 7);
} else if (categories.includes('daily')) {
expirationDate.setUTCDate(expirationDate.getUTCDate() + TIERED_RETENTION.daily);
} else {
// Hourly only - 8 hours
expirationDate.setUTCHours(expirationDate.getUTCHours() + TIERED_RETENTION.hourly);
}
return expirationDate;
}
/**
* Checks if a backup timestamp represents the first backup of the day (midnight UTC).
*/
export function isFirstBackupOfDay(timestamp: Date): boolean {
return timestamp.getUTCHours() === 0;
}
/**
* Checks if a timestamp falls on a Sunday.
*/
export function isSunday(timestamp: Date): boolean {
return timestamp.getUTCDay() === 0;
}
/**
* Checks if a timestamp falls on the first day of the month.
*/
export function isFirstDayOfMonth(timestamp: Date): boolean {
return timestamp.getUTCDate() === 1;
}
/**
* Classifies a backup and calculates its expiration in one call.
* Convenience function for backup creation flow.
*/
export function classifyAndCalculateExpiration(timestamp: Date): {
categories: BackupCategory[];
expiresAt: Date;
} {
const categories = classifyBackup(timestamp);
const expiresAt = calculateExpiration(categories, timestamp);
return { categories, expiresAt };
}


@@ -10,6 +10,9 @@ import { BackupRepository } from '../data/backup.repository';
import {
  RetentionCleanupResult,
  RetentionCleanupJobResult,
  BackupCategory,
  BackupHistory,
  TIERED_RETENTION,
} from './backup.types';
export class BackupRetentionService {
@@ -20,61 +23,47 @@ export class BackupRetentionService {
  }
  /**
   * Processes retention cleanup using tiered classification.
   * A backup can only be deleted if it exceeds the quota for ALL of its categories.
   */
  async processRetentionCleanup(): Promise<RetentionCleanupJobResult> {
    logger.info('Starting tiered backup retention cleanup');
    const results: RetentionCleanupResult[] = [];
    const errors: Array<{ scheduleId: string; error: string }> = [];
    let totalDeleted = 0;
    let totalFreedBytes = 0;
    try {
      const result = await this.processTieredRetentionCleanup();
      results.push(result);
      totalDeleted = result.deletedCount;
      totalFreedBytes = result.freedBytes;
    } catch (error) {
      const errorMessage = error instanceof Error ? error.message : String(error);
      logger.error('Tiered retention cleanup failed', { error: errorMessage });
      errors.push({ scheduleId: 'tiered', error: errorMessage });
    }
    // Also cleanup failed backups older than 24 hours
    try {
      const failedCount = await this.cleanupFailedBackups();
      if (failedCount > 0) {
        logger.info('Cleaned up failed backups', { count: failedCount });
      }
    } catch (error) {
      const errorMessage = error instanceof Error ? error.message : String(error);
      logger.error('Failed backup cleanup failed', { error: errorMessage });
    }
    logger.info('Backup retention cleanup completed', {
      totalDeleted,
      totalFreedBytes,
      errors: errors.length,
    });
    return {
      processed: 1, // Single tiered process
      totalDeleted,
      totalFreedBytes,
      results,
@@ -82,6 +71,140 @@ export class BackupRetentionService {
    };
  }
/**
* Implements tiered retention: keeps N backups per category.
* A backup is protected if it's in the top N for ANY of its categories.
* Only deletes backups that exceed ALL applicable category quotas.
*/
private async processTieredRetentionCleanup(): Promise<RetentionCleanupResult> {
const allBackups = await this.repository.getAllCompletedBackups();
if (allBackups.length === 0) {
logger.debug('No completed backups to process');
return {
scheduleId: 'tiered',
scheduleName: 'Tiered Retention',
deletedCount: 0,
retainedCount: 0,
freedBytes: 0,
};
}
// Build sets of protected backup IDs for each category
const protectedIds = new Set<string>();
const categoryRetained: Record<BackupCategory, string[]> = {
hourly: [],
daily: [],
weekly: [],
monthly: [],
};
// For each category, identify which backups to keep
const categories: BackupCategory[] = ['hourly', 'daily', 'weekly', 'monthly'];
for (const category of categories) {
const limit = TIERED_RETENTION[category];
const backupsInCategory = allBackups.filter(b =>
b.categories && b.categories.includes(category)
);
// Keep the top N (already sorted by started_at DESC)
const toKeep = backupsInCategory.slice(0, limit);
for (const backup of toKeep) {
protectedIds.add(backup.id);
categoryRetained[category].push(backup.id);
}
logger.debug('Category retention analysis', {
category,
limit,
totalInCategory: backupsInCategory.length,
keeping: toKeep.length,
});
}
// Find backups to delete (not protected by any category)
const backupsToDelete = allBackups.filter(b => !protectedIds.has(b.id));
logger.info('Tiered retention analysis complete', {
totalBackups: allBackups.length,
protected: protectedIds.size,
toDelete: backupsToDelete.length,
hourlyRetained: categoryRetained.hourly.length,
dailyRetained: categoryRetained.daily.length,
weeklyRetained: categoryRetained.weekly.length,
monthlyRetained: categoryRetained.monthly.length,
});
// Delete unprotected backups
let deletedCount = 0;
let freedBytes = 0;
for (const backup of backupsToDelete) {
try {
// Log retention decision with category reasoning
logger.info('Deleting backup - exceeded all category quotas', {
backupId: backup.id,
filename: backup.filename,
categories: backup.categories,
startedAt: backup.startedAt,
reason: this.buildDeletionReason(backup, categoryRetained),
});
// Delete the file
const filePath = (backup.metadata as any)?.archivePath || backup.filePath;
if (filePath) {
try {
const stats = await fsp.stat(filePath);
freedBytes += stats.size;
await fsp.unlink(filePath);
} catch (error) {
logger.warn('Failed to delete backup file', {
backupId: backup.id,
filePath,
});
}
}
// Delete the database record
await this.repository.deleteBackupRecord(backup.id);
deletedCount++;
} catch (error) {
logger.error('Failed to delete backup during retention cleanup', {
backupId: backup.id,
error: error instanceof Error ? error.message : String(error),
});
}
}
return {
scheduleId: 'tiered',
scheduleName: 'Tiered Retention',
deletedCount,
retainedCount: protectedIds.size,
freedBytes,
};
}
/**
* Builds a human-readable reason for why a backup is being deleted.
*/
private buildDeletionReason(
backup: BackupHistory,
categoryRetained: Record<BackupCategory, string[]>
): string {
const reasons: string[] = [];
for (const category of (backup.categories || ['hourly']) as BackupCategory[]) {
const kept = categoryRetained[category];
const limit = TIERED_RETENTION[category];
if (!kept.includes(backup.id)) {
reasons.push(`${category}: not in top ${limit}`);
}
}
return reasons.join('; ') || 'no categories';
}
  /**
   * Cleans up old backups for a specific schedule
   */
@@ -200,75 +323,4 @@ export class BackupRetentionService {
    return deletedCount;
  }
/**
* Cleans up orphaned backups (from deleted schedules)
* Keeps manual backups indefinitely
*/
private async cleanupOrphanedBackups(): Promise<RetentionCleanupResult> {
const { items } = await this.repository.listBackups({
backupType: 'scheduled',
pageSize: 1000,
});
// Get all valid schedule IDs
const schedules = await this.repository.listSchedules();
const validScheduleIds = new Set(schedules.map(s => s.id));
// Find orphaned scheduled backups (schedule was deleted)
const orphanedBackups = items.filter(
backup => backup.scheduleId && !validScheduleIds.has(backup.scheduleId)
);
// Keep only the most recent 5 orphaned backups per deleted schedule
const orphansBySchedule = new Map<string, typeof orphanedBackups>();
for (const backup of orphanedBackups) {
const scheduleId = backup.scheduleId!;
if (!orphansBySchedule.has(scheduleId)) {
orphansBySchedule.set(scheduleId, []);
}
orphansBySchedule.get(scheduleId)!.push(backup);
}
let deletedCount = 0;
let freedBytes = 0;
let retainedCount = 0;
for (const [_scheduleId, backups] of orphansBySchedule) {
// Sort by date descending and keep first 5
backups.sort((a, b) => b.startedAt.getTime() - a.startedAt.getTime());
const toDelete = backups.slice(5);
retainedCount += Math.min(backups.length, 5);
for (const backup of toDelete) {
try {
const filePath = (backup.metadata as any)?.archivePath || backup.filePath;
if (filePath) {
try {
const stats = await fsp.stat(filePath);
freedBytes += stats.size;
await fsp.unlink(filePath);
} catch {
// File might not exist
}
}
await this.repository.deleteBackupRecord(backup.id);
deletedCount++;
} catch (error) {
logger.warn('Failed to delete orphaned backup', {
backupId: backup.id,
error: error instanceof Error ? error.message : String(error),
});
}
}
}
return {
scheduleId: 'orphaned',
scheduleName: 'Orphaned Backups',
deletedCount,
retainedCount,
freedBytes,
};
}
}


@@ -22,6 +22,7 @@ import {
  BackupFrequency,
  ScheduleResponse,
} from './backup.types';
import { classifyAndCalculateExpiration } from './backup-classification.service';
export class BackupService {
  private repository: BackupRepository;
@@ -40,10 +41,14 @@ export class BackupService {
   * Creates a new backup
   */
  async createBackup(options: CreateBackupOptions): Promise<BackupResult> {
    const now = new Date();
    const timestamp = now.toISOString().replace(/[:.]/g, '-').slice(0, 19);
    const tempFilename = `backup_${timestamp}`;
    // Classify the backup based on its creation timestamp
    const { categories, expiresAt } = classifyAndCalculateExpiration(now);
    // Create initial backup record with classification
    const backupRecord = await this.repository.createBackupRecord({
      scheduleId: options.scheduleId,
      backupType: options.backupType,
@@ -52,12 +57,16 @@ export class BackupService {
      fileSizeBytes: 0,
      createdBy: options.createdBy,
      metadata: { name: options.name },
      categories,
      expiresAt,
    });
    logger.info('Starting backup creation', {
      backupId: backupRecord.id,
      backupType: options.backupType,
      scheduleName: options.name,
      categories,
      expiresAt: expiresAt.toISOString(),
    });
    try {


@@ -29,6 +29,17 @@ export const DEFAULT_RETENTION = {
  monthly: 12,
} as const;
/**
* Tiered retention counts for unified classification system.
* Each backup can belong to multiple categories; expiration is based on longest retention.
*/
export const TIERED_RETENTION = {
hourly: 8,
daily: 7,
weekly: 4,
monthly: 12,
} as const;
// ============================================
// Enums and Union Types
// ============================================
@@ -36,6 +47,7 @@ export const DEFAULT_RETENTION = {
export type BackupFrequency = 'hourly' | 'daily' | 'weekly' | 'monthly';
export type BackupType = 'scheduled' | 'manual';
export type BackupStatus = 'in_progress' | 'completed' | 'failed';
export type BackupCategory = 'hourly' | 'daily' | 'weekly' | 'monthly';
// ============================================
// Database Entity Types
@@ -69,6 +81,8 @@ export interface BackupHistory {
  completedAt: Date | null;
  createdBy: string | null;
  metadata: BackupMetadata;
  categories: BackupCategory[];
  expiresAt: Date | null;
}
export interface BackupSettings {


@@ -0,0 +1,78 @@
-- Migration: Add tiered retention classification columns
-- Description: Adds categories array and expires_at for tiered backup retention
-- Issue: #6 - Backup retention purges all backups
-- ============================================
-- Add new columns to backup_history
-- ============================================
ALTER TABLE backup_history
ADD COLUMN IF NOT EXISTS categories TEXT[] DEFAULT '{}',
ADD COLUMN IF NOT EXISTS expires_at TIMESTAMP WITH TIME ZONE;
-- ============================================
-- Indexes for efficient category queries
-- ============================================
CREATE INDEX IF NOT EXISTS idx_backup_history_categories ON backup_history USING GIN(categories);
CREATE INDEX IF NOT EXISTS idx_backup_history_expires ON backup_history(expires_at);
-- ============================================
-- Populate categories for existing backups based on started_at
-- Classification logic:
-- - All backups: 'hourly'
-- - Hour = 0 (midnight UTC): + 'daily'
-- - Hour = 0 AND Sunday: + 'weekly'
-- - Hour = 0 AND day = 1: + 'monthly'
-- ============================================
UPDATE backup_history
SET categories = ARRAY(
SELECT unnest(
CASE
-- Midnight on Sunday, 1st of month: all categories
WHEN EXTRACT(HOUR FROM started_at AT TIME ZONE 'UTC') = 0
AND EXTRACT(DOW FROM started_at AT TIME ZONE 'UTC') = 0
AND EXTRACT(DAY FROM started_at AT TIME ZONE 'UTC') = 1
THEN ARRAY['hourly', 'daily', 'weekly', 'monthly']
-- Midnight on Sunday (not 1st): hourly + daily + weekly
WHEN EXTRACT(HOUR FROM started_at AT TIME ZONE 'UTC') = 0
AND EXTRACT(DOW FROM started_at AT TIME ZONE 'UTC') = 0
THEN ARRAY['hourly', 'daily', 'weekly']
-- Midnight on 1st (not Sunday): hourly + daily + monthly
WHEN EXTRACT(HOUR FROM started_at AT TIME ZONE 'UTC') = 0
AND EXTRACT(DAY FROM started_at AT TIME ZONE 'UTC') = 1
THEN ARRAY['hourly', 'daily', 'monthly']
-- Midnight (not Sunday, not 1st): hourly + daily
WHEN EXTRACT(HOUR FROM started_at AT TIME ZONE 'UTC') = 0
THEN ARRAY['hourly', 'daily']
-- Non-midnight: hourly only
ELSE ARRAY['hourly']
END
)
)
WHERE categories = '{}' OR categories IS NULL;
-- ============================================
-- Calculate expires_at based on categories
-- Retention periods: hourly=8hrs, daily=7days, weekly=4wks, monthly=12mo
-- Use longest applicable retention period
-- ============================================
UPDATE backup_history
SET expires_at = CASE
WHEN 'monthly' = ANY(categories) THEN started_at + INTERVAL '12 months'
WHEN 'weekly' = ANY(categories) THEN started_at + INTERVAL '4 weeks'
WHEN 'daily' = ANY(categories) THEN started_at + INTERVAL '7 days'
ELSE started_at + INTERVAL '8 hours'
END
WHERE expires_at IS NULL;
-- ============================================
-- Set a default for categories on future inserts (after populating data)
-- ============================================
ALTER TABLE backup_history
ALTER COLUMN categories SET DEFAULT ARRAY['hourly']::TEXT[];
-- Ensure all rows have categories
UPDATE backup_history SET categories = ARRAY['hourly'] WHERE categories = '{}' OR categories IS NULL;


@@ -0,0 +1,188 @@
/**
* @ai-summary Unit tests for BackupClassificationService
* @ai-context Tests pure timestamp-based classification functions
*/
import {
classifyBackup,
calculateExpiration,
isFirstBackupOfDay,
isSunday,
isFirstDayOfMonth,
classifyAndCalculateExpiration,
} from '../../domain/backup-classification.service';
import { TIERED_RETENTION } from '../../domain/backup.types';
describe('BackupClassificationService', () => {
describe('classifyBackup', () => {
it('should classify regular hourly backup (non-midnight)', () => {
// Tuesday, January 7, 2026 at 14:30 UTC
const timestamp = new Date('2026-01-07T14:30:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly']);
});
it('should classify midnight backup as hourly + daily', () => {
// Wednesday, January 8, 2026 at 00:00 UTC
const timestamp = new Date('2026-01-08T00:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly', 'daily']);
});
it('should classify Sunday midnight backup as hourly + daily + weekly', () => {
// Sunday, January 4, 2026 at 00:00 UTC
const timestamp = new Date('2026-01-04T00:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly', 'daily', 'weekly']);
});
it('should classify 1st of month midnight backup as hourly + daily + monthly', () => {
// Thursday, January 1, 2026 at 00:00 UTC (not Sunday)
const timestamp = new Date('2026-01-01T00:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly', 'daily', 'monthly']);
});
it('should classify Sunday 1st of month midnight as all categories', () => {
// Sunday, February 1, 2026 at 00:00 UTC
const timestamp = new Date('2026-02-01T00:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly', 'daily', 'weekly', 'monthly']);
});
it('should not classify non-midnight on 1st as monthly', () => {
// Thursday, January 1, 2026 at 10:00 UTC
const timestamp = new Date('2026-01-01T10:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly']);
});
it('should not classify non-midnight on Sunday as weekly', () => {
// Sunday, January 4, 2026 at 15:00 UTC
const timestamp = new Date('2026-01-04T15:00:00.000Z');
const categories = classifyBackup(timestamp);
expect(categories).toEqual(['hourly']);
});
});
describe('calculateExpiration', () => {
    const baseTimestamp = new Date('2026-01-05T00:00:00.000Z');

    it('should calculate 8 hours for hourly-only backup', () => {
      const expiresAt = calculateExpiration(['hourly'], baseTimestamp);
      const expectedDate = new Date('2026-01-05T08:00:00.000Z');
      expect(expiresAt).toEqual(expectedDate);
    });

    it('should calculate 7 days for daily backup', () => {
      const expiresAt = calculateExpiration(['hourly', 'daily'], baseTimestamp);
      const expectedDate = new Date('2026-01-12T00:00:00.000Z');
      expect(expiresAt).toEqual(expectedDate);
    });

    it('should calculate 4 weeks for weekly backup', () => {
      const expiresAt = calculateExpiration(['hourly', 'daily', 'weekly'], baseTimestamp);
      const expectedDate = new Date('2026-02-02T00:00:00.000Z');
      expect(expiresAt).toEqual(expectedDate);
    });

    it('should calculate 12 months for monthly backup', () => {
      const expiresAt = calculateExpiration(
        ['hourly', 'daily', 'weekly', 'monthly'],
        baseTimestamp
      );
      const expectedDate = new Date('2027-01-05T00:00:00.000Z');
      expect(expiresAt).toEqual(expectedDate);
    });

    it('should use longest retention when monthly is present (even without weekly)', () => {
      const expiresAt = calculateExpiration(['hourly', 'daily', 'monthly'], baseTimestamp);
      const expectedDate = new Date('2027-01-05T00:00:00.000Z');
      expect(expiresAt).toEqual(expectedDate);
    });
  });

  describe('isFirstBackupOfDay', () => {
    it('should return true for midnight UTC', () => {
      const timestamp = new Date('2026-01-05T00:00:00.000Z');
      expect(isFirstBackupOfDay(timestamp)).toBe(true);
    });

    it('should return false for non-midnight', () => {
      const timestamp = new Date('2026-01-05T01:00:00.000Z');
      expect(isFirstBackupOfDay(timestamp)).toBe(false);
    });

    it('should return true for midnight with minutes/seconds', () => {
      // 00:30:45 is still hour 0
      const timestamp = new Date('2026-01-05T00:30:45.000Z');
      expect(isFirstBackupOfDay(timestamp)).toBe(true);
    });
  });

  describe('isSunday', () => {
    it('should return true for Sunday', () => {
      // January 4, 2026 is a Sunday
      const timestamp = new Date('2026-01-04T12:00:00.000Z');
      expect(isSunday(timestamp)).toBe(true);
    });

    it('should return false for non-Sunday', () => {
      // January 5, 2026 is a Monday
      const timestamp = new Date('2026-01-05T12:00:00.000Z');
      expect(isSunday(timestamp)).toBe(false);
    });
  });

  describe('isFirstDayOfMonth', () => {
    it('should return true for 1st of month', () => {
      const timestamp = new Date('2026-01-01T12:00:00.000Z');
      expect(isFirstDayOfMonth(timestamp)).toBe(true);
    });

    it('should return false for non-1st', () => {
      const timestamp = new Date('2026-01-15T12:00:00.000Z');
      expect(isFirstDayOfMonth(timestamp)).toBe(false);
    });
  });

  describe('classifyAndCalculateExpiration', () => {
    it('should return both categories and expiresAt', () => {
      // Sunday, February 1, 2026 at 00:00 UTC - all categories
      const timestamp = new Date('2026-02-01T00:00:00.000Z');
      const result = classifyAndCalculateExpiration(timestamp);
      expect(result.categories).toEqual(['hourly', 'daily', 'weekly', 'monthly']);
      expect(result.expiresAt).toEqual(new Date('2027-02-01T00:00:00.000Z'));
    });

    it('should work for hourly-only backup', () => {
      const timestamp = new Date('2026-01-07T14:30:00.000Z');
      const result = classifyAndCalculateExpiration(timestamp);
      expect(result.categories).toEqual(['hourly']);
      expect(result.expiresAt).toEqual(new Date('2026-01-07T22:30:00.000Z'));
    });
  });

  describe('TIERED_RETENTION constants', () => {
    it('should have correct retention values', () => {
      expect(TIERED_RETENTION.hourly).toBe(8);
      expect(TIERED_RETENTION.daily).toBe(7);
      expect(TIERED_RETENTION.weekly).toBe(4);
      expect(TIERED_RETENTION.monthly).toBe(12);
    });
  });
});
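The implementation of the helpers exercised above is not shown in this hunk; the following is a minimal sketch inferred purely from the expected values in the tests (hourly always applies; daily at UTC midnight; weekly on Sunday midnight; monthly on the 1st at midnight; expiry uses the longest category present). The real `backup-classification.service.ts` may differ in detail.

```typescript
type BackupCategory = 'hourly' | 'daily' | 'weekly' | 'monthly';

// Retention windows: hours, days, weeks, months respectively.
const TIERED_RETENTION = { hourly: 8, daily: 7, weekly: 4, monthly: 12 } as const;

const isFirstBackupOfDay = (ts: Date): boolean => ts.getUTCHours() === 0;
const isSunday = (ts: Date): boolean => ts.getUTCDay() === 0;
const isFirstDayOfMonth = (ts: Date): boolean => ts.getUTCDate() === 1;

// Expiry is driven by the longest-lived category the backup belongs to.
function calculateExpiration(categories: BackupCategory[], ts: Date): Date {
  const expires = new Date(ts.getTime());
  if (categories.includes('monthly')) {
    expires.setUTCMonth(expires.getUTCMonth() + TIERED_RETENTION.monthly);
  } else if (categories.includes('weekly')) {
    expires.setUTCDate(expires.getUTCDate() + TIERED_RETENTION.weekly * 7);
  } else if (categories.includes('daily')) {
    expires.setUTCDate(expires.getUTCDate() + TIERED_RETENTION.daily);
  } else {
    expires.setUTCHours(expires.getUTCHours() + TIERED_RETENTION.hourly);
  }
  return expires;
}

function classifyAndCalculateExpiration(ts: Date): {
  categories: BackupCategory[];
  expiresAt: Date;
} {
  const categories: BackupCategory[] = ['hourly'];
  if (isFirstBackupOfDay(ts)) {
    categories.push('daily');
    if (isSunday(ts)) categories.push('weekly');
    if (isFirstDayOfMonth(ts)) categories.push('monthly');
  }
  return { categories, expiresAt: calculateExpiration(categories, ts) };
}
```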

View File

@@ -773,7 +773,7 @@ export const AdminBackupMobileScreen: React.FC = () => {
             </span>
           </div>
-          <div className="flex items-center gap-4 mb-3 text-xs text-slate-600">
+          <div className="flex flex-wrap items-center gap-2 mb-3 text-xs text-slate-600">
             <span
               className={`px-2 py-1 rounded ${
                 backup.backupType === 'scheduled'
@@ -784,6 +784,11 @@ export const AdminBackupMobileScreen: React.FC = () => {
               {backup.backupType}
             </span>
             <span>{formatFileSize(backup.fileSizeBytes)}</span>
+            {backup.expiresAt && (
+              <span className="text-slate-500">
+                Expires: {formatDate(backup.expiresAt)}
+              </span>
+            )}
           </div>
           <div className="flex gap-2">

View File

@@ -288,6 +288,7 @@ export interface PromoteToAdminRequest {
 export type BackupFrequency = 'hourly' | 'daily' | 'weekly' | 'monthly';
 export type BackupType = 'scheduled' | 'manual';
 export type BackupStatus = 'in_progress' | 'completed' | 'failed';
+export type BackupCategory = 'hourly' | 'daily' | 'weekly' | 'monthly';

 export interface BackupHistory {
   id: string;
@@ -304,6 +305,8 @@ export interface BackupHistory {
   completedAt: string | null;
   createdBy: string | null;
   metadata: Record<string, unknown>;
+  categories: BackupCategory[];
+  expiresAt: string | null;
 }

 export interface BackupSchedule {

View File

@@ -386,6 +386,7 @@ export const AdminBackupPage: React.FC = () => {
               <TableCell>Size</TableCell>
               <TableCell>Status</TableCell>
               <TableCell>Created</TableCell>
+              <TableCell>Expires</TableCell>
               <TableCell align="right">Actions</TableCell>
             </TableRow>
           </TableHead>
@@ -415,6 +416,9 @@ export const AdminBackupPage: React.FC = () => {
               />
             </TableCell>
             <TableCell>{formatDate(backup.startedAt)}</TableCell>
+            <TableCell>
+              {backup.expiresAt ? formatDate(backup.expiresAt) : '-'}
+            </TableCell>
             <TableCell align="right">
               <IconButton
                 size="small"
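The UI changes above surface `expiresAt`; the enforcement side (per the README, `backup-retention.service.ts` handles deletion) is not shown in this view. A minimal sketch of the selection step such a sweep would need, assuming a record shape with `id` and a nullable `expiresAt` (the interface is hypothetical, not the PR's actual repository API):

```typescript
interface BackupRecord {
  id: string;
  expiresAt: Date | null;
}

// Pick the ids of backups whose expiry has passed. Records with no
// expiry (e.g. manual backups) are never selected for auto-deletion.
function selectExpired(backups: BackupRecord[], now: Date): string[] {
  return backups
    .filter((b) => b.expiresAt !== null && b.expiresAt.getTime() <= now.getTime())
    .map((b) => b.id);
}
```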